The original post: /r/datahoarder by /u/SameUsernameOnReddit on 2025-04-27 15:32:14.

Still very new to this and not very good at it. I need help with two issues I've run into using wget so far:

  1. I'm using wget -m -k (am I crazy for thinking wget -mk would work the same, by the way?) to archive blogs and any files they're hosting, especially videos and PDFs. I like yt-dlp's --download-archive archive.txt feature, and I'm wondering whether wget has something similar to make updating the archive with new posts easier. Or maybe it already works like that and I'm slow; not sure. (See the first sketch after this list.)
  2. I've been trying to use this method to download everything a user has uploaded. The last time I tried was last year, and it left 100+ files undownloaded. That was long enough ago that my terminal history no longer has the actual commands I used, but I'm still 99% sure I did everything by the book, so if anyone has experience with this, I'd appreciate the help. I'm also thinking of using the Internet Archive's CLI tool for this, though I'm still looking into whether it works that way. (See the second sketch after this list.)
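On the first point: wget -mk is just the bundled spelling of wget -m -k, and -m already implies -N (timestamping), which is the closest built-in analogue to yt-dlp's --download-archive, since re-running the same command skips files whose remote timestamps haven't changed. A minimal sketch, assuming https://example-blog.com is a placeholder for the blog being archived:

    # -m expands to -r -N -l inf --no-remove-listing; -N (timestamping)
    # lets repeat runs skip files the server reports as unchanged.
    # -k converts links for local browsing; -p grabs page requisites
    # (images, CSS) so saved posts render offline.
    # https://example-blog.com is a placeholder URL.
    wget -m -k -p "https://example-blog.com/"

    # Re-running the identical command later fetches only new or changed
    # files, which covers updating the archive with new posts.
    wget -m -k -p "https://example-blog.com/"

One caveat: -N relies on the server sending Last-Modified headers; where those are missing, wget has to re-download.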
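On the second point: the Internet Archive's official CLI (the ia command from the internetarchive Python package) can grab everything an account has uploaded in two steps, and its --checksum flag skips files that already exist locally, which is aimed at exactly the kind of run that leaves 100+ files behind. A rough sketch, assuming uploader@example.com stands in for the account's uploader address:

    # Install the Internet Archive's official CLI.
    pip install internetarchive

    # Write the identifier of every item the account uploaded to a file.
    # 'uploader@example.com' is a placeholder; the uploader search field
    # is usually the account's email address.
    ia search 'uploader:uploader@example.com' --itemlist > items.txt

    # Download every listed item. --checksum skips files already present
    # locally with a matching checksum, so re-running the same command
    # resumes an interrupted grab instead of starting over.
    ia download --itemlist items.txt --checksum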