this post was submitted on 16 Jun 2024

It's A Digital Disease!


This is a sub that aims at bringing data hoarders together to share their passion with like minded people.

The original post: /r/datahoarder by /u/Glen_Garrett_Gayhart on 2024-06-16 03:26:24.

I've got a lot of URLs like this: https://www.newgrounds.com/art/view/alvinhew/annika where one or more images are displayed.

I want to use wget to get the images on these pages. The images themselves are hosted at links like this: https://art.ngfiles.com/images/49000/49087_alvinhew_annika.jpg?f1254528733 but I'm not sure how to configure wget to go from the first sort of URL to the second.

I could just open each of the www.newgrounds URLs and copy out the art.ngfiles URLs by hand, but that would defeat the purpose of automating it. I want to download a lot of these, and I've got a batch file that will go through them all. How should I instruct wget to look at URLs of the first (www.newgrounds) sort, and then download everything from URLs of the second (art.ngfiles) sort?
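One way to do this without letting wget spider at all is a two-step script: fetch each gallery page, grep the art.ngfiles image URLs out of the HTML, then feed that list back to wget. This is only a sketch, not tested against Newgrounds — `pages.txt`, `image_urls.txt`, and `ngart` are hypothetical names, and it assumes the art.ngfiles.com links appear in the page's static HTML rather than being injected by JavaScript:

```shell
# extract_image_urls: print any art.ngfiles image links found in HTML on stdin
extract_image_urls() {
  grep -oE 'https://art\.ngfiles\.com/images/[^"]+'
}

# pages.txt (hypothetical name): one www.newgrounds.com/art/view/... URL per line
if [ -f pages.txt ]; then
  # Step 1: fetch each gallery page quietly to stdout and collect image URLs
  while read -r page; do
    wget -qO- "$page" | extract_image_urls
  done < pages.txt | sort -u > image_urls.txt

  # Step 2: download only those image URLs into one flat directory
  wget --input-file=image_urls.txt --directory-prefix=ngart
fi
```

Because step 2 only ever sees the extracted image URLs, wget never follows any other links on the site.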

I don't mind if I get some extra files, like thumbnails and things, but I don't want wget spidering all over the website and potentially downloading things from pages other than the art.ngfiles links.
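It may also be possible to do it in a single wget invocation, using recursion limited to one hop plus a URL filter. Again a sketch under assumptions: `pages.txt` and `ngart` are hypothetical names, `--accept-regex` needs wget 1.14 or newer, and wget may still fetch the HTML pages themselves in order to parse their links (which fits the "extra files are fine" constraint):

```shell
# Follow links exactly one level deep (--level=1), allow crossing to the
# image host (--span-hosts + --domains), but only keep files whose URL
# matches the art.ngfiles image path.
wget --recursive --level=1 \
     --span-hosts --domains=www.newgrounds.com,art.ngfiles.com \
     --accept-regex='art\.ngfiles\.com/images/' \
     --no-directories --directory-prefix=ngart \
     --input-file=pages.txt
```

`--level=1` is what stops wget from spidering further: it follows links found on the listed pages, but not links found on the pages those lead to.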


Thanks in advance for any help!

no comments (yet)