The original post: /r/datahoarder by /u/krpt on 2024-07-07 08:02:56.

Hi,

I'm looking to mirror a website whose tree of files follows this pattern: https://maps.refuges.info/hiking/{$z}/{$x}/{$y}.png

The server frequently fails to answer requests, so the tool should be able to skip or pause when it's down or at capacity.

I want to recreate the same directory structure and files locally.

I don't want to overload the server, so I should throttle my download speed too.

What's the best tool for this? Plain wget (can it handle resuming without re-downloading what's already done)?
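For reference, a minimal sketch of how wget could cover the throttling/retry/resume side, assuming the tile URLs are fed in from a pre-built list (urls.txt; see the enumeration sketch after the edit below). Flag names are GNU wget's:

    # urls.txt: one tile URL per line (generated separately; see below).
    # --no-clobber        skip tiles already on disk, so re-runs resume the job
    # --force-directories recreate the hiking/{z}/{x}/ tree locally
    #                     (--no-host-directories drops the maps.refuges.info prefix)
    # --limit-rate/--wait throttle so the server isn't hammered
    # --tries/--waitretry back off and retry when the server is down or at capacity
    wget --input-file=urls.txt \
         --no-clobber --force-directories --no-host-directories \
         --limit-rate=200k --wait=1 --random-wait \
         --tries=5 --retry-connrefused --waitretry=10 --timeout=30

Note that --no-clobber and --continue are mutually exclusive in wget; for a large set of small complete files like tiles, skipping anything that already exists (--no-clobber) is the resume behavior you want, rather than byte-level continuation of partial files.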

Thanks!

Edit: I just realized the website doesn't allow listing the files under the folders (to prevent crawling), so I have to find a way to enumerate all the directories/files before I can download them. Any trick to list them?
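If the server follows the standard XYZ (slippy map) tile scheme, which the {z}/{x}/{y}.png pattern strongly suggests, no listing is needed: at zoom level z, x and y each run from 0 to 2^z - 1, so the whole URL list can be generated up front. A sketch under that assumption (MAX_ZOOM=8 is also an assumption, already ~87k URLs; every extra zoom level quadruples the count):

    #!/usr/bin/env bash
    # Emit one URL per tile for zoom levels 0..MAX_ZOOM, assuming the
    # standard XYZ tile scheme where x and y each range over 0..2^z-1.
    MAX_ZOOM=8
    for z in $(seq 0 "$MAX_ZOOM"); do
      max=$(( (1 << z) - 1 ))
      for x in $(seq 0 "$max"); do
        for y in $(seq 0 "$max"); do
          echo "https://maps.refuges.info/hiking/${z}/${x}/${y}.png"
        done
      done
    done > urls.txt

Tiles outside the server's actual coverage area will presumably just come back as 404s, which wget logs and moves past without retrying, so over-enumerating wastes some requests but shouldn't break the mirror.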
