this post was submitted on 27 Apr 2025

It's A Digital Disease!


This is a sub that aims to bring data hoarders together to share their passion with like-minded people.

The original post: /r/datahoarder by /u/Juaguel on 2025-04-26 12:41:26.

Run the script below to automatically download all the images from a list of URLs in a ".txt" file. It works for Google Books previews. It is a Windows 10 batch script, so save it as a ".bat" file (it relies on curl, which ships with recent Windows 10 builds).

@echo off
setlocal enabledelayedexpansion

rem Specify the path to the Notepad file containing URLs
set inputFile=
rem Specify the output directory for the downloaded image files
set outputDir=

rem Create the output directory if it doesn't exist
if not exist "%outputDir%" mkdir "%outputDir%"

rem Initialize cookies and counter
curl -c cookies.txt -H "User-Agent: Mozilla/5.0 (Windows NT 10.0; Win64; x64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/58.0.3029.110 Safari/537.3" "https://books.google.ca/" >nul 2>&1
set count=1

rem Read URLs from the input file line by line
for /f "usebackq delims=" %%A in ("%inputFile%") do (
    set url=%%A
    echo Downloading !url!
    curl -b cookies.txt -o "%outputDir%\image!count!.png" -H "User-Agent: Mozilla/5.0 (Windows NT 10.0; Win64; x64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/58.0.3029.110 Safari/537.3" "!url!" >nul 2>&1 || echo Failed to download !url!
    set /a count+=1
    rem Wait a random 0-9 seconds between downloads (delayed expansion so the delay varies each loop)
    timeout /t !random:~-1! >nul
)

echo Downloads complete!
pause

You must fill in the input file containing the URL list and the output folder for the downloaded images (the two empty "set" lines at the top of the script). In Windows Explorer you can use "Copy as path" to grab the full paths.
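For example, the two lines might end up looking like this (the paths are just placeholders; substitute your own):

set inputFile=C:\Users\YourName\Documents\urls.txt
set outputDir=C:\Users\YourName\Documents\book_images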

The ".txt" URL list must contain only links, nothing else, one per line (press "Enter" after each URL). To cancel the operation while it is running, press "Ctrl+C".
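For example, the file might look like this (the links below are made-up placeholders; real Google Books image URLs will differ):

https://books.google.ca/books/content?id=EXAMPLEID&pg=PA1&img=1
https://books.google.ca/books/content?id=EXAMPLEID&pg=PA2&img=1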

If for some reason it doesn't work, you can always give it to an AI like ChatGPT to fix it up.
