This is an automated archive.
The original was posted on /r/datahoarder by /u/LAMGE2 on 2023-09-25 11:34:14+00:00.
Original Title: I want to archive 45 GB of approx. 1600 files to the Internet Archive, but both the script and the HTML uploader are slowing me down too much. My solution was zipping them with no compression, because IA can display the contents and allow individual downloads, but I still want recommendations from you.
Hello. As the title says, I want to preserve approx. 1600 files totaling about 45 GB. The number of files is the problem. The HTML uploader is slow and fails very often. I kept pushing it a few months ago, and when it stopped being fun, I gave up and decided to start over by deleting all the files. I was retrying with a macro, by the way, and it was still unbearable.
Deleting the files was infinitely worse. The IA Python script kept getting rate-limited. Eventually I realized it wasn't going to finish in time, so I deleted all the files one by one (using a macro, to stay sane).
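For context, here is roughly the kind of upload I was attempting with the official `internetarchive` Python library. This is a minimal sketch: the item identifier and directory are placeholders, and the retry parameters are just what the library documents for backing off when IA's S3 endpoint returns a 503 SlowDown.

```python
# A minimal sketch, assuming a placeholder item identifier and local
# directory. Requires `pip install internetarchive` and a one-time
# `ia configure` for credentials.
from pathlib import Path

from internetarchive import upload

ITEM_ID = "my-item-identifier"  # placeholder
files = [str(p) for p in Path("files_to_upload").iterdir() if p.is_file()]

# retries / retries_sleep tell the library to back off and retry on
# 503 SlowDown responses instead of dying outright; checksum=True
# skips files that already exist unchanged on the item, so an
# interrupted run can be resumed.
upload(
    ITEM_ID,
    files=files,
    retries=20,
    retries_sleep=60,
    checksum=True,
    verbose=True,
)
```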
I also deleted the derived files and metadata files; hopefully they regenerate in the end.
Eventually, I realized archive.org probably hates large numbers of small files and that there may be no sane way to upload them individually. I knew archive.org can display the contents of a zip file and allow individual downloads from it, so I decided to upload my files as a zip with no compression. But before I do that, I want to make sure I'm doing it right.
1 - Is this the best solution? If possible, I'd still rather not go with a zip. I want to upload all the files individually, even though archive.org seems to hate me for it. But I don't want to spend 19238120 hours macroing a retry button or getting rate-limited.
2 - I plan to zip using 7-Zip with no compression (store mode, -mx=0). Any computer (Windows 7/8/10, Linux, macOS) can extract that without 7-Zip or external tools, right? Zip is about the most widely supported archive format there is. (See the sketch below for what I mean by an uncompressed zip.)
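For clarity, this is what I mean by an uncompressed zip: a minimal sketch using Python's standard zipfile module in store-only mode (the source directory name is a placeholder). 7-Zip with `-tzip -mx=0` should produce an equivalent archive.

```python
# A minimal sketch of an uncompressed ("store"-only) zip using Python's
# standard zipfile module; the source directory name is a placeholder.
import zipfile
from pathlib import Path

src = Path("files_to_upload")  # placeholder directory

with zipfile.ZipFile("archive.zip", "w", compression=zipfile.ZIP_STORED) as zf:
    for p in sorted(src.rglob("*")):
        if p.is_file():
            # Store each file under its path relative to the source dir
            zf.write(p, arcname=p.relative_to(src))
```

As far as I can tell, a stored zip like this opens with the built-in extractors on Windows, macOS, and most Linux desktops, so no external tools should be needed, but I'd appreciate confirmation.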