I have a simple HTML site that indexes many "subdirectories" and files (mostly PDFs, plus some images, ZIPs and TXT files):
https://preview.redd.it/c4w01zt1fs3f1.png?width=1053&format=png&auto=webp&s=534b9d83d4a7b0dd693948a5535759922e13f5e3
For some reason, HTTrack quite often fails to complete the downloads, leaving many files only partially fetched.
I'm aware that some files won't download manually via a browser either, but that's maybe a couple dozen, which is nothing compared to the total (~8k files); the rest download just fine.
I tried reducing the number of simultaneous connections and forcing them to stay open (keep-alive), but that doesn't seem to change anything.
Maybe I should try different software? I don't need to preserve the site itself; I just need every downloadable file saved into its subdirectory.
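In case it matters, this is roughly the kind of wget-based attempt I had in mind as an alternative (the URL is a placeholder, not the real site, and the extension list would need adjusting to match what's actually indexed):

```shell
# Hypothetical wget mirror command -- example.com stands in for the real site.
# -r        recurse into the linked subdirectories
# -np       never ascend above the starting directory
# -c        resume partially downloaded files instead of restarting them
# --tries=5 retry each file a few times before giving up
# --wait=1  pause between requests so the server isn't hammered
# -A ...    only keep files with these extensions (HTML pages are still
#           fetched to discover links, then discarded)
wget -r -np -c --tries=5 --wait=1 \
     -A pdf,png,jpg,zip,txt \
     https://example.com/files/
```

The `-c` resume flag only helps if the server supports byte-range requests, which might also be relevant to why the HTTrack transfers keep breaking off.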
Any suggestions would be much appreciated.