Just started properly going down the rabbit hole that is datahoarding. For the past couple of years I have only been storing my relevant files on external storage devices and using RapidCRC to generate CRC32 hashes for the files within each directory, so that I can pick up on any file corruption during quarterly checks of my data. While I have since decided to build myself a proper NAS to serve as my primary storage (with external storage and a cloud service serving as the backups), a couple of questions still remain regarding my current practices with external storage:
To my understanding, there are far better hashing algorithms than CRC32, but would they provide any tangible benefit over CRC32 solely from a corruption-detection perspective (e.g. a lower chance of the hash remaining the same after corruption, even if that is unlikely to begin with)?
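For context, here's a quick Python sketch of the comparison I have in mind (function names are just mine): CRC32 produces a 32-bit checksum, while something like SHA-256 produces a 256-bit digest, so an undetected change is vastly less likely with the latter.

```python
import hashlib
import zlib

def crc32_hex(data: bytes) -> str:
    # CRC32 is a 32-bit checksum: only ~4 billion possible values.
    return format(zlib.crc32(data) & 0xFFFFFFFF, "08x")

def sha256_hex(data: bytes) -> str:
    # SHA-256 is a 256-bit cryptographic digest, so a random corruption
    # silently producing the same digest is astronomically unlikely.
    return hashlib.sha256(data).hexdigest()

sample = b"some file contents"
print(crc32_hex(sample))   # 8 hex chars
print(sha256_hex(sample))  # 64 hex chars
```

(SHA-256 is slower per byte, but on spinning disks the read speed is usually the bottleneck anyway.)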
(I'm probably overthinking this one, but) does it matter whether I have one checksum file per directory (containing the hashes of every file in that directory) as opposed to an individual checksum file per item?
Lastly, would there be a more efficient method of checking the directories (opening the checksum files and verifying file integrity against them) than doing so manually? I don't have much practical experience with running scripts and the like, but am willing to learn if necessary.
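By "a script", I mean something along these lines — a rough Python sketch I put together that walks an .sfv-style file (the `filename CHECKSUM` lines RapidCRC writes, with `;` comments) and re-hashes each file; all names here are my own guesses, not anything official:

```python
import sys
import zlib
from pathlib import Path

def crc32_of(path, chunk_size=1 << 20):
    """Stream the file in 1 MiB chunks so large files fit in memory."""
    crc = 0
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(chunk_size), b""):
            crc = zlib.crc32(chunk, crc)
    return crc & 0xFFFFFFFF

def verify_sfv(sfv_path):
    """Check every entry in an .sfv file; returns a list of (name, ok)."""
    sfv_path = Path(sfv_path)
    results = []
    for line in sfv_path.read_text(encoding="utf-8", errors="replace").splitlines():
        line = line.strip()
        if not line or line.startswith(";"):  # SFV comment lines start with ';'
            continue
        name, expected = line.rsplit(None, 1)  # checksum is the last field
        actual = crc32_of(sfv_path.parent / name)
        results.append((name, actual == int(expected, 16)))
    return results

if __name__ == "__main__":
    for name, ok in verify_sfv(sys.argv[1]):
        print(("OK  " if ok else "BAD ") + name)
```

In principle a wrapper loop over every .sfv on the drive could then turn the quarterly check into a single command — is that the sort of approach people here use?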
Thanks for reading and appreciate the help! :)