This post was submitted on 31 Jul 2024.

It's A Digital Disease!


This is a sub that aims to bring data hoarders together to share their passion with like-minded people.

The original post: /r/datahoarder by /u/ericlindellnyc on 2024-07-30 21:47:46.

I ran rmlint on a local folder to find just duplicate directories, using the command rmlint -vxgbe -T "dd". It reported the following: "In total 81982 files, whereof 18446744073709544613 are duplicates in 8073 groups."

The reported number of duplicates is astronomically larger than the total number of files, which can't be right; the value sits just below 2^64, so it looks like an unsigned 64-bit counter that wrapped around from a small negative number.
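For reference, a quick arithmetic check (my addition, not part of the original session): the reported figure is exactly 7003 short of 2^64 (18446744073709551616), i.e. it is what -7003 prints as when treated as an unsigned 64-bit integer. Assuming python3 is installed, one line confirms this:

$ python3 -c 'print(2**64 - 18446744073709544613)'
7003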

Because I have little experience with rmlint, I'm leery of using it on a 5TB folder until I've ironed out the kinks.
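One way to iron out the kinks first would be to repeat the same invocation on a small copy of the data. The subfolder path below is a placeholder for illustration, not a path from the original session; note that rmlint by default only writes the rmlint.sh / rmlint.json reports and does not delete anything until that script is actually run.

# SOURCE_SUBDIR is a hypothetical small subfolder, used here only as an example
$ mkdir -p /tmp/rmlint-test
$ rsync -a "/Volumes/14tb-2nd/searchable/SOURCE_SUBDIR/" /tmp/rmlint-test/
$ cd /tmp/rmlint-test
$ rmlint -vxgbe -T "dd"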

Assistance appreciated!

BTW, here's the full terminal session.

bvi1434:~ ericlindell$ cd /Volumes/14tb-2nd/searchable/DDD-TO-DIST-EEEEEEEEEEEEEEEEE/archivePrbly/DepthFr-7p31/depthReduced/DepthEtc-depth41/depthPartialCopyFrCopy
bvi1434:depthPartialCopyFrCopy ericlindell$ rmlint -vxgbe -T "dd"
▕░░░░░▏ Traversing (81982 usable files / 0 + 0 ignored files / folders)
▕░░░░░▏ Preprocessing (reduces files to 60054 / found 0 other lint)
▕░░░░░▏ Matching (44858 dupes of 15175 originals; 0 B to scan in 0 files, ETA:
▕░░░░░▏ Merging files into directories (stand by...)

==> In total 81982 files, whereof 18446744073709544613 are duplicates in 8073 groups.
==> This equals 74.87 MB of duplicates which could be removed.
==> Scanning took in total 26.128s.

Wrote a sh file to: /Volumes/14tb-2nd/searchable/DDD-TO-DIST-EEEEEEEEEEEEEEEEE/archivePrbly/DepthFr-7p31/depthReduced/DepthEtc-depth41/depthPartialCopyFrCopy/rmlint.sh
Wrote a json file to: /Volumes/14tb-2nd/searchable/DDD-TO-DIST-EEEEEEEEEEEEEEEEE/archivePrbly/DepthFr-7p31/depthReduced/DepthEtc-depth41/depthPartialCopyFrCopy/rmlint.json
bvi1434:depthPartialCopyFrCopy ericlindell$
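Another way to cross-check the duplicate count is to count entries in the rmlint.json report directly. This sketch assumes the report is a JSON array whose entries carry a "type" field such as "duplicate_file" or "duplicate_dir" (field names may differ between rmlint versions) and that jq is installed:

$ jq '[.[] | select((.type? // "") | startswith("duplicate"))] | length' rmlint.json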

no comments (yet)