I ran rmlint on a local folder to find just duplicate directories, using the command
rmlint -vxgbe -T "dd"
It reported files and duplicates as follows:
In total 81982 files, whereof 18446744073709544613 are duplicates
in 8073 groups.
The number of duplicates reported is an astronomical multiple of the number of files, so the count is clearly wrong.
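For what it's worth, the figure is suspiciously close to 2^64, which makes me suspect an unsigned-integer underflow rather than a real count. A quick check (my own arithmetic, not anything rmlint prints):

```python
# Compare the reported count against the 64-bit unsigned maximum (2**64).
reported = 18446744073709544613
print(2**64 - reported)  # -> 7003
# The count equals 2**64 - 7003, which is consistent with a small
# negative number (-7003) being printed as an unsigned 64-bit integer.
```

So internally the duplicate counter may have gone slightly negative and wrapped around.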
Because I have little experience with rmlint, I'm leery of running it on a 5 TB folder until I've ironed out the kinks.
Any assistance appreciated!
BTW, here's the full terminal session:
bvi1434:~ ericlindell$ cd /Volumes/14tb-2nd/searchable/DDD-TO-DIST-EEEEEEEEEEEEEEEEE/archivePrbly/DepthFr-7p31/depthReduced/DepthEtc-depth41/depthPartialCopyFrCopy
bvi1434:depthPartialCopyFrCopy ericlindell$ rmlint -vxgbe -T "dd"
▕░░░░░▏ Traversing (81982 usable files / 0 + 0 ignored files / folders)
▕░░░░░▏ Preprocessing (reduces files to 60054 / found 0 other lint)
▕░░░░░▏Matching (44858 dupes of 15175 originals; 0 B to scan in 0 files, ETA:
▕░░░░░▏ Merging files into directories (stand by...)
==> In total 81982 files, whereof 18446744073709544613 are duplicates in 8073 groups.
==> This equals 74.87 MB of duplicates which could be removed.
==> Scanning took in total 26.128s.
Wrote a sh file to: /Volumes/14tb-2nd/searchable/DDD-TO-DIST-EEEEEEEEEEEEEEEEE/archivePrbly/DepthFr-7p31/depthReduced/DepthEtc-depth41/depthPartialCopyFrCopy/rmlint.sh
Wrote a json file to: /Volumes/14tb-2nd/searchable/DDD-TO-DIST-EEEEEEEEEEEEEEEEE/archivePrbly/DepthFr-7p31/depthReduced/DepthEtc-depth41/depthPartialCopyFrCopy/rmlint.json
bvi1434:depthPartialCopyFrCopy ericlindell$
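As a sanity check, I was thinking of tallying the duplicates myself from the rmlint.json the run wrote, instead of trusting the summary line. This is only a sketch based on my reading of rmlint's json output format — I'm assuming the file is a JSON array whose lint entries carry a "type" field (e.g. "duplicate_file") and an "is_original" flag, with a header and footer object that lack "type"; please correct me if the real format differs:

```python
import json

def count_duplicates(entries):
    """Count non-original duplicate entries from an rmlint json dump.

    Assumes each lint entry is a dict with a 'type' field starting with
    'duplicate' and an 'is_original' flag; the header/footer objects
    have no 'type' field and are skipped automatically.
    """
    return sum(
        1
        for e in entries
        if e.get("type", "").startswith("duplicate")
        and not e.get("is_original", False)
    )

# Hypothetical sample mimicking the assumed structure (NOT real rmlint output):
sample = [
    {"description": "rmlint json-dump of lint files"},               # header
    {"type": "duplicate_file", "path": "/a/x", "is_original": True},
    {"type": "duplicate_file", "path": "/b/x", "is_original": False},
    {"type": "duplicate_dir", "path": "/b", "is_original": False},
    {"total_files": 3},                                              # footer
]
print(count_duplicates(sample))  # -> 2
```

On the real file one would do `entries = json.load(open("rmlint.json"))` and compare `count_duplicates(entries)` against the summary line — if the independent tally looks sane (on the order of the "44858 dupes" the Matching phase showed), that would confirm the huge number is just a display bug.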