The original post: /r/datahoarder by /u/Darwinmate on 2024-05-25 13:25:58.

Several times a week, machines generate between 20 and 500 GB of data that needs to be transferred to a large NAS. Currently I have an rsync script, looped in a bash script with --partial to pick up where it left off, that runs continuously monitoring for changes.
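Roughly, the loop looks like this (paths and the poll interval are illustrative placeholders, not my actual script):

```bash
#!/usr/bin/env bash
# Re-run rsync in a loop so interrupted transfers get retried;
# SRC and DEST are illustrative placeholders.
SRC="/data/instrument/"
DEST="nas:/volume/archive/"

while true; do
    # --partial keeps partially transferred files so the next pass resumes them
    rsync -av --partial "$SRC" "$DEST"
    sleep 60   # poll for new files every minute
done
```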

However, this is more of a hack: rsync is a synchronisation tool rather than a dedicated network transfer tool. It works well enough, but frequent network dropouts cause problems, and I'm after something more robust that can withstand them.
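The most I've managed within rsync itself is adding a timeout and retrying with backoff; a sketch, using the same placeholder paths as above (the --timeout and --log-file flags are standard rsync options):

```bash
# Abort hung connections with --timeout and retry with capped backoff;
# --log-file gives a per-transfer audit trail.
attempt=0
until rsync -av --partial --timeout=60 \
            --log-file=/var/log/rsync-nas.log \
            "$SRC" "$DEST"; do
    attempt=$(( attempt + 1 ))
    wait=$(( attempt * 30 < 300 ? attempt * 30 : 300 ))  # back off, cap at 5 min
    sleep "$wait"
done
```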

To complicate matters, the NAS runs [ReFS](https://en.wikipedia.org/wiki/ReFS); of the machines that generate the data, some run Windows 10, one runs an odd CentOS desktop environment, and the last runs Ubuntu. All of them transfer to the ReFS NAS. Ideally the tool would work on Windows/Linux/macOS.

I have only one contender: https://github.com/fast-data-transfer/fdt
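For anyone evaluating it: FDT is a single Java jar that acts as both server and client. As far as I can tell from its README, the basic invocation is roughly the following (host and paths are placeholders, and I haven't verified the exact flags myself):

```bash
# On the NAS end, FDT runs as a plain Java server (default TCP port 54321):
java -jar fdt.jar

# On a producing machine, push files to it in client mode
# (-c = server host, -d = destination directory):
java -jar fdt.jar -c nas.example.lan -d /volume/archive /data/instrument/run42.dat
```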

Speed is not the priority; stability and traceability are.

Any recommendations?
