The original post: /r/datahoarder by /u/JustAguy7081 on 2024-09-26 15:04:37.
Hi all
I'm in a bit of a pickle and am hoping someone in this group can help. I'm running MX Linux, and the backup tool I use (Lucky Backup) creates versioned copies of the directories it backs up. This is perfect for my local backup needs, as new backups only back up the newly changed files while retaining previous versions. The downside is that I'm mirroring my entire group of backup directories to Mega, and it's not happy with almost 2 million backup files. I'm looking for a solution that reduces the 2 million files in my cloud backup, yet doesn't require a daily re-upload of large backup containers just because a couple of files have changed. I do have my backups/mirroring split out into separate folders somewhat, i.e. Documents (daily changes), Media (infrequent changes), Software (infrequent changes), Photos (sometimes changed daily, sometimes static for a month), etc.
Ideally, what I'm thinking is to create a (new) tarball of each backed-up folder if and only if it has new backup files in it, and then mirror these tarballs to Mega instead of the folders. The problem is that the backups create new files (i.e. logs) in each backup folder even if there are no actual file changes to back up, so each backup folder is actually changing daily, independent of any actual file changes. I could possibly reduce my backup frequency from daily to weekly or so to help with this, but unfortunately then, if files do change, I have no backup of them until the now less frequent backup runs.
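Roughly, what I'm picturing is something like the sketch below (untested Python; the /backups and /tarballs paths, the .log exclusion, and the state file are placeholder assumptions, not the actual Lucky Backup layout): only rebuild a folder's tarball when the fingerprint of its non-log files changes.

```python
#!/usr/bin/env python3
"""Sketch: re-tar a backup folder only if its data files (not logs) changed.

Paths, the '.log' exclusion, and the state-file location are assumptions
for illustration -- they'd need adjusting to the real backup layout.
"""
import hashlib
import json
import os
import tarfile

BACKUP_ROOT = "/backups"      # hypothetical: parent of the versioned backup folders
TARBALL_DIR = "/tarballs"     # hypothetical: staging area that gets mirrored to Mega
STATE_FILE = "/tarballs/.state.json"
IGNORE_SUFFIXES = (".log",)   # assumed pattern for the log files that churn daily

def folder_fingerprint(folder):
    """Hash the relative path, size, and mtime of every non-log file."""
    h = hashlib.sha256()
    for root, _dirs, files in sorted(os.walk(folder)):
        for name in sorted(files):
            if name.endswith(IGNORE_SUFFIXES):
                continue
            path = os.path.join(root, name)
            st = os.stat(path)
            rel = os.path.relpath(path, folder)
            h.update(f"{rel}|{st.st_size}|{int(st.st_mtime)}".encode())
    return h.hexdigest()

def main():
    try:
        with open(STATE_FILE) as f:
            state = json.load(f)
    except FileNotFoundError:
        state = {}

    for entry in sorted(os.listdir(BACKUP_ROOT)):
        folder = os.path.join(BACKUP_ROOT, entry)
        if not os.path.isdir(folder):
            continue
        fp = folder_fingerprint(folder)
        if state.get(entry) == fp:
            continue  # only logs changed, so the old tarball stays as-is
        tar_path = os.path.join(TARBALL_DIR, f"{entry}.tar.gz")
        with tarfile.open(tar_path, "w:gz") as tar:
            tar.add(folder, arcname=entry)
        state[entry] = fp
        print(f"rebuilt {tar_path}")

    with open(STATE_FILE, "w") as f:
        json.dump(state, f, indent=2)

if __name__ == "__main__":
    main()
```

I'd then point the Mega mirror at the tarball folder instead of the backup folders, so only the handful of tarballs with real changes get re-uploaded each day.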
Changing from Mega is not an option for several reasons, so I need to find a solution that works with it.
Has anyone else overcome a similar issue, or does anyone have any ideas?