The original post: /r/datahoarder by /u/helix400 on 2024-07-25 03:24:19.
Original Title: I'm looking for a friendly daily offsite backup solution for terabytes of data that retains all file versions and prevents overwrites or deletions. It seems the only self-hosted ways to get there are pull backups, append-only push, or push to ZFS?
Lately I've had a few bad experiences accidentally overwriting files and then backing up those overwrites. I've also seen others hit by ransomware attacks. Now I just want off-site backups where I can feel at ease.
The main goals I'm trying to achieve:
1. Always retain old file versions. Suppose someone (attacker, family member, me) overwrites key files and I don't discover it until two years later; I want three-year-old backups of it.
2. Keep attackers from being able to delete backup files. Suppose an attacker compromises a computer; I don't want that attacker to have username/password credentials that let them simply delete the remote backup files.
3. A GUI that allows for friendly restoration. I'm perfectly capable of working in a console, but sometimes GUIs are just easier.
4. Emails reporting on backup jobs. Life gets busy, and email digests are the only thing that has consistently worked for me.
5. Nightly backups of family computers (so they can copy phone photos over and not worry about backups).
I've also got two Linux servers with enough storage, one at my house and one at a family member's house.
I've been in the weeds on all this. Both #1 and #2 are tricky. Many backup solutions simply ignore #2 because their model is push backups (the client logs into some remote, then copies files over to it). To push you need credentials that can read, write, and delete, so a compromised machine has easy access to delete the backups. I've seen two push-model fixes for this: A) make the remote only capable of read or append, never overwrite or delete, or B) use ZFS on the remote and have a post-backup script or daily job create ZFS snapshots, so backups can't be deleted or overwritten after the fact. The problem is I've personally found append-only filesystems just don't play well with push backups, and ZFS is one more piece of complexity I could goof up.
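For fix B, what I'm picturing is roughly a daily cron job that runs on the remote backup server itself (never triggered from the client, so compromised client credentials can't touch the snapshots). This is just a sketch; the dataset name tank/backups and the three-year retention are placeholders for whatever you'd actually use:

```python
#!/usr/bin/env python3
"""Sketch of a nightly ZFS snapshot job with simple retention (fix B).

Runs from cron on the remote backup server, so a compromised client
has no path to deleting old snapshots. Dataset name and retention
window below are placeholders.
"""
import subprocess
from datetime import datetime, timedelta

DATASET = "tank/backups"   # placeholder: the dataset the pushes land in
KEEP_DAYS = 1095           # placeholder: ~3 years of daily snapshots

def zfs(*args):
    """Run a zfs subcommand and return its stdout."""
    return subprocess.run(["zfs", *args], check=True,
                          capture_output=True, text=True).stdout

# Take today's snapshot, e.g. tank/backups@daily-2024-07-25
today = datetime.now().strftime("%Y-%m-%d")
zfs("snapshot", f"{DATASET}@daily-{today}")

# Prune daily snapshots older than the retention window
cutoff = datetime.now() - timedelta(days=KEEP_DAYS)
for snap in zfs("list", "-t", "snapshot", "-H", "-o", "name",
                "-r", DATASET).splitlines():
    suffix = snap.split("@", 1)[-1]
    if suffix.startswith("daily-"):
        taken = datetime.strptime(suffix[len("daily-"):], "%Y-%m-%d")
        if taken < cutoff:
            zfs("destroy", snap)
```

Since the snapshots are created server-side, even a client with full read/write/delete credentials can only change the live copy; rolling back is just picking whichever nightly snapshot predates the damage.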
I've got experience with rsync, Bacula, CrashPlan, and Duplicati, and I've also been looking at BorgBackup, UrBackup, Kopia, Bareos, Restic, and Duplicity. But the details get overwhelming. I think I've got it down to two options for me:
Option #1 - Bacula
Pros:
- Pull based
- Tons of options
- Proven for 24 years. Active community.
- Various companion tools, such as bacula-web to read reports, Baculum to manage, or Bacularis as a slick GUI (I'm surprised how good that one looks given how little discussion there is about it)
- Emails for individual backups, and there are digest scripts and tools out there like bacula backup report
Cons:
- Bacula scales to enterprise use, so the interface isn't simple for home backups
- Significant learning curve (but once you've learned it for your needs, you're set)
- Seems to require installing multiple tools to get it working the way I want
- Feels old-fashioned, especially with all the tape nomenclature
Option #2 - Duplicati
Pros:
- Friendly GUI. Very easy to select backups, configure, and restore
- Has digest emails through www.duplicati-monitoring.com
Cons:
- The dev team feels barebones. For example, their Linux build doesn't even work on the latest Ubuntu LTS.
- Securing remote backups requires something like ZFS and a remote script
- Duplicati is infamous for occasional problems with indexing, and I've experienced it too. Rebuilding database indexes can take days and the UI for it is weak.
- You have to save the backup configuration separately; if you lose the client, you can't restore properly unless you know how you had your options configured.
I'm just curious what others think about all this. Is there a tool out there specifically designed for secure remote backups that would make this whole process easier?