
I'm planning on setting up a NAS/home server (primarily storage, with some Jellyfin and Nextcloud and such mixed in), and since it's primarily for data storage I'd like to follow the 3-2-1 rule of data preservation: 3 copies on 2 different media with 1 offsite. Well, actually I'm more going for a 2-1, with 2 copies and one offsite, but that's beside the point. Now I'm wondering how to do the offsite backup properly.

My main goal would be an automatic system that does full system backups at a reasonable interval (I assume daily would be a bit much considering it's gonna be a few TB worth of HDDs, which aren't exactly fast; maybe weekly?) and then keeps 2-3 of those backups offsite at once as a sort of version control, if possible.

This has two components, the local upload system and the offsite storage provider. First the local system:

What is good software to encrypt the data before/while it's uploaded?

While I'd preferably upload the data to a provider I trust, accidents happen, and since the provider doesn't need to access the data, I'd prefer that they can't, maliciously or otherwise. So what is a good way to encrypt the data before it leaves my system?
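For illustration, this toy Python sketch is roughly the idea, though for multi-TB data you'd obviously want something that streams instead of reading whole files into memory (the paths and key file here are made up):

```python
# Toy sketch of client-side encryption before upload, using the
# "cryptography" library's Fernet (symmetric, authenticated encryption).
# All paths are hypothetical; a real setup would stream, not slurp.
from pathlib import Path
from cryptography.fernet import Fernet

KEY_FILE = Path("/root/backup.key")  # losing this key means losing the backup
SOURCE = Path("/tmp/backup-2025-05-10.tar.gz")
TARGET = SOURCE.parent / (SOURCE.name + ".enc")

# Generate the key once, then reuse it for every backup run.
if not KEY_FILE.exists():
    KEY_FILE.write_bytes(Fernet.generate_key())

fernet = Fernet(KEY_FILE.read_bytes())
TARGET.write_bytes(fernet.encrypt(SOURCE.read_bytes()))
print(f"encrypted {SOURCE} -> {TARGET}")
```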

What is a good way to upload the data?

After it has been encrypted, it needs to be sent. Is there any good software that can upload backups automatically at regular intervals? Maybe something that also handles the encryption part along the way?
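Something in this direction is what I'm imagining, maybe fired off by a weekly cron job. A rough boto3 sketch, assuming an S3-compatible provider; the endpoint, bucket, credentials, and paths are all placeholders:

```python
# Rough sketch: push the encrypted archive to an S3-compatible provider.
# Endpoint, bucket, keys, and paths are placeholders, not a real config.
import boto3

s3 = boto3.client(
    "s3",
    endpoint_url="https://s3.example-provider.com",
    aws_access_key_id="ACCESS_KEY",
    aws_secret_access_key="SECRET_KEY",
)

# Dating the object key keeps 2-3 weekly versions side by side offsite.
s3.upload_file(
    Filename="/tmp/backup-2025-05-10.tar.gz.enc",
    Bucket="my-backups",
    Key="weekly/backup-2025-05-10.tar.gz.enc",
)
print("upload complete")
```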

Then there's the offsite storage provider. Personally I'd appreciate as many suggestions as possible, since there is of course no one size fits all; if you've got good experiences with any, please do share their names. I'm basically just looking for network-attached drives: I send my data to them, leave it there, and trust it stays there, and in case more drives fail in my system than RAID-Z can handle (so two), I'd like to be able to get the data back off after I've replaced my drives. That's all I really need from them.

For reference, this is gonna be my first NAS/server/anything of this sort. I realize it's mostly a regular computer, and I'm familiar enough with Linux, so I can handle the basic stuff; but for the things you wouldn't do with a normal computer I'm quite unfamiliar, so if any questions here seem dumb, I apologize. Thank you in advance for any information!

[–] [email protected] 19 points 1 day ago (2 children)

There are some really good options in this thread; just remember that whatever you pick, unless you test your backups, they're as good as not existing.

[–] [email protected] 2 points 17 hours ago (2 children)

How does one realistically test their backups if they're doing the 3-2-1 backup plan?

I validate (or whatever the term is) my backups once a month, and trust that it means something 😰

[–] [email protected] 3 points 14 hours ago (1 children)

Until you test a backup it's not complete; how you test it is up to you.

If you upload to a remote location, pull it down and unpack it. Check that you can open important files; if you can't open them, the backup is not worth the disk space.
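As a rough sketch of that kind of smoke test, in Python with made-up paths (which files count as important is your call):

```python
# Rough sketch of a restore smoke test: unpack a pulled-down backup and
# confirm that a few known-important files actually open and have content.
# Archive location and file names are hypothetical.
import tarfile
from pathlib import Path

archive = Path("/tmp/pulled/backup-2025-05-10.tar.gz")
workdir = Path("/tmp/restore-test")

with tarfile.open(archive) as tar:
    tar.extractall(workdir)  # a corrupt archive fails loudly right here

# Spot-check files you'd actually miss: do they read back non-empty?
for name in ["photos/2024/img-0001.jpg", "documents/taxes-2024.pdf"]:
    data = (workdir / name).read_bytes()
    assert data, f"{name} restored empty"
    print(f"OK: {name} ({len(data)} bytes)")
```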

[–] [email protected] 3 points 15 hours ago (1 children)

Deploy the backup (or some part of it) to a test system. If it can boot, or you can get the files back, the backups work.

[–] [email protected] 1 points 8 hours ago

For context, I have a single Synology NAS, so recovering and testing the entire backup set would not be practical in my case.

I have been able to test single files or entire folders and they work fine, but obviously I'd have no way of testing the entire backup set due to the above consideration. It is my understanding that the verify feature that Synology uses is to ensure that there's no bit rot and that the file integrity is intact. My hope is that because of how many isolated backups I do keep, the chance of not being able to recover is slim to none.

[–] [email protected] 5 points 22 hours ago (2 children)

Is there some good automated way of doing that? What would it look like, something that compares hashes?

[–] [email protected] 2 points 8 hours ago* (last edited 8 hours ago)

I don't trust automation for restoring from backup, so I keep the restoration process extremely simple:

  1. automate recreating services - have my podman files in a repository
  2. manually download and extract data to a standard location
  3. restart everything and verify that each service works properly

Do that once a year in a VM or something and you should be good. If things are simple enough, it shouldn't take long (well under an hour).

[–] [email protected] 3 points 14 hours ago

That very much depends on your backup tool of choice; that's also the point. How do you recover your backup?

Start with a manual run: recover a backup and unpack it, then check that important files open. Write down all the steps you did, and then figure out how to automate them.
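Once the steps are written down, the checking part might look something like this hash-manifest comparison (a sketch; all locations are hypothetical):

```python
# Sketch of an automatable check: record SHA-256 hashes at backup time,
# then rebuild them from a restored copy and diff. Paths are made up.
import hashlib
import json
from pathlib import Path

def manifest(root: Path) -> dict[str, str]:
    """Map each file's path (relative to root) to its SHA-256 digest."""
    return {
        str(p.relative_to(root)): hashlib.sha256(p.read_bytes()).hexdigest()
        for p in sorted(root.rglob("*"))
        if p.is_file()
    }

# At backup time: store a manifest alongside the backup set.
Path("/srv/manifest.json").write_text(json.dumps(manifest(Path("/srv/data"))))

# At test time: restore elsewhere, rebuild the manifest, and compare.
expected = json.loads(Path("/srv/manifest.json").read_text())
restored = manifest(Path("/tmp/restore-test"))
bad = sorted(k for k in expected if restored.get(k) != expected[k])
print("backup verified" if not bad else f"mismatched or missing: {bad}")
```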