This post was submitted on 23 Nov 2023

Self-Hosted Main


Hi everyone,

I need some help.

I'm currently self-hosting some of my applications on DigitalOcean, and I run them in containers using Portainer CE. I was wondering how you guys keep backups for the applications running on Docker.

I'm currently using DigitalOcean's snapshot feature, but is there a better way I could use? Any help on this is highly appreciated.

33 comments
[–] [email protected] 1 points 11 months ago (1 children)

Uuuh... Timeshift and Borg??

[–] [email protected] 2 points 11 months ago

Hey, that's the plot of First Contact.

[–] [email protected] 1 points 11 months ago

Proxmox Backup Server (PBS) snapshotting all my VMs / LXCs.

For external VPSes and anything that can't run the PBS client, I rsync important data into my home network first, then do a file-based backup of that data to PBS via the PBS client tool (proxmox-backup-client). All of this is automated through cron jobs.

Those backups then get synced to a second datastore for a bit of redundancy.
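
If it helps, the external-VPS leg can be sketched roughly like this; the hostnames, paths, and datastore name below are made up:

```
#!/bin/sh
# Run from cron, e.g. "30 2 * * *": stage the VPS data locally,
# then back the staged copy up to PBS file-based.
rsync -az --delete vps1.example.com:/srv/app-data/ /tank/staging/vps1/

proxmox-backup-client backup vps1.pxar:/tank/staging/vps1 \
  --repository backup@pbs@pbs.home.lan:external-vps
```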

[–] [email protected] 1 points 11 months ago

Unraid with Duplicacy and Appdata Backup, doing incremental backups to Backblaze.

[–] [email protected] 1 points 11 months ago

I use Duplicati to back up to a secure off-site location. Useful for something like Vaultwarden.

https://github.com/duplicati/duplicati
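
A scheduled run from the CLI could look roughly like this; the target URL, path, and passphrase are placeholders:

```
# Encrypted backup of a Vaultwarden data dir to an off-site SFTP target
duplicati-cli backup \
  "ssh://user@offsite.example.com/backups/vaultwarden" \
  /srv/vaultwarden/data \
  --passphrase="use-a-real-secret-here"
```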

[–] [email protected] 1 points 11 months ago

Most of mine are lightweight, so they just live in private Git repos.

For big data I have two NAS that sync on the daily.

[–] [email protected] 1 points 11 months ago (1 children)

Cronjobs to backup important folders to a separate disk

Git repo(s) for services & configs with weekly automated commits and pushes
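
A rough sketch of both jobs, with example paths and schedules:

```
#!/bin/sh
# Nightly: mirror important folders to the separate disk
rsync -a --delete /srv/important/ /mnt/backup-disk/important/

# Weekly (e.g. "0 3 * * 0" in crontab): commit and push service configs
cd /srv/services || exit 1
git add -A
git commit -m "automated backup $(date +%F)" || true   # no-op when nothing changed
git push origin main
```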

[–] [email protected] 1 points 11 months ago

I do the reverse… all configs are Ansible playbooks and files, and I just push them to the servers. That way I can spin up a new machine from scratch, completely automated, within minutes… just the time it takes the machine to set itself up.

[–] [email protected] 1 points 11 months ago

As others said, use volume mounts, and I back those up incrementally with Borg to minimize storage space requirements.
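
For anyone new to Borg, the incremental part comes from its deduplication; a minimal sketch with made-up paths:

```
# One-time repository setup (repokey stores the encryption key in the repo)
borg init --encryption=repokey /mnt/backup/borg-repo

# Each run only stores chunks that changed since the last archive
borg create --stats --compression zstd \
  /mnt/backup/borg-repo::'volumes-{now:%Y-%m-%d}' \
  /var/lib/docker/volumes
```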

[–] [email protected] 1 points 11 months ago

I use rdiff-backup to back up the volumes directory of my VPS to a local machine via VPN. The containers are stored in some public registry anyway. I also use Ansible for all the configurations and container settings.
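
Sketching that with placeholder addresses (rdiff-backup keeps a current mirror plus reverse increments):

```
# Pull the volumes directory from the VPS over the VPN tunnel
rdiff-backup root@10.8.0.2::/var/lib/docker/volumes /backups/vps-volumes

# Trim increments older than four weeks
rdiff-backup --remove-older-than 4W /backups/vps-volumes
```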

[–] [email protected] 1 points 11 months ago

I use Kopia. The CLI is very easy to use, and I have backups scheduled nightly. I back up all external mounts and my entire Portainer directory. It has helped in a pinch to restore busted databases.

I point the Kopia CLI at a WebDAV location I host on my NAS. For off-site backups I run daily backups of that Kopia repository to Google Cloud.

I'm not sure if Google Cloud is the best off-site backup solution, but I did a price comparison when I first selected it, and it was the best capacity for the price that I could find at the time.
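
For illustration, the WebDAV setup is only a couple of commands; the URL and paths below are invented:

```
# One-time: create the repository on the NAS over WebDAV
# (use "kopia repository connect webdav ..." on subsequent machines)
kopia repository create webdav --url="https://nas.home.lan/dav/kopia"

# Scheduled nightly: snapshot the external mounts and the Portainer directory
kopia snapshot create /mnt/appdata /opt/portainer
```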

[–] [email protected] 1 points 11 months ago

Almost everything I run is a Docker container, so I made /var/lib/docker a btrfs subvolume. Then I take daily incremental snapshots (via a cron job) and copy them to a secondary disk (also btrfs, using btrbk). Since they are btrfs snapshots they don't use a lot of disk space, and if I really need to roll back an entire day, I can.
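
The snapshot side of that is basically a one-liner per day (snapshot directory name is an example; btrbk then handles the copy to the second disk from its own config):

```
# Daily cron job: read-only snapshot of the Docker subvolume
btrfs subvolume snapshot -r /var/lib/docker "/snapshots/docker-$(date +%F)"
```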

[–] [email protected] 1 points 11 months ago

I haven't used DigitalOcean, but I run two Proxmox servers and two NAS, one of each at a different location.

I back up the containers and VMs that run in Proxmox to the NAS via NFS, and then a nightly script copies the backups from there to my remote NAS. It works; I haven't lost any data yet. I'm still thinking about a third backup in another location as well, but money is a thing 🤷
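
The nightly copy script can be tiny; something like this, with a made-up host and paths:

```
#!/bin/sh
# Push last night's Proxmox dumps from the local NAS to the remote one
rsync -a --partial /mnt/nas/proxmox-dump/ \
  backup@remote-nas.example.com:/volume1/proxmox-dump/
```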

[–] [email protected] 1 points 11 months ago (1 children)

BorgBackup, via borgmatic, to two backup targets: one in my home and a Hetzner Storage Box. Amongst other things, I include /var/lib/docker/volumes, which covers the named volumes that aren't bind-mounted elsewhere on the filesystem.

[–] [email protected] 1 points 11 months ago (1 children)

What retention do you run?
I'm setting up the same system, but don't know how far back I need to go. Currently I'm considering 7 daily backups, so I can restore to any point within the week, and 2-3 monthly backups in case there's an issue I miss for a really long period.

[–] [email protected] 1 points 11 months ago

Entirely up to your feelings, I guess - I run 7 dailies and 2 weeklies.
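
In raw Borg terms that retention maps to a prune policy like this (repository path is a placeholder; borgmatic expresses the same thing with keep_daily/keep_weekly in its config):

```
# Keep the last 7 daily and 2 weekly archives, delete the rest
borg prune --keep-daily 7 --keep-weekly 2 /mnt/backup/borg-repo
```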

[–] [email protected] 1 points 11 months ago

I use resticker to add an additional backup service to each compose file, which lets me customize some pre/post-backup actions. Works like a charm 👍

[–] [email protected] 1 points 11 months ago

Borg Backup to Hetzner Storage Box.

[–] [email protected] 1 points 11 months ago

A few hard drives that are stored offsite and rotated every few weeks.

[–] [email protected] 1 points 11 months ago

I just run a pg_dump through kubectl exec and pipe the stdout to a file on my master node. The same script then runs restic to send encrypted backups over to S3. I use the hostname flag on the restic command as kind of a hack to get backups per service name; this eliminates the risk of overwriting files or directories that share a name across services.
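
Put together, a sketch of that script; the deployment, database, and bucket names are invented:

```
#!/bin/sh
# Dump the database through kubectl, then ship the file with restic.
# RESTIC_PASSWORD and AWS credentials are assumed to be in the environment.
kubectl exec deploy/my-postgres -- pg_dump -U app mydb > /backups/my-service.sql

# --host tags the snapshot with the service name, so identically named
# dump files from different services never collide in the repository
restic -r s3:s3.amazonaws.com/my-backups backup \
  /backups/my-service.sql --host my-service
```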

[–] [email protected] 1 points 11 months ago

I back up all the mounted Docker volumes once every hour (snapshots). Additionally, I create dumps of all databases with https://github.com/tiredofit/docker-db-backup (once every hour or once a day, depending on the database).

[–] [email protected] 1 points 11 months ago

ZFS snapshots.
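
For anyone curious, the bare-bones version of that approach looks like this; the dataset and host names are examples:

```
# Snapshot the dataset that holds the Docker data
zfs snapshot tank/docker@$(date +%F)

# Optionally replicate incrementally to another pool or machine
zfs send -i tank/docker@2023-11-22 tank/docker@2023-11-23 | \
  ssh backup-host zfs recv -F backup/docker
```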

[–] [email protected] 1 points 11 months ago

Duplicati, to take live, crash-consistent backups of all my Windows servers and VMs with the Volume Shadow Copy Service (VSS).
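
Duplicati exposes VSS through its --snapshot-policy option; "required" makes the run fail rather than proceed without a VSS snapshot. A rough Windows CLI example, with made-up paths:

```
Duplicati.CommandLine.exe backup "file://D:\Backups\vms" "C:\VirtualMachines" --snapshot-policy=required
```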

[–] [email protected] 1 points 11 months ago (1 children)

When backing up Docker volumes, shouldn't the Docker container be stopped first? I can't see any support for that in the backup tools mentioned.

[–] [email protected] 1 points 11 months ago (1 children)

Yes, the containers do need to be stopped. I actually built a project that does exactly that.
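
For reference, the underlying pattern is simple enough to sketch by hand (stack path is an example):

```
#!/bin/sh
# Stop the stack, copy its data while nothing is writing, start it again
cd /srv/myapp || exit 1
docker compose stop
rsync -a /srv/myapp/data/ /mnt/backup/myapp/
docker compose start
```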

[–] [email protected] 1 points 11 months ago

Thanks, I will look into this.

[–] [email protected] 1 points 11 months ago

I use Nautical. It will stop your containers before performing an rsync backup.

[–] [email protected] 1 points 11 months ago

I have bind mounts to NFS shares that are backed by ZFS pools, with snapshots and sync jobs to another storage device. All containers are ephemeral.

[–] [email protected] 1 points 11 months ago

On Proxmox, I use a Hetzner Storage Box as my backup solution.

[–] [email protected] 1 points 11 months ago (1 children)

I use Docker in Proxmox, and I back up all containers.

[–] [email protected] 1 points 11 months ago

I use an Ubuntu VM for all my containers in Proxmox and make backups of the VM onto my ZFS pool.

[–] [email protected] 1 points 11 months ago

For databases and data I use restic-compose-backup, because you can use labels in your Docker Compose files.

For config files I use a git repository.