[โ€“] [email protected] 1 points 11 months ago

I use Git for my docker compose files and have it set up as its own project, with one top-level subdirectory per server that the containers run on. For me, those are "docker-external" and "docker-internal", which is how I partition between containers I expose via Cloudflare and those I don't.
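
Roughly, the layout looks like this; the service names under each server are just examples, not my actual stack:

```bash
# One-time repo layout: one top-level directory per server,
# one subdirectory per service (service names are placeholders).
git init docker-compose
cd docker-compose
mkdir -p docker-external/nginx docker-external/vaultwarden \
         docker-internal/syncthing docker-internal/jellyfin
# each leaf directory holds that service's docker-compose.yml
```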

Then on each server, I clone the repo into my home directory and create a symbolic link, always called "docker", that points to that server's directory.
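
On a given server that amounts to something like this (the clone URL is a placeholder):

```bash
# On the server named "docker-internal"; the Gitea URL is illustrative.
git clone ssh://[email protected]/me/docker-compose.git ~/docker-compose
ln -s ~/docker-compose/docker-internal ~/docker
# ~/docker now resolves to this server's compose files, so the path
# is identical on every machine:
cd ~/docker/jellyfin && docker compose up -d
```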

That lets me manage my compose files nicely and push/pull to git.

I have Gitea running in one of the containers, with all of my repositories, including the docker-compose ones.

Then I have a VM that I run my scheduled jobs on, with an external disk attached. Every day, I pull my Gitea repos onto the external drive, then push from there into AWS CodeCommit.
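
The daily job is roughly the script below; the repo list, mount point, and CodeCommit region are placeholders, and it assumes Git credentials for both remotes are already configured on the VM:

```bash
#!/usr/bin/env bash
# Daily: mirror each Gitea repo onto the external disk, then push
# the mirror on to AWS CodeCommit. Names and paths are placeholders.
set -euo pipefail

REPOS=(docker-compose)          # placeholder repo list
BACKUP_DIR=/mnt/backup/git      # external HDD mount point

for repo in "${REPOS[@]}"; do
  mirror="$BACKUP_DIR/$repo.git"
  # first run only: take a full mirror clone from Gitea
  [ -d "$mirror" ] || git clone --mirror \
      "ssh://[email protected]/me/$repo.git" "$mirror"
  git -C "$mirror" remote update --prune    # refresh from Gitea
  git -C "$mirror" push --mirror \
      "https://git-codecommit.us-east-1.amazonaws.com/v1/repos/$repo"
done
```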

That gives me automated backups of my code on each server and in Gitea, then internally on my external HDD, and finally externally in AWS, which fits my 3-2-1 backup policy.

Then for my mounted volumes, I run Syncthing on each of my Docker hosts and on that VM with the external disk. A bi-weekly job syncs everything to my NAS, and the NAS backs up to Backblaze each week.
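
On the backup VM that boils down to a crontab line like this (paths and hostnames are placeholders, and "bi-weekly" is approximated as the 1st and 15th):

```bash
# crontab on the backup VM. Syncthing keeps /mnt/backup/volumes
# current on its own; cron only handles the hop to the NAS, and
# Backblaze runs from the NAS on its own weekly schedule.
# m  h  dom   mon dow  command
0    3  1,15  *   *    rsync -a --delete /mnt/backup/volumes/ nas.local:/mnt/tank/docker-volumes/
```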

[โ€“] [email protected] 1 points 1 year ago

I set up a Git repo in AWS CodeCommit, and I set up a script on a backup VM that has an external HDD mounted. A cron job pulls from Gitea and then pushes to CodeCommit via a Git remote.
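
In practice that's just an ordinary clone with CodeCommit added as a second remote, something like this (URLs and schedule are placeholders):

```bash
# One-time setup on the backup VM; URLs are illustrative.
git clone ssh://[email protected]/me/docker-compose.git /mnt/backup/git/docker-compose
cd /mnt/backup/git/docker-compose
git remote add codecommit \
    https://git-codecommit.us-east-1.amazonaws.com/v1/repos/docker-compose

# The cron job then pulls from Gitea and pushes to the extra remote:
# 0 4 * * *  cd /mnt/backup/git/docker-compose && git pull --ff-only && git push codecommit --all
```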

That gives me a backup of the Git repos on the external HDD and externally via AWS, plus my Gitea is backed up twice a week to my TrueNAS.