Yeah, for sure. I run the server + a bunch of bridges (WhatsApp, Signal, Telegram, ChatGPT) on an old Atom NUC with 8 GB of RAM, and it only actually uses 2 GB.
Here's the documentation for the playbook: https://github.com/spantaleev/matrix-docker-ansible-deploy
I can highly recommend it. It takes some reading to set up because it's insanely configurable, but in the end I have a config file with maybe 20 statements in it, and that sets it all up and keeps it up to date.
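To give an idea of the scale: a minimal `vars.yml` for the playbook looks roughly like this. The variable names are from memory and can differ between playbook versions, so treat it as a sketch and check the playbook docs before copying anything.

```yaml
# inventory/host_vars/matrix.example.com/vars.yml -- illustrative sketch only
matrix_domain: example.com
matrix_homeserver_implementation: synapse

# Shared secret (use a long random string on a real deployment)
matrix_homeserver_generic_secret_key: 'CHANGE_ME'

# Bridges: each one is a single toggle here, plus whatever per-bridge
# options (API keys, database password, etc.) your playbook version requires
matrix_mautrix_whatsapp_enabled: true
matrix_mautrix_telegram_enabled: true
matrix_mautrix_signal_enabled: true
```

After that, a single `ansible-playbook -i inventory/hosts setup.yml --tags=setup-all,start` run (or whatever the current docs say) installs and updates everything.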
Just tried out that playbook to set up a staging server, and it works pretty well.
I feel like it's a bit too magical, though. I like knowing how all the software I'm using is installed and configured, and introducing another layer of abstraction makes that harder. I have particular ways I like things such as my web server (Nginx), database servers, and Let's Encrypt (certbot) to be configured, and I want to keep it that way. I think I'll just use the Ansible playbook for the staging server, and set up the real server using the Docker containers directly, based on documentation from the upstream projects (Synapse, etc.).
It looks like they have both Docker containers and Debian packages available, so I'll have to see if it's worth using the Debian packages instead.
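For comparison, the plain-Docker route with upstream's image really isn't much work either. A rough sketch based on the `matrixdotorg/synapse` image (paths, domain and the Nginx/certbot side are placeholders for whatever you already run):

```yaml
# docker-compose.yml -- minimal Synapse-from-upstream sketch.
# Generate the initial config once with:
#   docker run -it --rm -v ./synapse-data:/data \
#     -e SYNAPSE_SERVER_NAME=example.com -e SYNAPSE_REPORT_STATS=no \
#     matrixdotorg/synapse:latest generate
services:
  synapse:
    image: matrixdotorg/synapse:latest
    restart: unless-stopped
    volumes:
      - ./synapse-data:/data          # homeserver.yaml, signing key, media store
    ports:
      - "127.0.0.1:8008:8008"         # your own Nginx + certbot terminate TLS in front
```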
That's true. They actually stopped supporting Nginx recently, which really bothered me too, because I want to keep using self-signed certs (my server is only reachable internally and I do not want to expose it to the internet), and the new server they use (I forgot which) didn't really have that option. So right now I can't update until I fix that.
And yes, it is totally feasible to use upstream! Not a problem at all.
I would recommend using the Docker containers though, as the whole Debian thing becomes a bit of a mess with the different Python requirements for some of the bridges. I tried that in a long-forgotten past, and there's a reason I'm trying to forget it 🤭
Like you, I know the Ansible playbook has its limits (for example, one other thing I run into is that I want to run several instances of the same bridge, e.g. to bridge two WhatsApp accounts!), but I do think Docker is the way to go. I'm interested to hear how you fare though, as it's been a long time since I tried that.
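For what it's worth, outside the playbook the several-instances case is basically just two containers with separate data directories and separate appservice registrations. A rough sketch (the image name is the mautrix one, everything else is a placeholder):

```yaml
services:
  whatsapp-personal:
    image: dock.mau.dev/mautrix/whatsapp:latest
    restart: unless-stopped
    volumes:
      - ./whatsapp-personal:/data   # its own config.yaml + registration.yaml
  whatsapp-work:
    image: dock.mau.dev/mautrix/whatsapp:latest
    restart: unless-stopped
    volumes:
      - ./whatsapp-work:/data       # second account: different appservice id, bot username and port
```

Each registration file then gets listed under `app_service_config_files` in Synapse's homeserver.yaml.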
If you have your own domain name, you can get Let's Encrypt certificates for internal servers by using DNS challenges instead of HTTP challenges. I use subdomains like `whatever.int.example.com` for my internal systems. Of course, it's possible that the Ansible playbook doesn't support that...
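Something like this, for example, using one of certbot's DNS plugin images as a one-shot compose job. The Cloudflare plugin, e-mail address and paths here are just placeholders for whatever your DNS provider needs:

```yaml
# docker-compose.certbot.yml -- DNS-01 issuance sketch; re-run periodically to renew
services:
  certbot:
    image: certbot/dns-cloudflare:latest
    volumes:
      - ./letsencrypt:/etc/letsencrypt
      - ./cloudflare.ini:/cloudflare.ini:ro    # API token for the DNS zone
    command: >
      certonly --non-interactive --agree-tos -m admin@example.com
      --dns-cloudflare --dns-cloudflare-credentials /cloudflare.ini
      -d whatever.int.example.com
```

Nothing has to be reachable from the internet on ports 80/443; the challenge is just a TXT record in the zone.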
Thanks for the note about Python and the Debian packages. That's a good point. I'll definitely use the Docker containers.