Selfhosted
A place to share alternatives to popular online services that can be self-hosted without giving up privacy or locking you into a service you don't control.
Rules:
- Be civil: we're here to support and learn from one another. Insults won't be tolerated. Flame wars are frowned upon.
- No spam posting.
- Posts have to be centered around self-hosting. There are other communities for discussing hardware or home computing. If it's not obvious why your post topic revolves around self-hosting, please include details to make it clear.
- Don't duplicate the full text of your blog or GitHub here. Just post the link for folks to click.
- Submission headline should match the article title (don't cherry-pick information from the title to fit your agenda).
- No trolling.
Resources:
- selfh.st Newsletter and index of selfhosted software and apps
- awesome-selfhosted software
- awesome-sysadmin resources
- Self-Hosted Podcast from Jupiter Broadcasting
Any issues with the community? Report them using the report flag.
Questions? DM the mods!
I built a PC for video editing because it was becoming impossible to do on my laptop. Then I realized I could also use the GPU to run large language models myself.
So this week I've been setting up Ollama and Open WebUI so I can take some of the queries I'd normally ask ChatGPT and ask them on my own computer instead, even when I'm away from home.
This way I don't need to send sensitive data to the USA or China. It works quite well, but I can only use smaller models, up to about 14B parameters, because my graphics card only has 12 GB of VRAM.
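If it helps anyone, Ollama exposes an HTTP API on port 11434 by default, so besides Open WebUI you can also script queries against the same local models. Here's a minimal sketch in Python; the model tag `qwen2.5:14b` is just an example of a 14B model that fits in 12 GB of VRAM once quantized (roughly 8-9 GB for the weights, leaving some room for context), so swap in whatever you've actually pulled.

```python
import requests

# Minimal sketch: query a local Ollama server over its REST API.
# Assumes Ollama is running on its default port (11434) and the model
# has already been pulled with `ollama pull qwen2.5:14b` (example tag).
OLLAMA_URL = "http://localhost:11434/api/generate"

def ask_local_model(prompt: str, model: str = "qwen2.5:14b") -> str:
    """Send a single prompt to the local Ollama instance and return its reply."""
    response = requests.post(
        OLLAMA_URL,
        json={"model": model, "prompt": prompt, "stream": False},
        timeout=300,  # local 14B models can take a while on consumer GPUs
    )
    response.raise_for_status()
    # With stream=False, Ollama returns one JSON object with the full reply.
    return response.json()["response"]

if __name__ == "__main__":
    print(ask_local_model("Summarize why self-hosting an LLM helps with privacy."))
```

Pointing this at a Tailscale or WireGuard address instead of localhost is one way to reach the same setup while away from home without exposing the port to the internet.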