Idea: Selfhosted Interface for LLMs
(lemmy.blahaj.zone)
This is already a thing; there are a myriad of LLM chat interfaces that can either connect to models you are running locally or connect to the APIs of providers. "Open WebUI", "librechat" and "Big-AGI" are web interfaces. On desktop you have things like jan.ai and a lot more.
Your goal is self-defeating: even without tracking, the LLM provider is still absorbing everything you say to it.
If you're concerned, just run a local LLM and skip the web services entirely.
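As a minimal sketch of what "run a local LLM" looks like in practice: many self-hosted servers (Ollama, llama.cpp's server, LocalAI) expose an OpenAI-compatible chat endpoint, so a few lines of standard-library Python are enough to talk to one. The URL, port, and model name below are assumptions for illustration; adjust them to whatever your local server actually exposes.

```python
import json
import urllib.request

# Hypothetical local endpoint; Ollama, for example, serves an
# OpenAI-compatible chat API on localhost by default.
LOCAL_API = "http://localhost:11434/v1/chat/completions"


def build_chat_request(model, messages):
    """Build the JSON payload expected by an OpenAI-compatible chat endpoint."""
    return {"model": model, "messages": messages, "stream": False}


def ask_local_llm(prompt, model="llama3"):
    """Send one prompt to a locally running model and return its reply text."""
    payload = build_chat_request(model, [{"role": "user", "content": prompt}])
    req = urllib.request.Request(
        LOCAL_API,
        data=json.dumps(payload).encode("utf-8"),
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        body = json.load(resp)
    return body["choices"][0]["message"]["content"]
```

Because the request never leaves your machine, nothing you type is absorbed by a third-party provider, which is the whole point of the local-only approach.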