submitted 1 day ago by [email protected] to c/[email protected]
[-] [email protected] 4 points 1 day ago

Nah I mean, I was hoping it would be fully self hosted and offline, but I guess that would require you to run the models yourself

[-] [email protected] 1 points 1 day ago

Yeah, I think this is part of their current business model, unfortunately.

I mean, it wouldn't be hard to implement OpenAI-compatible APIs. That would give access to pretty much any AI service, including self-hosted ones like Ollama.

But this would make their service unnecessary.
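To illustrate the point above: the OpenAI-compatible chat endpoint is just JSON over HTTP, so supporting it means a client can target a self-hosted backend simply by swapping the base URL. A minimal sketch, assuming Ollama's default port (11434) and a hypothetical locally pulled model name:

```python
import json

# Ollama exposes an OpenAI-compatible API under /v1 by default.
# The URL and model name below are assumptions; adjust for your setup.
BASE_URL = "http://localhost:11434/v1"

def chat_request(model, user_message):
    """Build the request body an OpenAI-compatible client would POST
    to <BASE_URL>/chat/completions."""
    return {
        "url": f"{BASE_URL}/chat/completions",
        "body": json.dumps({
            "model": model,  # e.g. a model pulled via `ollama pull`
            "messages": [{"role": "user", "content": user_message}],
        }),
    }

req = chat_request("llama3", "Hello!")
print(req["url"])
```

Because the request shape is identical to OpenAI's, the same client code works against the hosted service or a local instance; only the base URL changes.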

this post was submitted on 15 Jun 2025