Nah I mean, I was hoping it would be fully self-hosted and offline, but I guess that would require you to run the models yourself.
Yeah, I think this is part of their current business model, unfortunately.
I mean, it wouldn't be hard to implement OpenAI-compatible APIs. That would give access to pretty much any AI service, including self-hosted ones like Ollama.
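For example, here's a minimal sketch of what that looks like on the client side, assuming a local Ollama instance (which already exposes an OpenAI-compatible endpoint on port 11434) and some model you've pulled locally (the name "llama3" here is just a placeholder):

```python
# Sketch: point the standard OpenAI Python client at a local Ollama server.
# Ollama serves an OpenAI-compatible API at http://localhost:11434/v1.
from openai import OpenAI

client = OpenAI(
    base_url="http://localhost:11434/v1",  # local Ollama endpoint, no cloud involved
    api_key="ollama",  # the client requires a key, but Ollama ignores its value
)

response = client.chat.completions.create(
    model="llama3",  # placeholder: any model you've pulled with `ollama pull`
    messages=[{"role": "user", "content": "Hello, are you running locally?"}],
)
print(response.choices[0].message.content)
```

So any app that just lets you set a base URL and model name works the same way against a hosted provider or a fully local backend.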
But this would make their service unnecessary.