If you have a GPU in your PC, it's almost always faster to just run your own LLM locally, and you won't have this issue. Search for Ollama.
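As a minimal sketch of what this looks like, here is one way to query a locally running Ollama server over its default REST API (this assumes Ollama is installed and a model has already been pulled; the model name `llama3` and the prompt are illustrative):

```python
import json
import urllib.request

# Ollama serves a local REST API on port 11434 by default.
# Assumes a model has been pulled first, e.g. `ollama pull llama3`.
OLLAMA_URL = "http://localhost:11434/api/generate"

payload = {
    "model": "llama3",   # any locally pulled model works here
    "prompt": "Explain why pasting passwords into a chat log is risky.",
    "stream": False,     # return one complete response instead of a token stream
}

req = urllib.request.Request(
    OLLAMA_URL,
    data=json.dumps(payload).encode("utf-8"),
    headers={"Content-Type": "application/json"},
)

with urllib.request.urlopen(req) as resp:
    body = json.load(resp)
    print(body["response"])  # the generated text never leaves your machine
```

The point of the local setup is in that last comment: the prompt and response stay on your own hardware instead of being logged by a third-party service.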
Why would you even share your passwords with ChatGPT?