submitted 4 days ago by [email protected] to c/[email protected]
[-] [email protected] 14 points 3 days ago

I don’t think Yogthis is using a generative LLM produced by China on their computer.

Yes, that's exactly what I'm doing. I run DeepSeek and Qwen models using Ollama locally. They work great. I also use full DeepSeek online. It's absolutely bizarre that you would make this assumption without even asking. I also run Stable Diffusion models locally using https://github.com/AUTOMATIC1111/stable-diffusion-webui
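
To give a sense of how little setup this takes, here's a minimal sketch of querying one of these models through Ollama's Python client (pip install ollama). The model tag is just an example; use whatever you've actually pulled with ollama pull:

```python
# Minimal sketch: chat with a locally pulled model via Ollama's Python client.
# The model tag below is an example; swap in whatever you've pulled locally.
import ollama

response = ollama.chat(
    model="deepseek-r1:7b",  # example tag; a Qwen tag works the same way if pulled
    messages=[{"role": "user", "content": "Explain mixture-of-experts in one paragraph."}],
)
print(response["message"]["content"])
```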

I looked around; there are two of them, and neither seem to have their source published on an English-speaking website.

No you haven't because if you did then you'd quickly find plenty of Chinese models ready to use that are in English. I've linked a few in this comment https://lemmygrad.ml/post/8454753/6668311

You've literally done zero investigation before spewing nonsense here. It's incredible to see such low effort trolling on here.

[-] [email protected] -5 points 3 days ago* (last edited 3 days ago)

@yogthos Ollama is, if I recall, an American project sponsored by Meta. And Stable Diffusion, as already mentioned, is American too. So, after your very first post to me here proclaimed that you're using Chinese software, you now admit that you aren't.

In your post there, you have two text-to-image generation models. Again, one of them I'd already found (it's the one I could find nothing but broken links for), and one of them I hadn't.

(Edit: Two that I had; I see Qwen is the one that Alibaba released, so my bad on reading comprehension.)

I'm not lacking in research; you're lacking in honesty. And you accuse me of low effort trolling when you're literally using AI to manufacture replies to me? Dude.

[-] [email protected] 11 points 2 days ago

you're doing an awful lot of talking out of your ass in this thread

[-] [email protected] -4 points 2 days ago

@PoY Whatever. Maybe I am, if I'm the only one here who thinks it's obvious that none of this adds up. I'll bow out.

[-] [email protected] 12 points 3 days ago

Ollama is an open source project. The fact that this tech originates in the US does not mean it shouldn't be used. Most of the software and hardware you use is of US origin in one way or another. The discussion was about the models themselves, which are of course Chinese. You are shamefully ignorant on the subject you're attempting to discuss here.

The only one lacking in honesty here is you, bud, and you ain't fooling anyone.

[-] [email protected] 2 points 2 days ago

On another note, Ollama is really shitty and they're trying to push their own API format. Please consider using the one true community-driven gigachad Bulgarian-developed 'llama.cpp'. KoboldCPP even has a nice web UI for it. There's a rough sketch of talking to a local llama-server below.
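
If you want a feel for it, here's a rough sketch of hitting a local llama.cpp server from Python. It assumes you've already started llama-server with some GGUF model; the port is the default and the prompt/fields are just placeholders:

```python
# Rough sketch, assuming a llama.cpp `llama-server` instance is already running
# locally with a GGUF model loaded, e.g.:
#   llama-server -m ./your-model.gguf
# llama-server exposes an OpenAI-compatible chat endpoint on port 8080 by default.
import requests

resp = requests.post(
    "http://localhost:8080/v1/chat/completions",
    json={
        "messages": [{"role": "user", "content": "Hello from llama.cpp!"}],
        "max_tokens": 128,  # placeholder; tune to taste
    },
    timeout=120,
)
print(resp.json()["choices"][0]["message"]["content"])
```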

[-] [email protected] 2 points 2 days ago

oh thanks for the heads up, I've heard of llama.cpp but never bothered taking it for a spin.
