submitted 4 days ago* (last edited 4 days ago) by [email protected] to c/[email protected]

I've just rediscovered ollama, and it's come a long way: it has reduced the once-daunting task of locally hosting your own LLM (and getting it running on a GPU) to simply installing a deb! It also works on Windows and Mac, so it can help everyone.
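
In case it's useful, here's a quick sketch of what "running" looks like once it's installed: ollama serves a small REST API on localhost:11434 by default, so you can talk to a model from a few lines of Python. (The model name below is just an example, and it assumes you've already pulled it, e.g. with `ollama pull llama3`.)

```python
# Minimal sketch: querying a default ollama install through its local REST API.
# Assumes ollama is running on its default port (11434) and the model has
# already been pulled (`ollama pull llama3`). Requires `pip install requests`.
import requests

resp = requests.post(
    "http://localhost:11434/api/generate",
    json={
        "model": "llama3",            # example model name; any pulled model works
        "prompt": "Why is the sky blue?",
        "stream": False,              # one JSON object instead of a token stream
    },
    timeout=120,
)
resp.raise_for_status()
print(resp.json()["response"])        # the generated text
```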

I'd like to see Lemmy grow dedicated communities for specific technical niches, rather than everyone hunting for the best-fitting existing community, which is subjective and makes information hard to find. So I created [email protected] for everyone to discuss, ask questions, and help each other out with ollama!

So please join, subscribe, and feel free to ask questions, share tips and projects, and help out where you can!

Thanks!

[-] [email protected] 56 points 4 days ago* (last edited 4 days ago)

TBH you should fold this into localllama? Or open source AI?

I have very mixed (mostly bad) feelings on ollama. In a nutshell, they're kinda Twitter attention grabbers that give zero credit/contribution to the underlying framework (llama.cpp). And that's just the tip of the iceberg, they've made lots of controversial moves, and it seems like they're headed for commercial enshittification.

They're... slimy.

They like to pretend they're the only way to run local LLMs and blot out any other discussion, which is why I feel kinda bad about a dedicated ollama community.

It's also a highly suboptimal way for most people to run LLMs, especially if you're willing to tweak.

I would always recommend Kobold.cpp, tabbyAPI, ik_llama.cpp, Aphrodite, LM Studio, the llama.cpp server, sglang, the AMD Lemonade server, or any number of other backends over it. Literally anything but ollama.
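
For what it's worth, most of those backends (the llama.cpp server, LM Studio, tabbyAPI, Aphrodite, sglang, etc.) expose an OpenAI-compatible API, so switching between them is usually just a base-URL change. A rough sketch, assuming a server already running locally (the port here is llama.cpp's server default, 8080; the model name is a placeholder that some servers ignore):

```python
# Minimal sketch: talking to any OpenAI-compatible local backend.
# Assumes `pip install openai` and a local server already running;
# the port and model name below are assumptions, not specific to one backend.
from openai import OpenAI

client = OpenAI(
    base_url="http://localhost:8080/v1",  # change this URL to switch backends
    api_key="none",  # local servers generally ignore the key, but the client requires one
)

reply = client.chat.completions.create(
    model="local-model",  # many local servers ignore or override this field
    messages=[{"role": "user", "content": "Why is the sky blue?"}],
)
print(reply.choices[0].message.content)
```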


...TL;DR: I don't like the idea of focusing on ollama at the expense of other backends. Running LLMs locally should be the community, not ollama specifically.

[-] [email protected] 3 points 1 day ago

Thanks for the Lemonade hint. For Ryzen AI: https://github.com/lemonade-sdk/lemonade (Linux is CPU-only for now)

[-] [email protected] 1 points 1 day ago

You can still use the iGPU, which might be faster in some cases.
