submitted 2 weeks ago* (last edited 2 weeks ago) by [email protected] to c/[email protected]
[-] [email protected] 77 points 2 weeks ago
[-] [email protected] 12 points 2 weeks ago

I also self-host, but I use OpenWebUI as a front end and ollama as a back end. Which one is this?
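For anyone curious what "ollama as a back end" means in practice: a front end like OpenWebUI just talks to ollama's local HTTP API. Here's a rough sketch of hitting that API directly, assuming ollama's default port and a `llama3.2` model tag (swap in whatever `ollama list` shows for you):

```python
# Minimal sketch of querying an ollama back end over its HTTP API,
# the same /api/generate endpoint a front end like OpenWebUI uses.
# Host and model name are assumptions; adjust for your setup.
import json
from urllib import request

OLLAMA_URL = "http://localhost:11434/api/generate"  # ollama's default port

ollama_payload = {
    "model": "llama3.2",               # assumed model tag
    "prompt": "Why self-host an LLM?",
    "stream": False,                   # one JSON response instead of a stream
}

def ask_ollama(url: str = OLLAMA_URL) -> str:
    """POST the payload and return the model's response text."""
    req = request.Request(
        url,
        data=json.dumps(ollama_payload).encode("utf-8"),
        headers={"Content-Type": "application/json"},
    )
    with request.urlopen(req) as resp:
        return json.loads(resp.read())["response"]

# print(ask_ollama())  # requires a running ollama instance
```

Nothing here is OpenWebUI-specific; that's the point of the split, since any front end that speaks this API can sit on top.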

[-] [email protected] 7 points 2 weeks ago* (last edited 2 weeks ago)

Looks like Kobold. You can set it up as a shared LLM server. The model looks like Dark Champion.

Edit: https://huggingface.co/DavidAU/Llama-3.2-8X3B-MOE-Dark-Champion-Instruct-uncensored-abliterated-18.4B-GGUF
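If you do run KoboldCpp as a shared server (e.g. pointing it at a GGUF like the one linked above), other machines can hit its generate endpoint directly. A rough sketch, assuming the default port 5001 and the KoboldAI-style `/api/v1/generate` endpoint; prompt and sampler values are just placeholders:

```python
# Rough sketch of querying a KoboldCpp instance running as a shared server.
# Port, prompt, and sampler settings are assumptions; adjust to taste.
import json
from urllib import request

KOBOLD_URL = "http://localhost:5001/api/v1/generate"  # assumed default port

kobold_payload = {
    "prompt": "Once upon a time",
    "max_length": 80,      # tokens to generate
    "temperature": 0.7,
}

def kobold_generate(url: str = KOBOLD_URL) -> str:
    """POST the payload and return the generated text."""
    req = request.Request(
        url,
        data=json.dumps(kobold_payload).encode("utf-8"),
        headers={"Content-Type": "application/json"},
    )
    with request.urlopen(req) as resp:
        # Kobold-style servers return {"results": [{"text": "..."}]}
        return json.loads(resp.read())["results"][0]["text"]

# print(kobold_generate())  # needs a running KoboldCpp server
```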

[-] [email protected] 10 points 2 weeks ago

Huh, I tried that model in LM Studio and it's quite tame. It just asks me what I want to do with it.

[-] [email protected] 8 points 2 weeks ago

I’m dying. I love this so much.

[-] [email protected] 4 points 2 weeks ago

Self hosted hype guy

[-] [email protected] 4 points 2 weeks ago

Hahaha this is incredible. Can't wait to try stuff like this once I get my hands on more VRAM

[-] [email protected] 15 points 2 weeks ago

Yeah, that's the spirit, bro! VRAM in the butt = VRAM power!

[-] [email protected] 2 points 2 weeks ago

There’s no feeling quite like cumming with a bunch of vram inside of your urethra tho

this post was submitted on 31 May 2025
187 points (97.5% liked)
