submitted 3 weeks ago* (last edited 3 weeks ago) by [email protected] to c/[email protected]
[–] [email protected] 78 points 3 weeks ago (5 children)
[–] [email protected] 12 points 3 weeks ago (1 children)

I also self-host, but I use OpenWebUI as a front end and Ollama as a backend. Which one is this?
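
For context, OpenWebUI mostly just talks to Ollama's HTTP API under the hood. Here's a minimal sketch of querying an Ollama backend directly, assuming Ollama is on its default port 11434 and the model name is whatever you've already pulled (the name below is just a placeholder):

```python
# Minimal sketch: query a local Ollama server over its HTTP API.
# Assumes Ollama is listening on the default port (11434) and that the
# model named below has already been pulled with `ollama pull`.
import json
import urllib.request

payload = {
    "model": "llama3.2",   # placeholder; use whichever model you have installed
    "prompt": "Say hello in one sentence.",
    "stream": False,       # return one JSON object instead of a token stream
}

req = urllib.request.Request(
    "http://localhost:11434/api/generate",
    data=json.dumps(payload).encode("utf-8"),
    headers={"Content-Type": "application/json"},
)

with urllib.request.urlopen(req) as resp:
    body = json.load(resp)

print(body["response"])
```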

[–] [email protected] 7 points 3 weeks ago* (last edited 3 weeks ago)

Looks like Kobold. You can set it up as a shared LLM model server. The model appears to be Dark Champion.

Edit: https://huggingface.co/DavidAU/Llama-3.2-8X3B-MOE-Dark-Champion-Instruct-uncensored-abliterated-18.4B-GGUF
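
If you want to try that setup, KoboldCpp can serve a GGUF like the one linked above over HTTP so multiple frontends can share it. A rough sketch of hitting such a server from a script; the port (5001) and the /api/v1/generate endpoint are KoboldCpp defaults as far as I know, so treat them as assumptions and check your own instance:

```python
# Rough sketch: send a prompt to a KoboldCpp server hosting a GGUF model.
# Port 5001 and /api/v1/generate are KoboldCpp defaults as far as I know;
# verify against your own instance.
import json
import urllib.request

payload = {
    "prompt": "Write a one-line greeting.",
    "max_length": 120,    # number of tokens to generate
    "temperature": 0.7,
}

req = urllib.request.Request(
    "http://localhost:5001/api/v1/generate",
    data=json.dumps(payload).encode("utf-8"),
    headers={"Content-Type": "application/json"},
)

with urllib.request.urlopen(req) as resp:
    result = json.load(resp)

# Kobold's native API returns generations under results[0]["text"].
print(result["results"][0]["text"])
```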

[–] [email protected] 10 points 3 weeks ago

Huh, I tried that model in LM Studio and it's quite tame. It just asks me what I want to do with it.

[–] [email protected] 8 points 3 weeks ago

I’m dying. I love this so much.

[–] [email protected] 4 points 3 weeks ago (1 children)

Hahaha, this is incredible. Can't wait to try stuff like this once I get my hands on more VRAM.

[–] [email protected] 15 points 3 weeks ago (1 children)

Yeah, that's the spirit, bro! VRAM in the butt = VRAM power!

[–] [email protected] 2 points 3 weeks ago

There’s no feeling quite like cumming with a bunch of vram inside of your urethra tho

[–] [email protected] 4 points 3 weeks ago

Self-hosted hype guy