this post was submitted on 31 May 2025
182 points (97.9% liked)

submitted 1 week ago* (last edited 1 week ago) by Goten@piefed.social to c/ai_@lemmy.world
[–] MyNamesTotallyRobert@lemmynsfw.com 77 points 1 week ago (5 children)
[–] bulwark@lemmy.world 12 points 1 week ago (1 children)

I also self-host, but I use OpenWebUI as a front end and ollama as a backend. Which one is this?
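For reference, front ends like OpenWebUI typically talk to an ollama backend over ollama's local HTTP API (default port 11434). A minimal sketch, assuming ollama is running and a model tag such as `llama3.2` has already been pulled:

```python
# Minimal sketch: call an ollama backend the way a front end such as OpenWebUI would,
# via ollama's HTTP API on its default port 11434.
import requests

resp = requests.post(
    "http://localhost:11434/api/generate",
    json={
        "model": "llama3.2",  # assumed model tag; use whatever you've pulled locally
        "prompt": "Say hello in one sentence.",
        "stream": False,      # return one JSON object instead of a token stream
    },
    timeout=120,
)
resp.raise_for_status()
print(resp.json()["response"])
```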

[–] ikidd@lemmy.world 7 points 1 week ago* (last edited 1 week ago)

Looks like Kobold; you can set it up as a shared LLM server. The model looks like Dark Champion.

Edit: https://huggingface.co/DavidAU/Llama-3.2-8X3B-MOE-Dark-Champion-Instruct-uncensored-abliterated-18.4B-GGUF
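If you want to try the linked model yourself, a minimal sketch of fetching one of the repo's GGUF quantizations with `huggingface_hub` so it can be loaded in a local server like Kobold; the quantization filename below is a placeholder, so check the repo's file list for the actual one:

```python
# Sketch: download a GGUF quantization of the linked Dark Champion repo.
# The filename is a placeholder -- browse the repo's Files tab for the real quant name.
from huggingface_hub import hf_hub_download

repo_id = (
    "DavidAU/Llama-3.2-8X3B-MOE-Dark-Champion-Instruct-"
    "uncensored-abliterated-18.4B-GGUF"
)
path = hf_hub_download(
    repo_id=repo_id,
    filename="dark-champion-Q4_K_M.gguf",  # placeholder, not the real filename
)
print(path)  # point Kobold (or another local server) at this file
```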

[–] ikidd@lemmy.world 10 points 1 week ago

Huh, I tried that model in LM Studio and it's quite tame. It just asks me what I want to do with it.
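For comparison, LM Studio can expose an OpenAI-compatible local server (default http://localhost:1234/v1), so a quick test of a loaded model looks roughly like this; the model identifier is an assumption and should match whatever LM Studio reports:

```python
# Sketch: query a model loaded in LM Studio via its OpenAI-compatible local server.
from openai import OpenAI

client = OpenAI(
    base_url="http://localhost:1234/v1",
    api_key="lm-studio",  # dummy key; the local server doesn't check it
)
reply = client.chat.completions.create(
    model="local-model",  # placeholder; use the identifier LM Studio shows for the loaded model
    messages=[{"role": "user", "content": "Introduce yourself in one line."}],
)
print(reply.choices[0].message.content)
```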

[–] trungulox@lemm.ee 8 points 1 week ago

I’m dying. I love this so much.

[–] recklessengagement@lemmy.world 4 points 1 week ago (1 children)

Hahaha, this is incredible. Can't wait to try stuff like this once I get my hands on more VRAM.

[–] moody@lemmings.world 15 points 1 week ago (1 children)

Yeah, that's the spirit, bro! VRAM in the butt = VRAM power!

[–] trungulox@lemm.ee 2 points 1 week ago

There’s no feeling quite like cumming with a bunch of vram inside of your urethra tho

Self hosted hype guy