this post was submitted on 21 Feb 2025

Cybersecurity

jwiggler · 1 point · 1 day ago

Have you got any recs? I've got a 3080 in my machine atm.

[–] [email protected] 2 points 1 day ago

I'm not @[email protected], but here's a pretty barebones how-to article to get you started. Just know it can get as complicated as you like. For starters, you may want to stick to 7B and 14B models like mistral:7b and phi4:14b, as they'll fit easily on your card and let you test the waters.

If you're on Windows https://doncharisma.org/2024/11/23/self-hosting-ollama-with-open-webui-on-windows-a-step-by-step-guide/

If you're using Linux https://linuxtldr.com/setup-ollama-and-open-webui-on-linux/

If you want a container https://github.com/open-webui/open-webui/blob/main/docker-compose.yaml
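Once Ollama is running (whichever of the guides above you follow), you don't strictly need Open WebUI to talk to it: Ollama exposes an HTTP API on port 11434 by default. A minimal sketch, assuming Ollama is running locally and you've already pulled mistral:7b (the model name and prompt are just examples):

```python
import json
import urllib.request

OLLAMA_URL = "http://localhost:11434/api/generate"  # Ollama's default local endpoint

def build_request(model: str, prompt: str) -> dict:
    # stream=False asks Ollama to return one complete JSON object
    # instead of a stream of partial responses.
    return {"model": model, "prompt": prompt, "stream": False}

def ask(model: str, prompt: str) -> str:
    payload = json.dumps(build_request(model, prompt)).encode()
    req = urllib.request.Request(
        OLLAMA_URL,
        data=payload,
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        return json.loads(resp.read())["response"]

if __name__ == "__main__":
    print(ask("mistral:7b", "In one sentence, what is a CVE?"))
```

Handy for scripting against the model once you've confirmed it fits on the card.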

[–] [email protected] 1 points 1 day ago* (last edited 1 day ago)

Locally? Arcee 14B and the 14B DeepSeek distill are currently the best models that will fit.

I'd recommend hosting them with TabbyAPI instead of Ollama, as they'll be much faster and more VRAM-efficient, but it's more fuss to set up.

Honestly, I would just try free APIs like Gemini and Groq through Open WebUI, or use really cheap APIs like OpenRouter. Newer 14B models are okay, but they definitely lack the "encyclopedic intelligence" larger models have.
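OpenRouter (like most of the cheap hosted options) speaks the OpenAI-compatible chat-completions format, so switching between local and hosted models is mostly a matter of changing the base URL and key. A sketch under the assumption that you've set an OPENROUTER_API_KEY environment variable; the model ID is purely illustrative:

```python
import json
import os
import urllib.request

API_URL = "https://openrouter.ai/api/v1/chat/completions"  # OpenAI-compatible endpoint

def build_chat_request(model: str, user_msg: str) -> dict:
    # Standard OpenAI-style chat payload: a list of role/content messages.
    return {"model": model, "messages": [{"role": "user", "content": user_msg}]}

def chat(model: str, user_msg: str) -> str:
    key = os.environ["OPENROUTER_API_KEY"]  # assumed to be set in your shell
    payload = json.dumps(build_chat_request(model, user_msg)).encode()
    req = urllib.request.Request(
        API_URL,
        data=payload,
        headers={
            "Content-Type": "application/json",
            "Authorization": f"Bearer {key}",
        },
    )
    with urllib.request.urlopen(req) as resp:
        return json.loads(resp.read())["choices"][0]["message"]["content"]

if __name__ == "__main__":
    # Model name is illustrative; check OpenRouter's model list for current IDs.
    print(chat("mistralai/mistral-7b-instruct", "Explain XSS in two sentences."))
```

The same payload shape works against a local TabbyAPI or Ollama OpenAI-compatible endpoint, so you can test cheap hosted models and local ones with one script.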