this post was submitted on 24 Jun 2025
Ollama - Local LLMs for everyone!
A place to discuss Ollama, from basic use, extensions and add-ons, and integrations, to using it in custom code to create agents.
I've been experimenting with it for different use cases.
I only have a GeForce GTX 1080 Ti in that machine, so some projects are a bit slow and I can't run the biggest models. What really matters, though, is the self-satisfaction I get from not using somebody else's model, or at least that's what I tell myself while waiting for responses.