this post was submitted on 10 Feb 2025
Australian Tech
You can run deepseek-r1:8b (one of the smaller distilled models) on a Raspberry Pi 5, though it's slow: https://www.tomshardware.com/raspberry-pi/how-to-run-deepseek-r1-on-your-raspberry-pi-5
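(For what it's worth, once ollama is installed and the model pulled, querying it takes only a few lines. This is a minimal sketch using the official ollama Python client rather than the CLI the linked guide uses; the prompt is just a placeholder.)

```python
# Minimal sketch: query a locally pulled deepseek-r1:8b via the ollama Python client.
# Assumes the ollama server is running and you've already done `ollama pull deepseek-r1:8b`.
import ollama  # pip install ollama

response = ollama.chat(
    model="deepseek-r1:8b",
    messages=[{"role": "user", "content": "Summarise this in one sentence: ..."}],
)
print(response["message"]["content"])
```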
I imagine any Win10 computer that anyone is still using at work would be considerably more powerful than an RPi5, and should be able to run r1:8b at a more comfortable speed, and possibly r1:14b.
You can find guides online for the system requirements of the various models, though I wouldn't necessarily trust them, as they may have been written with faster response times in mind than you actually need. Once you've got ollama set up on your machine, it shouldn't be much hassle to just try a few of the smaller distilled models yourself and find the one with a quality-to-speed trade-off you're happy with.
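If you want a rough number to compare them by, ollama's local HTTP API reports token counts and timings in its non-streaming responses, so a throwaway script like the sketch below can print tokens/sec for each tag you've pulled. The endpoint and field names are from ollama's documented REST API, but double-check them against whichever version you install; the model list here is just an example.

```python
# Rough tokens/sec comparison across locally pulled deepseek-r1 distills.
# Assumes the ollama server is listening on its default port (11434) and
# that each model tag below has already been pulled.
import requests

MODELS = ["deepseek-r1:1.5b", "deepseek-r1:8b", "deepseek-r1:14b"]  # example tags
PROMPT = "Explain what a Raspberry Pi is in two sentences."

for model in MODELS:
    r = requests.post(
        "http://localhost:11434/api/generate",
        json={"model": model, "prompt": PROMPT, "stream": False},
        timeout=600,  # the bigger distills can take a while on modest hardware
    )
    r.raise_for_status()
    data = r.json()
    # eval_count = generated tokens, eval_duration = generation time in nanoseconds
    tokens_per_sec = data["eval_count"] / (data["eval_duration"] / 1e9)
    print(f"{model}: {tokens_per_sec:.1f} tokens/sec")
    print(data["response"][:200], "...\n")  # eyeball the output quality too
```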