Uhh, oh, fair enough (゚∀゚)
Yeah, I’ve successfully run the cut-down version of deepseek-r1 through Ollama. The model itself is the 7b one (I’m VRAM-limited to 8GB). I ran it on an M1 Mac Mini; performance-wise it’s fast, and the quality of the generated content is okay.
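For reference, once Ollama is serving the model locally, here's a minimal Python sketch of how you can query it over its HTTP API (assuming the default port 11434 and the `deepseek-r1:7b` tag; swap in whatever model you actually pulled):

```python
# Minimal sketch: query a local Ollama server running deepseek-r1:7b.
# Assumes Ollama is serving on its default port (11434) and the model
# has already been pulled (e.g. `ollama pull deepseek-r1:7b`).
import requests

response = requests.post(
    "http://localhost:11434/api/generate",
    json={
        "model": "deepseek-r1:7b",
        "prompt": "Summarize this season's most discussed anime in one sentence.",
        "stream": False,  # return the full response at once instead of streaming
    },
    timeout=300,
)
response.raise_for_status()
print(response.json()["response"])
```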
Depending on your hardware and OS, you may or may not be able to run an LLM locally at a reasonable speed. You might want to check Ollama's GPU support list. You don't need a GPU, since it can run on the CPU, but it'll certainly be slower.
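If you want to sanity-check whether the model actually landed in VRAM or fell back to the CPU, here's a rough sketch using Ollama's `/api/ps` endpoint (this assumes a recent Ollama version that exposes that endpoint on the default port; field names may vary between releases):

```python
# Rough sketch: ask the local Ollama server which models are loaded
# and how much of each one sits in VRAM vs. system RAM.
import requests

resp = requests.get("http://localhost:11434/api/ps", timeout=10)
resp.raise_for_status()
for model in resp.json().get("models", []):
    total = model.get("size", 0)       # total bytes loaded
    in_vram = model.get("size_vram", 0)  # bytes resident in GPU memory
    frac = in_vram / total if total else 0
    print(f"{model['name']}: {frac:.0%} of weights in VRAM "
          f"({'GPU' if frac > 0.5 else 'mostly CPU'})")
```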
I have a very beefy PC, so I don't think VRAM or any hardware will really be the limitation, thankfully. Thanks for the links!