this post was submitted on 04 Feb 2025
29 points (93.9% liked)

Premieres sometime in 2025. Check out the ANN article for additional information. Synopsis of the source manga from AniList:

She will captivate anyone she sucks blood from!

Runa Ishikawa is a vampire. The cool and mysterious beauty is the most popular girl in class! But it turns out she's not very good at sucking blood? A new comedy about the pampering and feeding of a vampire!

[–] [email protected] 3 points 1 day ago (1 children)

Name...absolutely does not check out.

Uhh, oh, fair enough (゚∀゚)

Saliently enough, have you managed to try DeepSeek, or even get it set up locally?

Yeah, I’ve successfully run the cut-down version of deepseek-r1 through Ollama. The model itself is the 7B one (I’m VRAM-limited to 8GB). I ran it on an M1 Mac Mini; performance-wise it’s fast, and the quality of the generated content is okay.
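
If you'd rather poke at it from code than the CLI, here's a rough sketch using the official ollama Python client; it assumes the Ollama server is running and that a tag like deepseek-r1:7b has already been pulled (swap in whatever distill fits your VRAM):

```python
# Minimal sketch, not my exact setup: querying a locally served DeepSeek-R1
# distill through the ollama Python client (pip install ollama).
# Assumes the Ollama server is running and that deepseek-r1:7b has already
# been pulled with `ollama pull deepseek-r1:7b`.
import ollama

response = ollama.chat(
    model="deepseek-r1:7b",
    messages=[{"role": "user", "content": "Explain in two sentences how you work."}],
)

# R1-style models include their <think> reasoning inline in the reply text.
print(response["message"]["content"])
```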

Depending on your hardware and OS, you may or may not be able to run an LLM locally at a reasonable speed. You might want to check the GPU support for Ollama. You don’t need a GPU, since it can also run on the CPU, but that will certainly be slower.

[–] [email protected] 2 points 1 day ago

I have a very beefy PC, so I don't think VRAM or any hardware will really be the limitation, thankfully. Thanks for the links!