this post was submitted on 14 Dec 2023
LocalLLaMA
Community to discuss about LLaMA, the large language model created by Meta AI.
This is intended to be a replacement for r/LocalLLaMA on Reddit.
TL;DR: yeah, it's doable, just slow.
You can train without a GPU, it just takes longer. More RAM and a better CPU will help, up to a point. I don't think text generation is a particularly difficult task; you could probably do it with something like a Markov chain rather than an LLM if you don't care whether the output is particularly coherent.
Well, I use my laptop as a daily driver, so training an AI in the background, even when I'm not using it, seems a bit complicated. The Markov chain seems like an interesting alternative for what I'm looking for. Do any tools for this exist, or should I build one from scratch?
There are libraries that can do it. Here's one: https://pypi.org/project/PyDTMC/
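For the "from scratch" route: a word-level Markov chain text generator only needs a table mapping each word (or run of words) to the words observed to follow it, plus a random walk over that table. Here's a minimal sketch using only the Python standard library; the function names and the `order` parameter are just illustrative choices, not from any particular library:

```python
import random
from collections import defaultdict

def build_chain(text, order=1):
    """Map each tuple of `order` consecutive words to the words seen after it."""
    words = text.split()
    chain = defaultdict(list)
    for i in range(len(words) - order):
        key = tuple(words[i:i + order])
        chain[key].append(words[i + order])
    return chain

def generate(chain, length=20, seed=None):
    """Random-walk the chain, emitting up to `length` words."""
    rng = random.Random(seed)
    state = rng.choice(list(chain))
    out = list(state)
    while len(out) < length:
        successors = chain.get(state)
        if not successors:  # dead end: this state was never followed by anything
            break
        word = rng.choice(successors)
        out.append(word)
        state = state[1:] + (word,)  # slide the window forward one word
    return " ".join(out)

corpus = "the cat sat on the mat and the dog sat on the rug"
chain = build_chain(corpus, order=1)
print(generate(chain, length=10, seed=42))
```

A higher `order` makes the output more coherent (longer verbatim runs from the training text) at the cost of variety; with a small corpus, order 1 or 2 is about all you can use before it just replays the input.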