this post was submitted on 18 Jul 2023
LocalLLaMA
Welcome to LocalLLaMA! Here we discuss running and developing machine learning models at home. Let's explore cutting-edge open-source neural network technology together.
Get support from the community! Ask questions, share prompts, discuss benchmarks, and get hyped about the latest and greatest model releases! Enjoy talking about our awesome hobby.
As ambassadors of the self-hosting machine learning community, we strive to support each other and share our enthusiasm in a positive constructive way.
you are viewing a single comment's thread
They mention the possibility of parallelizing training. Is this something that could allow (or eventually lead to) distributed training? Something like Folding@Home for LLMs?
If so, I'm beyond excited. I honestly think that'll be a major step forward in the democratization of AI, if we can crowdsource training.
That's definitely a nifty idea. We've already got people doing distributed inference, so I can't see why we couldn't do something similar for training, especially if we learn better ways to combine training samples.
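For anyone curious what "combining" training work from many machines could look like, here's a minimal sketch (assuming PyTorch) of data-parallel gradient averaging: each worker computes gradients on its own data shard, and a coordinator averages them into a single shared update. All the names here are illustrative, not from any real distributed-training project.

```python
# Minimal sketch of data-parallel gradient averaging, assuming PyTorch.
# Each "worker" computes gradients on its own data shard; a coordinator
# averages those gradients and applies one shared update. Everything
# here is illustrative; no real distributed-training framework is used.
import torch
import torch.nn as nn

def local_gradients(model, x, y):
    # One worker's contribution: gradients from its local batch.
    model.zero_grad()
    loss = nn.functional.mse_loss(model(x), y)
    loss.backward()
    return [p.grad.clone() for p in model.parameters()]

def apply_averaged_step(model, worker_grads, lr=0.01):
    # Coordinator: average each parameter's gradient across workers,
    # then take a single SGD step.
    with torch.no_grad():
        for i, p in enumerate(model.parameters()):
            avg = torch.stack([g[i] for g in worker_grads]).mean(dim=0)
            p -= lr * avg

# Toy run: three "workers" share the same starting weights and each
# trains on a different random shard.
model = nn.Linear(4, 1)
shards = [(torch.randn(8, 4), torch.randn(8, 1)) for _ in range(3)]
worker_grads = [local_gradients(model, x, y) for x, y in shards]
apply_averaged_step(model, worker_grads)
print("updated weights:", model.weight.data)
```

Of course, a real Folding@Home-style system would also have to deal with stragglers, untrusted contributors, and syncing weights over slow home connections, which is where the hard part lives.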