this post was submitted on 08 Dec 2023
14 points (100.0% liked)

LocalLLaMA


Welcome to LocalLLaMA! Here we discuss running and developing machine learning models at home. Let's explore cutting-edge open source neural network technology together.

Get support from the community! Ask questions, share prompts, discuss benchmarks, get hyped at the latest and greatest model releases! Enjoy talking about our awesome hobby.

As ambassadors of the self-hosting machine learning community, we strive to support each other and share our enthusiasm in a positive, constructive way.


Early speculation is that it's an MoE (mixture of experts) of eight 7B models, so maybe not earth-shattering like their last release, but highly intriguing. Will update with more info as it comes out.

top 2 comments
[–] [email protected] 9 points 1 year ago (1 children)

Honestly, it's such a good idea to share models via P2P; it saves so much bandwidth. Of course there should still be a direct download for preservation, but still.
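[Editor's note: models shared via P2P are typically distributed as magnet links, which encode a BitTorrent info-hash identifying the content. A minimal sketch of parsing one, using only the Python standard library; the magnet URI below is a hypothetical example, not the actual release link.]

```python
from urllib.parse import urlparse, parse_qs

def parse_magnet(uri: str) -> dict:
    """Extract the info-hash and display name from a magnet URI."""
    if not uri.startswith("magnet:?"):
        raise ValueError("not a magnet URI")
    params = parse_qs(urlparse(uri).query)
    # xt = "exact topic", of the form urn:btih:<info-hash>
    xt = params["xt"][0]
    info_hash = xt.rsplit(":", 1)[1]
    return {"info_hash": info_hash, "name": params.get("dn", [""])[0]}

# Hypothetical magnet link for illustration only.
example = "magnet:?xt=urn:btih:0123456789abcdef0123456789abcdef01234567&dn=mixtral-8x7b"
print(parse_magnet(example))
```

Because the info-hash uniquely identifies the payload, anyone who saves the magnet link can verify a preserved copy matches the original release, which addresses the preservation concern above.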

[–] noneabove1182 3 points 1 year ago

The only concern I had was, my god, that's a lot of faith to put in this random Twitter account; hope they never get hacked, lol. But otherwise, yes, it's a wonderful idea, and it would be a good feature for Hugging Face to speed up downloads/uploads.