14 points | submitted 7 months ago by noneabove1182 to c/localllama

Early speculation is that it's an MoE (mixture of experts) of eight 7B models, so maybe not earth-shattering like their last release, but highly intriguing. Will update with more info as it comes out.
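For anyone unfamiliar with what "MoE of eight 7B models" would mean in practice: each token's hidden state is routed to a small subset of expert networks, so only a fraction of the total parameters run per token. Below is a minimal sketch of top-2 routing with toy dimensions I made up for illustration; it is not the actual architecture of this release.

```python
import numpy as np

# Toy sizes (hypothetical, just for illustration)
rng = np.random.default_rng(0)
n_experts, d_model = 8, 16

x = rng.standard_normal(d_model)                     # one token's hidden state
w_gate = rng.standard_normal((d_model, n_experts))   # router weights
experts = [rng.standard_normal((d_model, d_model))   # each "expert" is a toy
           for _ in range(n_experts)]                # linear layer here

# Router scores -> pick the top-2 experts for this token
logits = x @ w_gate
top2 = np.argsort(logits)[-2:]
weights = np.exp(logits[top2]) / np.exp(logits[top2]).sum()  # softmax over top-2

# Only the chosen experts run; output is their weighted combination
y = sum(w * (x @ experts[i]) for w, i in zip(weights, top2))
print(y.shape)  # same shape as the input hidden state
```

The point is the compute saving: with 8 experts and top-2 routing, each token touches roughly a quarter of the expert parameters, which is why an "8x7B" model can be far cheaper to run than a dense model of the same total size.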

top 2 comments
[email protected] 9 points 7 months ago

Honestly it's such a good idea to share models via p2p, it saves so much bandwidth. Of course there should still be a direct download for preservation, but still.

noneabove1182 3 points 7 months ago

The only concern I had was, my god, it's a lot of faith to put in this random Twitter account; hope they never get hacked lol. But otherwise, yes, it's a wonderful idea, and it would be a good feature for Hugging Face to speed up downloads/uploads.
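On the trust concern above: the usual mitigation when pulling large weights from a torrent is to compare the file against a checksum published out-of-band by the model's authors. A minimal sketch, using Python's standard hashlib (the `sha256_of` helper and the dummy file are mine, for illustration only):

```python
import hashlib

def sha256_of(path: str, chunk_size: int = 1 << 20) -> str:
    """Stream a large file in 1 MiB chunks and return its SHA-256 hex digest."""
    h = hashlib.sha256()
    with open(path, "rb") as f:
        while chunk := f.read(chunk_size):
            h.update(chunk)
    return h.hexdigest()

# Stand-in for a downloaded checkpoint (obviously not real weights)
with open("dummy.bin", "wb") as f:
    f.write(b"not real model weights")

digest = sha256_of("dummy.bin")
print(digest)
```

If the digest matches the one the authors published, a compromised seeder or account can't have swapped the payload without the mismatch being obvious.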

this post was submitted on 08 Dec 2023
