this post was submitted on 13 Aug 2023
10 points (91.7% liked)

LocalLLaMA


Welcome to LocalLLaMA! Here we discuss running and developing machine learning models at home. Let's explore cutting-edge open source neural network technology together.

Get support from the community! Ask questions, share prompts, discuss benchmarks, get hyped at the latest and greatest model releases! Enjoy talking about our awesome hobby.

As ambassadors of the self-hosting machine learning community, we strive to support each other and share our enthusiasm in a positive constructive way.

founded 2 years ago

There's been a lot of good work done over the past week by several pivotal members, and now the boss is back and focused on it. It's going to be a very breaking change, but I'm really excited about where this will lead us!

[–] Kerfuffle 2 points 2 years ago (1 children)

which would mean going forward no more breaking changes when new formats are worked on

I definitely wouldn't count on that.

But it does make some changes much easier, like adding or changing model-specific fields, which previously would have required a format change. Changes like dropping or altering support for existing quantizations would also break things independently of the model format itself.
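The extensibility being described comes from storing model metadata as length-prefixed key-value pairs rather than a fixed binary struct, so readers iterate by count and new fields don't shift anything that follows. A minimal Python sketch of that idea (the field names and byte layout here are made up for illustration, not the actual GGUF spec):

```python
import struct
import io

def write_metadata(buf, pairs):
    # u32 pair count, then for each pair a length-prefixed UTF-8
    # key and value (hypothetical layout, not real GGUF)
    buf.write(struct.pack("<I", len(pairs)))
    for key, value in pairs.items():
        for s in (key, value):
            data = s.encode("utf-8")
            buf.write(struct.pack("<I", len(data)))
            buf.write(data)

def read_metadata(buf):
    # Readers walk the declared count, so a newly added
    # model-specific key is just another entry in the dict;
    # a fixed struct would instead shift every later field.
    (count,) = struct.unpack("<I", buf.read(4))
    pairs = {}
    for _ in range(count):
        items = []
        for _ in range(2):
            (n,) = struct.unpack("<I", buf.read(4))
            items.append(buf.read(n).decode("utf-8"))
        pairs[items[0]] = items[1]
    return pairs

buf = io.BytesIO()
write_metadata(buf, {"general.name": "demo", "example.rope_scale": "2.0"})
buf.seek(0)
print(read_metadata(buf))
```

An older reader of such a format still parses files carrying fields it has never heard of, which is why only things like removing a quantization type stay breaking.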

[–] noneabove1182 2 points 2 years ago

oh yeah, definitely didn't mean "no more breaking changes", just that we've had several from ggml file format changes, so THAT portion of the breaking is going away