this post was submitted on 24 Nov 2024
16 points (94.4% liked)
LocalLLaMA
Community to discuss about LLaMA, the large language model created by Meta AI.
This is intended to be a replacement for r/LocalLLaMA on Reddit.
I have found the cause of the cut-off: by default, aider only sends 2048 tokens to ollama. That is why I hadn't noticed it anywhere else except for coding.
When running
/tokens
in aider, it reports the full context size, even though it will only send 2048 tokens to ollama.
To fix it, I needed to add a
.aider.model.settings.yml
file to the repository.

That's because ollama's default max ctx is 2048, as far as I know.
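The post doesn't show the contents of the settings file; a minimal sketch, assuming aider's model-settings format with `extra_params` passing `num_ctx` through to ollama (the model name and context size here are placeholders, not from the original post):

```yaml
# .aider.model.settings.yml — per-model overrides for aider
# Model name is a placeholder; substitute the Ollama model you actually use.
- name: ollama/llama3.1:8b
  extra_params:
    num_ctx: 8192  # raise the context window above ollama's 2048-token default
```

With this file in the repository root, aider should request the larger context from ollama instead of silently truncating at 2048 tokens.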