this post was submitted on 29 Jan 2024
63 points (90.9% liked)

LocalLLaMA


Community to discuss LLaMA, the large language model created by Meta AI.

This is intended to be a replacement for r/LocalLLaMA on Reddit.

founded 1 year ago
[–] [email protected] 2 points 9 months ago (1 children)

Hugging Face has an LLM plugin for code completion in Neovim, btw!

[–] [email protected] 1 points 9 months ago* (last edited 9 months ago) (1 children)

Oh nice! Got a link for anyone who comes across this? Save me and others a search, plz?

EDIT: NM. Got it. Gonna give it a try later.

LLM powered development for Neovim

[–] [email protected] 2 points 9 months ago (1 children)

If you use Ollama, you can try the fork I'm using. This is my config to make it work: https://github.com/Amzd/nvim.config/blob/main/lua/plugins/llm.lua
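For anyone skimming, a minimal sketch of what an llm.nvim config pointed at a local Ollama server can look like. This is illustrative, not a copy of the linked config: the model name, FIM tokens, and exact option names are assumptions and may differ between the upstream plugin and the fork, so check the link above for the real setup.

```lua
-- Sketch: llm.nvim talking to a local Ollama server.
-- Assumes Ollama is running on its default port (11434) and a
-- CodeLlama-style code model is pulled; adjust names to taste.
require("llm").setup({
  backend = "ollama",
  model = "codellama:7b-code",        -- hypothetical model tag
  url = "http://localhost:11434",     -- default Ollama address
  -- fill-in-the-middle tokens used by CodeLlama code models
  fim = {
    enabled = true,
    prefix = "<PRE> ",
    middle = " <MID>",
    suffix = " <SUF>",
  },
  context_window = 4096,
})
```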

[–] [email protected] 0 points 9 months ago

Nice. Thanks. I'll save this post in case I use Ollama in the future. Right now I use a codellama model and a mythomax model, but I'm not running them via a localhost server, just getting output in the terminal or in LM Studio.

This looks interesting though. Thanks!