this post was submitted on 12 Dec 2024

LocalLLaMA


Community to discuss LLaMA, the large language model created by Meta AI.

This is intended to be a replacement for r/LocalLLaMA on Reddit.

Fixed it (sh.itjust.works)
submitted 5 days ago* (last edited 5 days ago) by HumanPerson to c/localllama
 

Seriously though, does anyone know how to use Open WebUI with the new version?

Edit: if you go into the ollama container with `sudo docker exec -it <container-name> bash`, you can pull models from inside it, e.g. `ollama pull llama3.1:8b`, and then have them available.
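For anyone following along, the edit above can be sketched as the commands below. This is a minimal sketch assuming the Ollama container is named `ollama` (check the actual name with `docker ps` first) and that you have sudo rights on the Docker host:

```shell
# List running containers to find the Ollama container's name.
sudo docker ps

# Open an interactive shell inside the container
# (replace "ollama" with the name shown by `docker ps`).
sudo docker exec -it ollama bash

# Inside the container, pull a model so it becomes available:
ollama pull llama3.1:8b

# Alternatively, skip the interactive shell and run the pull in one step:
sudo docker exec -it ollama ollama pull llama3.1:8b
```

Once the pull finishes, the model should appear in Open WebUI's model selector.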

[email protected] 1 point 4 days ago

For some reason, there are now two model settings pages: one in the workspace, and another in the admin settings (the old one was moved there). The feature you are looking for was probably just moved to the admin settings page.