this post was submitted on 19 Jun 2023
LocalLLaMA
Community to discuss about LLaMA, the large language model created by Meta AI.
This is intended to be a replacement for r/LocalLLaMA on Reddit.
Yeah, no problem! The first issue, however, is that Apple silicon plays a bit funny with this kind of setup, so I may need to make you a custom image to use. Otherwise you should have no problem running the -cpu build.
As for running the image itself, you can either run it from the command line with `docker run`, or you can make yourself a Docker Compose file.
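If you go the `docker run` route, something along these lines should work. Heads up: the port (7860) and the `/app/models` path are just my guesses based on the usual text-generation-webui defaults, not pulled from the image's docs, so double-check against the Docker Hub page:

```sh
# Pull the CPU build and run it detached, with the web UI exposed on the host.
# NOTE: the 7860 port and the /app/models path are assumptions; adjust them to
# whatever the image actually expects.
docker pull noneabove1182/text-gen-ui-cpu:latest
docker run -d \
  --name text-gen-ui \
  -p 7860:7860 \
  -v "$(pwd)/models:/app/models" \
  noneabove1182/text-gen-ui-cpu:latest
```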
I personally tend to go with the latter, and for that you can copy my docker-compose.yml file from here: https://hub.docker.com/r/noneabove1182/text-gen-ui-cpu
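If you don't want to dig through the Hub page right away, a minimal compose file would look roughly like this. Again, the service name, port, and volume paths here are placeholders I'm assuming; the real docker-compose.yml on the Hub page is the one to copy:

```yaml
# Rough sketch of a docker-compose.yml -- the service name, port, and volume
# paths are assumptions; use the actual file from the Docker Hub page above.
version: "3.8"
services:
  text-gen-ui:
    image: noneabove1182/text-gen-ui-cpu:latest
    ports:
      - "7860:7860"           # web UI exposed on the host
    volumes:
      - ./models:/app/models  # mount your local models directory
    restart: unless-stopped
```

Once it's saved, `docker compose up -d` in the same directory brings the container up, and `docker compose logs -f` lets you watch it start.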
I'll work on making a Mac-specific image and you can test it for me ;)