this post was submitted on 28 Jan 2025
Privacy
From that thread, switching runtimes in LMStudio might help. On Windows the shortcut is apparently Ctrl+Shift+R. There are three main kinds: Vulkan, CUDA, and CPU. Vulkan is a cross-vendor API that works on AMD, Intel, and nVidia GPUs; CUDA is nVidia-only; and CPU is a fallback for when the other two aren't working, but it is sssslllllooooooowwwwwww.
In the thread, one of the posters said they got it running on CUDA, and I imagine that would work well for you since you have an nVidia chip; or, if it's already using CUDA, try llama.cpp or Vulkan.
Yeah but for some reason it raises an error :(
Which runtimes did you try, specifically?
CUDA gives the error I told you about before; Vulkan works once and then it also stops working. I didn't try the CPU because I figured it would be so slow there'd be no point.
Okay, no worries. I'd at least try llama.cpp just to see how fast it is and to verify it works. If it doesn't work, or only works once and then quits, maybe the problem is LMStudio. In that case you might want to try GPT4All (https://www.nomic.ai/gpt4all); this is the one I started with way back in the day.
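If you want to rule LMStudio out entirely, you could also run llama.cpp's own CLI directly against the same model file. This is just a sketch: it assumes you've installed llama.cpp (so `llama-cli` is on your PATH), and the model path below is a hypothetical placeholder — point it at whatever GGUF file LMStudio downloaded.

```shell
# Hypothetical model path - replace with your actual GGUF file
MODEL="$HOME/models/your-model.gguf"

# -ngl 99 offloads as many layers as possible to the GPU; if this crashes
# the same way, the problem is the backend/driver, not LMStudio itself.
if command -v llama-cli >/dev/null 2>&1; then
    llama-cli -m "$MODEL" -ngl 99 -p "Hello"
else
    echo "llama-cli not found; install llama.cpp first"
fi
```

If it works here but not in LMStudio, that points the finger pretty squarely at the app rather than your GPU setup.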
If you care enough to post the logs from LMStudio after it crashes I'm happy to take a look for you and see if I can see the issue, as well 🙌