this post was submitted on 20 Nov 2023
-1 points (33.3% liked)

Self-Hosted Main


A place to share alternatives to popular online services that can be self-hosted without giving up privacy or locking you into a service you don't control.

For Example

We welcome posts that include suggestions for good self-hosted alternatives to popular online services, how they are better, or how they give back control of your data. Also include hints and tips for less technical readers.



What is currently the leading chatbot that can be self-hosted on your own computer?

I have heard about a lot of them, and it seems like everyone and their dog is making one, but I'm not sure which one I should use.

Edit: I am running a 3060 with 12 GB of VRAM.

[–] [email protected] 1 points 10 months ago (2 children)

I think the most advanced open-source LLM right now is considered to be Mistral 7B OpenOrca. You can serve it via the Oobabooga GUI (which lets you try other LLM models as well). If you don't have a GPU for inference it will still run, but it will be much slower and nothing like the ChatGPT experience.

https://github.com/oobabooga/text-generation-webui
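
If you just want to sanity-check the model from Python before setting up the GUI, a rough sketch with Hugging Face transformers looks something like the following. This is not Oobabooga's own code; the Open-Orca/Mistral-7B-OpenOrca repo id is an assumption, and a 12 GB card will likely need 4-bit/8-bit quantization since the fp16 weights are around 14 GB:

    # Minimal sketch: load Mistral-7B-OpenOrca with Hugging Face transformers.
    # Assumes the Open-Orca/Mistral-7B-OpenOrca repo id; a 12 GB GPU will likely
    # need quantization (e.g. bitsandbytes 4-bit), since fp16 weights are ~14 GB.
    from transformers import AutoModelForCausalLM, AutoTokenizer

    model_id = "Open-Orca/Mistral-7B-OpenOrca"
    tokenizer = AutoTokenizer.from_pretrained(model_id)
    model = AutoModelForCausalLM.from_pretrained(
        model_id, device_map="auto", torch_dtype="auto"
    )

    prompt = "What is a good self-hosted chatbot?"
    inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
    outputs = model.generate(**inputs, max_new_tokens=128)
    print(tokenizer.decode(outputs[0], skip_special_tokens=True))

Oobabooga does roughly this loading and generation for you and adds a chat UI plus model management on top.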

You can also try these models on your desktop using GPT4All, which doesn't support GPU acceleration at the moment.

https://gpt4all.io/index.html
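
GPT4All also has Python bindings if you'd rather script it than use the desktop app. A rough sketch, assuming the gpt4all package; the exact model filename below is a guess, so check the model list in the app:

    # Rough sketch: driving GPT4All from Python (CPU only).
    # The model filename is an assumption; GPT4All downloads it on first use.
    from gpt4all import GPT4All

    model = GPT4All("mistral-7b-openorca.Q4_0.gguf")
    with model.chat_session():
        reply = model.generate("Suggest a self-hosted chatbot setup.", max_tokens=200)
        print(reply)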

[–] [email protected] 1 points 10 months ago

Thanks for the write-up. Will the Coral USB accelerator work for inference?

[–] [email protected] 1 points 10 months ago

Mistral OpenOrca is a good one. I pull about 10 to 11 tokens/sec, which is very impressive. For some reason, though, I cannot get GPT4All to use my 2080 Ti even though it is selected in the settings.