This post was submitted on 20 Nov 2023
-1 points (33.3% liked)

Self-Hosted Main


A place to share alternatives to popular online services that can be self-hosted without giving up privacy or locking you into a service you don't control.

For Example

We welcome posts that include suggestions for good self-hosted alternatives to popular online services, how they are better, or how they give back control of your data. Also include hints and tips for less technical readers.


What is currently the leading chatbot that can be self hosted on your computer?

I have heard about a lot of them, and it seems like everyone and their dog is making one, but I'm not sure which one I should use.

Edit: I am running a 3060 with 12 GB of VRAM

top 22 comments
[–] [email protected] 1 points 11 months ago

I haven't tried running any myself, so my knowledge is just from glancing at a few discussions in AI communities when it's come up, but I think Mistral 7B might be the current best, or a fine-tune of it such as Mistral 7B OpenOrca or Mistral 7B OpenHermes.

[–] [email protected] 1 points 11 months ago

We have also made a ChatGPT alternative optimized for self-hosting at https://www.reddit.com/r/selfhosted/comments/187jmte/selfhosted_alternative_to_chatgpt_and_more/
Hope you like it :)

[–] [email protected] 1 points 11 months ago

LM Studio https://lmstudio.ai/

Easiest way to get started

[–] [email protected] 1 points 11 months ago (2 children)

I think the most advanced open-source LLM right now is considered to be Mistral 7B OpenOrca. You can serve it via the Oobabooga GUI (which lets you try other LLM models as well). If you don't have a GPU for inference, though, it will be nothing like the ChatGPT experience and will run much slower.

https://github.com/oobabooga/text-generation-webui

You can also try these models on your desktop using GPT4All, which doesn't support GPU acceleration at the moment.

https://gpt4all.io/index.html
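
If you'd rather script it than use the desktop app, here's a minimal sketch using the gpt4all Python bindings. The model filename is just a placeholder; the exact name depends on which GPT4All release and model you grab, so treat everything below as illustrative:

```python
# pip install gpt4all -- CPU-only inference, matching the comment above.
from gpt4all import GPT4All

# Placeholder filename; GPT4All downloads the model on first use if it isn't present.
model = GPT4All("mistral-7b-openorca.Q4_0.gguf")

with model.chat_session():
    reply = model.generate("Summarize why self-hosting matters.", max_tokens=128)
    print(reply)
```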

[–] [email protected] 1 points 11 months ago

Thanks for the write-up. Will the Coral USB accelerator work for inference?

[–] [email protected] 1 points 11 months ago

Mistral OpenOrca is a good one. I pull about 10 to 11 tokens/sec. Very impressive. For some reason, though, I cannot get GPT4All to use my 2080 Ti even though it is selected in the settings.

[–] [email protected] 1 points 11 months ago

I recommend LM Studio (https://lmstudio.ai/).

It allows you to manage and download models from Hugging Face, and suggests models compatible with your machine. Additionally, it can start a local HTTP server that functions similarly to OpenAI's API.
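
For anyone curious what that looks like in practice, here's a minimal sketch of calling the local server from Python. It assumes LM Studio's default port of 1234 and its OpenAI-style /v1/chat/completions endpoint; adjust the URL and settings to match your setup:

```python
# Query LM Studio's local OpenAI-compatible server (default: http://localhost:1234).
import requests

resp = requests.post(
    "http://localhost:1234/v1/chat/completions",
    json={
        "model": "local-model",  # LM Studio answers with whichever model is loaded
        "messages": [{"role": "user", "content": "What can I self-host for notes?"}],
        "temperature": 0.7,
    },
    timeout=120,
)
print(resp.json()["choices"][0]["message"]["content"])
```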

[–] [email protected] 1 points 11 months ago

Can someone explain what the benefit is of running all of these models locally? Are they better than the freely available ChatGPT? Any good reading on how to learn/get started with all this?

[–] [email protected] 1 points 11 months ago

I'd recommend koboldcpp for your backend and SillyTavern for your frontend, and I've been a fan of dolphin-2.1-mistral-7B. I've been using the Q4_K_S quant, but you could probably run a 13B model just fine.

I've heard good things about the Nous-Hermes models (I was a big fan of their Llama 2 model). I'd stick to Mistral variants, personally. Their dataset/training has far surpassed the base Llama 2 stuff, in my opinion.
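
If you want to sanity-check the koboldcpp backend before wiring up SillyTavern, here's a minimal sketch of posting a prompt to it directly. It assumes koboldcpp's default port of 5001 and its KoboldAI-style /api/v1/generate endpoint; the prompt format is just an example:

```python
# Send a prompt straight to a running koboldcpp instance (default: http://localhost:5001).
import requests

resp = requests.post(
    "http://localhost:5001/api/v1/generate",
    json={
        "prompt": "### Instruction:\nRecommend a self-hosted RSS reader.\n### Response:\n",
        "max_length": 200,
        "temperature": 0.7,
    },
    timeout=120,
)
print(resp.json()["results"][0]["text"])
```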

[–] [email protected] 0 points 11 months ago

This is OK. The best thing is to go to Hugging Face and explore. Join openllama here on Reddit; they have a leaderboard too. This is one of the good ones: https://huggingface.co/Open-Orca/Mistral-7B-OpenOrca

[–] [email protected] -5 points 11 months ago (5 children)

Are there any non-woke chatbots that will get basic facts correct, such as there being only 2 genders?

[–] [email protected] 3 points 11 months ago (1 children)
[–] [email protected] 1 points 11 months ago (1 children)
[–] [email protected] 1 points 11 months ago (1 children)

Oh, good one. I am overwhelmed by the intelligence and cogency of your argument.

[–] [email protected] 1 points 11 months ago
[–] [email protected] 3 points 11 months ago

You're conflating gender with sex, which is a human error. You also seem to be obsessed with this topic, since you're bringing it up when it's irrelevant. Cringe.

[–] [email protected] 1 points 11 months ago

I can see there are libtards and woketards in here lol

The earth is round and there are only 2 genders. So radical, right?

[–] [email protected] 1 points 11 months ago

fr fr i hate when fucking AI believes the round earth propaganda and lies instead of listening to the truth and facts

[–] [email protected] 0 points 11 months ago (2 children)

You phrased your question wrong - Reddit is too woke to respond reasonably.

And also there isn't.

Wait... Grok?

[–] [email protected] 1 points 11 months ago

Yeah I figured. My goal is to have negative karma so I'm on the right track

[–] [email protected] 0 points 11 months ago (1 children)

Reddit was woke 12 years ago when I first used it on another account. It's only gotten way worse. It's the hard truth.

[–] [email protected] 0 points 11 months ago

I've only used it for 3 or 4 years. It's always been that way to me. It's always been heavily used to push agendas that the majority don't agree with... but people think it's OK 'cause it doesn't affect them...