this post was submitted on 18 Mar 2024
996 points (89.9% liked)

Lemmy Shitpost


 
[–] [email protected] 2 points 6 months ago* (last edited 6 months ago) (4 children)

Technically the technology is open to the public, but regular people cannot afford to implement it.

What makes Large Language Models even barely functional is scaling up the training data and processing power behind several smaller models with specialized tasks: one model generates output from the input, another checks it for accuracy/coherency, and a third polices it for things that are not allowed.
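
A rough sketch of that generate/check/moderate split, just to show the shape of it (the StubModel class and every name here are made up for illustration; a real deployment would call actual models and is far more involved):

```python
from dataclasses import dataclass

@dataclass
class StubModel:
    """Stand-in for a real model endpoint; swap complete() for an actual client call."""
    name: str

    def complete(self, prompt: str) -> str:
        # A real deployment would call a hosted or local model here;
        # the stub just returns a canned string so the sketch runs.
        return f"(canned {self.name} output)"

draft_model = StubModel("generator")    # model 1: creates output from input
critic_model = StubModel("checker")     # model 2: checks accuracy/coherency
safety_model = StubModel("moderator")   # model 3: polices disallowed content

def answer(user_prompt: str) -> str:
    draft = draft_model.complete(user_prompt)
    coherent = "yes" in critic_model.complete(
        f"Answer yes or no: is this a coherent reply to '{user_prompt}'?\n{draft}"
    ).lower()
    flagged = "unsafe" in safety_model.complete(
        f"Reply 'unsafe' if this text breaks the content policy:\n{draft}"
    ).lower()
    return draft if coherent and not flagged else "Sorry, I can't help with that."

# With these dumb stubs the checker never says "yes", so you just get the refusal;
# the point is the three-model shape, not the output.
print(answer("Why is the sky blue?"))
```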

So unless you've got a datacenter and three high-powered servers with top-grade cooling systems and a military-grade power supply, fat fucking chance.

[–] [email protected] 4 points 6 months ago

I can run a small LLM on my 3060, but most of those models were originally trained on a cluster of A100s (maybe as few as ten, so more like one largish server than one datacenter).
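
For a sense of what "a small LLM on my 3060" means in practice: with Hugging Face transformers plus bitsandbytes you can load a ~7B model in 4-bit, which takes roughly 4-5 GB of VRAM (the model name below is just an example, and exact memory use depends on context length):

```python
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer, BitsAndBytesConfig

model_id = "mistralai/Mistral-7B-Instruct-v0.2"   # example ~7B model

# 4-bit quantised weights so the model fits on a 12 GB consumer card
bnb_config = BitsAndBytesConfig(load_in_4bit=True,
                                bnb_4bit_compute_dtype=torch.float16)

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(model_id,
                                             quantization_config=bnb_config,
                                             device_map="auto")

prompt = "Explain in one sentence why quantisation saves VRAM."
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
output = model.generate(**inputs, max_new_tokens=60)
print(tokenizer.decode(output[0], skip_special_tokens=True))
```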

BitNet came out recently and looks like it will lower these requirements significantly (essentially training a model with ternary weights instead of floats, which turns out not to lower quality all that much).
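
The core weight trick in BitNet b1.58 is roughly "absmean" quantisation: scale each weight matrix by its mean absolute value, then round every weight to -1, 0 or +1. The real method also changes how training itself is done; this numpy sketch only shows the rounding step:

```python
import numpy as np

def absmean_ternary(W: np.ndarray, eps: float = 1e-8):
    """Quantise a weight matrix to {-1, 0, +1} plus one float scale per matrix."""
    gamma = np.abs(W).mean() + eps              # per-matrix scale
    W_q = np.clip(np.round(W / gamma), -1, 1)   # every weight becomes -1, 0 or +1
    return W_q, gamma

W = np.random.randn(4, 4).astype(np.float32)
W_q, gamma = absmean_ternary(W)
print(W_q)            # ternary weights: ~1.58 bits of information each
print(W_q * gamma)    # dequantised approximation of the original W
```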

[–] [email protected] 3 points 6 months ago

They should do to AI what they make me do at work: More with less.

[–] [email protected] 3 points 6 months ago

Basically Mistral. Check /lmg/ on /g/; if your GPU is less than two years old, you can probably run a 32B quantised model.
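
Rough VRAM arithmetic for why a 32B quant is borderline feasible on a recent card (the bits-per-weight figure is a ballpark for a ~4-bit GGUF-style quant, and the KV cache adds a few GB on top):

```python
params = 32e9                 # 32B parameters
bits_per_weight = 4.5         # ballpark for a ~4-bit quant format
weight_gib = params * bits_per_weight / 8 / 2**30
print(f"~{weight_gib:.0f} GiB just for weights")   # about 17 GiB: fits a 24 GB card,
                                                   # needs partial CPU offload on 12-16 GB
```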

[–] [email protected] 2 points 6 months ago* (last edited 6 months ago)

Haha, try the entire datacenter.

If LLMs were practical on three servers, everyone and their mum would have an AI assistant product.