this post was submitted on 19 Jul 2024
328 points (95.3% liked)

Lemmy Shitpost

26670 readers
4099 users here now

top 23 comments
[–] [email protected] 60 points 3 months ago (1 children)

Forget the board -- can your wimpy-ass power supply handle the load?

[–] RandomlyRight 42 points 3 months ago (4 children)

No :(

I have a separate gaming PC and am considering just using that hardware for my NAS and creating a VM for gaming

[–] [email protected] 31 points 3 months ago (1 children)

You have put yourself into this black hole lol.

"I might just get a- Oh god my gaming rig is now my secondary PC and my credit card hurts. How did this happen?!"

3090s snicker evilly in the background

[–] RandomlyRight 3 points 3 months ago

I'm used to this from the whole "build your own gaming PC/NAS" rabbit hole. Now it’s just some extra GPUs and I might be able to have a two-in-one build (which will of course offset any costs for more 3090s /s)

[–] [email protected] 13 points 3 months ago* (last edited 3 months ago) (1 children)

Didn't someone just make a post about a game-stream server that would allow multiple gamers to use the same machine? ~~Not with VMs, but multiple users and virtual displays.~~ Using Docker.

You'd connect to it via any moonlight client, and it creates the environment for you to use the machine for whatever.

Edit: Yes

[–] RandomlyRight 2 points 3 months ago

Yeah, it’s a pretty cool project and I’ll definitely use it. However, nothing can beat a straight connection from monitor to GPU, so I’ll probably use passthrough for the GPU when gaming.
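A quick sanity check before committing to passthrough is whether the GPU sits in its own IOMMU group. A minimal Python sketch that walks the standard Linux sysfs path (the helper name and the `base` parameter are mine, not from any tool):

```python
import os

def list_iommu_groups(base="/sys/kernel/iommu_groups"):
    """Map IOMMU group number -> list of PCI device addresses.

    For clean GPU passthrough, the GPU (and its HDMI audio
    function) should sit in a group with no unrelated devices.
    """
    groups = {}
    if not os.path.isdir(base):
        return groups  # IOMMU disabled, or not a Linux host
    for group in sorted(os.listdir(base), key=int):
        devices = os.listdir(os.path.join(base, group, "devices"))
        groups[int(group)] = sorted(devices)
    return groups
```

If the GPU shares a group with other devices, you're usually looking at different slot placement or ACS-override hacks before passthrough works cleanly.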

[–] [email protected] 4 points 3 months ago

Look at it this way: not only can you run your own AI stuff yourself, you can have your own cloud gaming too!

[–] [email protected] 4 points 3 months ago

Check out Games on Whales for self hosted game streaming!

[–] [email protected] 8 points 3 months ago (6 children)

Why use commercial graphics accelerators to run a highly limited "AI"-specific workload? There are cards made specifically to accelerate machine-learning tasks that are highly potent with far less power draw than 3090s.

[–] [email protected] 30 points 3 months ago (1 children)

Well yeah, but 10x the price....

[–] [email protected] 1 points 3 months ago* (last edited 3 months ago) (2 children)

Not if it's for inference only. What do you think the "AI accelerators" they're putting in phones now are? Would they be as expensive or power-hungry as an entire 3090 if they were putting them in small devices?
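For intuition on why inference-only silicon can be so much smaller: a common rule of thumb puts a transformer forward pass at roughly 2 FLOPs per parameter per token, and a full training step (forward plus backward) at roughly 6. A sketch using those standard estimates (they're approximations, not measurements):

```python
def flops_per_token(n_params: int, training: bool = False) -> int:
    """Rough transformer compute estimate: ~2*N FLOPs per token
    for a forward pass (inference), ~6*N once the backward pass
    is included (training)."""
    return (6 if training else 2) * n_params

# Inference needs ~3x less compute per token than training,
# and no optimizer state or gradients in memory either --
# which is part of why phone NPUs can be tiny and cheap.
n = 7_000_000_000  # a hypothetical 7B-parameter model
ratio = flops_per_token(n, training=True) / flops_per_token(n)
```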

[–] [email protected] 8 points 3 months ago (1 children)

Ok,

Show me a PCI-E board that can do inference calculations as fast as a 3090 but is less expensive than a 3090.

[–] RandomlyRight 5 points 3 months ago

I'd be interested (and surprised) too

[–] RandomlyRight 1 points 3 months ago* (last edited 3 months ago) (1 children)

Yeah, show me a phone with 48GB RAM. It’s a big factor to consider. Actually, some people are recommending a Mac Studio because you can get it with 128GB of RAM or more, and it’s shared with the AI/GPU accelerator. Very energy efficient, but it sucks as soon as you want to do literally anything other than inference
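Back-of-the-envelope math on why memory is the binding constraint: weight footprint is roughly parameter count times bytes per parameter, so a 70B model needs ~140 GB at fp16 but only ~35 GB at 4-bit quantization. A hedged sketch (standard approximation, ignoring KV cache and runtime overhead):

```python
def weight_gb(n_params: float, bits_per_weight: int) -> float:
    """Approximate model weight memory in GB: params * bits / 8.
    Ignores KV cache, activations and runtime overhead, so real
    usage runs somewhat higher."""
    return n_params * bits_per_weight / 8 / 1e9

# A 70B model is beyond a 24 GB 3090 even at 4-bit,
# but fits comfortably in 128 GB of unified memory.
fp16_gb = weight_gb(70e9, 16)  # ~140 GB
q4_gb = weight_gb(70e9, 4)     # ~35 GB
```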

[–] [email protected] 1 points 3 months ago (1 children)

I wouldn’t say it particularly sucks. It could be used as a powerhouse hosting server; Docker makes that very easy to do regardless of the OS nowadays. Really, though, I’d say its competition is more along the lines of Ampere systems in terms of power to performance. It even beats Ampere’s 128-core ARM CPU in power-to-performance ratio, which is extremely impressive in the server/enterprise world. Not that you’re going to see them in data centers, because price to performance is a thing as well. I just feel like it fits right into the niche it was designed for.

[–] RandomlyRight 1 points 3 months ago

How would you solve the problem of storage expansion? I assume there exists some kind of Thunderbolt JBOD enclosure or similar

[–] [email protected] 18 points 3 months ago

Because those specific cards are fuckloads more expensive.

[–] [email protected] 6 points 3 months ago

What are you recommending, I'd be interested in something that's similar in price to 3090.

[–] [email protected] 5 points 3 months ago (1 children)

It's for inference, not training.

[–] [email protected] 2 points 3 months ago (1 children)

Even better, because those are cheap as hell compared to 3090s.

[–] [email protected] 1 points 3 months ago

But can they run Crysis?

[–] [email protected] 4 points 3 months ago* (last edited 3 months ago)

Would you link one? Because the only things I know of are the small Coral accelerators, which aren't really comparable, and specialised data-centre stuff you need to request quotes for to even get a price, from companies that probably aren't much interested in selling one direct to customers.

[–] [email protected] 3 points 3 months ago

Huh?

Stuff like llama.cpp really wants a GPU, and a 3090 is a great place to start.
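llama.cpp splits work between GPU and CPU via its `-ngl` / `n_gpu_layers` option: offload as many transformer layers as fit in VRAM, run the rest on CPU. A rough way to pick that number (the even-split sizing formula and helper are my own simplification, not anything llama.cpp computes for you):

```python
def layers_that_fit(model_gb: float, n_layers: int,
                    vram_gb: float, reserve_gb: float = 2.0) -> int:
    """Estimate how many of n_layers fit in VRAM, assuming weight
    memory is spread evenly across layers and reserving some VRAM
    for the KV cache and scratch buffers."""
    per_layer = model_gb / n_layers
    usable = max(vram_gb - reserve_gb, 0.0)
    return min(n_layers, int(usable / per_layer))

# e.g. a ~39 GB 70B Q4 model with 80 layers on a 24 GB 3090:
# roughly 45 of 80 layers on-GPU, remainder on CPU.
ngl = layers_that_fit(39.0, 80, 24.0)
```

Anything that fits entirely in VRAM just gets `-ngl` set to the full layer count (or a large number like 99).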