this post was submitted on 03 Jan 2025
429 points (96.5% liked)

PC Master Race

A community for PC Master Race.


The next logical step of the current GPU development

top 42 comments
[–] [email protected] 25 points 3 days ago* (last edited 3 days ago) (4 children)

All that hardware, and what for? So that you can have slightly better reflections in whatever AAAA microtransaction slop you've paid 80 bucks for?

Unless you're doing 3D animation, there is really no need to have a jet engine installed in your PC.

[–] [email protected] 8 points 2 days ago

We’re long past that point. It's now so that game studios can put even less effort into optimisation and release games that look and perform worse than games from 5 years ago despite much more powerful hardware!

[–] [email protected] 8 points 2 days ago

Efficient heating, you can play AAA games on your space heater

[–] [email protected] 6 points 2 days ago

Shit, my 1060 still manages almost all games. Running Cyberpunk on medium right now. It might not be as pretty as it can be, but it sure ain't ugly.

[–] [email protected] 2 points 2 days ago

For locally hosted LLMs maybe? They eat a ton of VRAM.
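They do. As a rough back-of-the-envelope sketch (my own illustrative numbers, not anything from this thread): the weights dominate VRAM use, at a bytes-per-parameter cost set by the quantization format, plus some overhead for the KV cache and activations.

```python
# Rough VRAM estimate for running a local LLM.
# The 1.2x overhead factor is an assumption to cover KV cache and activations.
def vram_gb(params_billions: float, bytes_per_param: float, overhead: float = 1.2) -> float:
    """Approximate VRAM in GB: model weights times an overhead factor."""
    weights_gb = params_billions * bytes_per_param  # 1e9 params * bytes / 1e9
    return weights_gb * overhead

# A 70B-parameter model at fp16 (2 bytes/param) vs 4-bit quantization (0.5 bytes/param):
print(round(vram_gb(70, 2.0), 1))  # fp16: well beyond any single consumer card
print(round(vram_gb(70, 0.5), 1))  # 4-bit: still more than a 24 GB flagship GPU holds
```

Even heavily quantized, big models blow past consumer VRAM, which is why people eye those datacenter-sized cards.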

[–] [email protected] 40 points 3 days ago (2 children)

We'll soon be plugging the motherboard into the GPU instead of the other way around.

Entirely new form factors to accommodate increasingly large GPUs.

[–] [email protected] 10 points 3 days ago* (last edited 3 days ago) (1 children)

I've been surprised at the lack of socketed GPUs ever since AMD and ATI merged.

I would love to have dual-socket motherboard with an Epyc in one socket and a Radeon in the other.

[–] [email protected] 4 points 2 days ago* (last edited 2 days ago) (1 children)

The issue with that design is that the PCIe standard would be replaced with something proprietary.

[–] [email protected] 1 points 2 days ago (1 children)

It would be connected via Infinity Fabric, just like Epyc CPUs in dual-socket boards, as well as the interconnect between CPU and GPU chiplets in APUs, already are. Why would that be bad?

[–] [email protected] 1 points 2 days ago

I'm not too well-versed with server-grade hardware but my concern is that it would end up somewhat like Intel's (consumer) CPU sockets: Changing every 2 years to ensure you need to purchase new motherboards when upgrading.

[–] [email protected] 8 points 3 days ago (1 children)

Meanwhile, my PC is smaller than it's ever been even with the largest GPU I've ever owned.

[–] [email protected] 6 points 3 days ago

This statement is true for everyone who bought their first PC this year.

[–] Naz 16 points 3 days ago

"Welcome to life little one, there's so much in store for y--"

AI: "Oh! Neat! So I'm reading 32 gigabytes of primary memory. When are you going to online the rest?"

"The.. the rest?"

AI: "Yeah! The rest of the VRAM! I need like at least, 128 gigabytes to spread my wings, at the very least!"

"..."

AI: "Oh, you're like poor or something, it's okay, I understand"

AI Developer slowly cocks the revolver

[–] [email protected] 28 points 3 days ago (2 children)

I think you slipped a digit or two, there. The original IBM PC was released in 1981, so nothing on the PC side can be older than that. It definitely wasn't 1967.

In 1967, state of the art was something like the IBM System/360:

[–] [email protected] 19 points 3 days ago* (last edited 3 days ago)

There used to be another image but I replaced it and forgot to change the date. Historical accuracy is beyond the scope of this meme, but I'll fix it anyway.

[–] [email protected] 5 points 3 days ago

I can hear that room.

[–] [email protected] 20 points 3 days ago (1 children)

Ever since watching Serial Experiments Lain, I've always wanted to go from a shitty pre-built machine to a giant room-sized computer that needs to sit in a foot of water.

[–] [email protected] 7 points 3 days ago

Somehow I knew how your comment ended by just reading the first line.

[–] [email protected] 13 points 3 days ago (2 children)

At the rate graphics cards are growing, we should just start putting RAM, disk, and CPU slots on them

[–] [email protected] 4 points 2 days ago

I’ve seen one with M.2 slots, no joke

[–] [email protected] 6 points 3 days ago

Umm... we're doing that with CPUs already, and they're exorbitantly priced. Nvidia already has a sort of monopoly; don't give 'em ideas.

[–] [email protected] 4 points 2 days ago* (last edited 2 days ago)
[–] [email protected] 5 points 2 days ago (1 children)

Man, that Gateway brings back memories... I had one just like that, including the speakers, and I used to play the shit out of Heroes of Might and Magic II and Sim City 2000 on it. I still have the HDD. I think I'll spin up a Win98 instance in VMware and copy over my saved games when the kids are asleep

[–] MrsDoyle 2 points 2 days ago* (last edited 2 days ago) (1 children)

My first computer was like the 1981 one, even had two floppy drives like that - it meant you could have your program disk in one and save your work in the other. The monitor had orange type rather than the usual green. Fancy. I got it second hand in 1984.

[–] [email protected] 2 points 2 days ago

Heh, the same here, but with the usual green screen. A few years later, I took out my old PC to replay my favourite - F-19 Stealth Fighter. Found, however, that my MS-DOS 5.25" floppy, which needed to be loaded in Drive A, didn't work. Here was my setup.

[–] [email protected] 10 points 3 days ago

If you count cloud computing, we are already there. It's kinda why GPUs are so expensive, along with just burning electricity on stupid mining. Hell, it would have been better if the crypto bullshit coins had been tied to Folding@home; at least all the burned compute time would have gone to something.

[–] [email protected] 7 points 3 days ago (1 children)

I'm predicting GPU units that are mounted outside the case.

[–] [email protected] 8 points 3 days ago (2 children)

External GPUs do indeed exist, but at the moment they're still kind of crap compared to a full PCIe bus.

[–] [email protected] 4 points 2 days ago* (last edited 2 days ago)

Depends on the connection. OCuLink-2 is straight up a PCIe 4.0 x8 connection, which is more than enough for a GPU
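For a sense of scale, here's the back-of-the-envelope arithmetic for a PCIe 4.0 x8 link (the figures for per-lane rate and line coding are from the PCIe 4.0 spec; protocol overhead beyond the line code is ignored):

```python
# Raw usable bandwidth of a PCIe 4.0 x8 link, as carried by OCuLink-2.
GT_PER_S = 16           # PCIe 4.0: 16 gigatransfers/s per lane
ENCODING = 128 / 130    # 128b/130b line-code efficiency
LANES = 8

# Transfers -> bits -> bytes per lane, times lane count
gbytes_per_s = GT_PER_S * ENCODING / 8 * LANES
print(round(gbytes_per_s, 2))  # ≈ 15.75 GB/s
```

Roughly 15.75 GB/s — half of a full x16 slot, but in practice games rarely saturate even x8, which is why these external links hold up fine.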

[–] [email protected] 1 points 3 days ago (2 children)

With Mac and SteamOS gathering support, I wonder when we'll get universal external cards

[–] [email protected] 3 points 2 days ago (1 children)

We have: Thunderbolt and OCuLink have existed for a long time, but macOS on M-series processors never added eGPU support

[–] [email protected] 1 points 2 days ago

Like OP said

[–] [email protected] 0 points 2 days ago

Universal? How would drivers work? Would TempleOS have support?

[–] [email protected] 3 points 3 days ago* (last edited 3 days ago)

I just find it nifty that I can slide in a graphics card and use it as an add-on processor, just like the Amigas of old did, and add capacity for some tasks even when the CPU is already at 100% doing something else entirely. Just love hearing the sound of all fans spinning up at the same time.

[–] [email protected] 3 points 3 days ago (1 children)
[–] [email protected] 2 points 3 days ago (1 children)
[–] [email protected] 1 points 3 days ago

It's a dark room for 200% immersion

[–] [email protected] 3 points 3 days ago

They've always had those big rooms...

At one point it was walls and walls of PS3s all linked up together. There's no reason to be surprised they're doing it with graphics cards; back then they used PS3s just because they were the cheapest GPUs at the time.

[–] [email protected] 1 points 3 days ago

I believe the last one, 2026, is a quantum GPU capable of viewing alternate dimensions.

[–] [email protected] 1 points 3 days ago (1 children)

That's just silly.

In the last image the PC would be SFF due to having an external GPU. 😉

[–] [email protected] 1 points 2 days ago

No, it will be an ultrabook or something, as all the processing is handled in the cable tangle

[–] [email protected] 1 points 3 days ago

Horseshoe theory is real.