Gamer's Nexus is gonna need a bigger shirt this time.
They should switch to a shower curtain.
It's like 600 watts and 50+ amps; it's going to get more than hot
50 amps, if evenly distributed, shouldn't be that much of a problem. The distribution is the problem, though. And the lack of headroom just makes it all the worse. 12V-2x6 should have been limited to 500 watts, or even 450 watts, and even then that's pushing it, as we found out with the 4090 and bad connections.
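To put rough numbers on the headroom point, here's a quick back-of-the-envelope sketch (the six current-carrying 12 V pins and the roughly 9.5 A per-pin figure are assumptions based on the commonly quoted rating for this style of connector, not something from the article):

```python
# Rough headroom estimate for a 12V-2x6 / 12VHPWR connector at various power levels.
# Assumptions: 6 current-carrying 12 V pins, each rated at roughly 9.5 A.
PINS = 6
PIN_RATING_A = 9.5  # assumed per-pin rating

for watts in (450, 500, 575, 600):
    total_a = watts / 12.0        # total current at 12 V
    per_pin_a = total_a / PINS    # per-pin current if perfectly balanced
    headroom = (PIN_RATING_A - per_pin_a) / PIN_RATING_A
    print(f"{watts} W: {total_a:.1f} A total, {per_pin_a:.2f} A per pin, "
          f"{headroom:.0%} headroom")
```

Even a perfectly balanced 600 W load leaves only around 12% margin per pin under those assumptions, and any imbalance eats into that fast.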
The fact that we're discussing literal space heater output levels for a fucking graphics card is insane enough.
Since putting a 4080 in my bedroom, I've not once run my heat, and usually have to keep my window open at least an inch to cool the room off...
I live in Canada, it's currently winter. Window's open.
Really leans into the fact that efficiency means absolutely nothing to GPU manufacturers compared to a 5% gain just by stuffing some more watts into it
The 9070 non-XT seems to be relatively efficient per frame!
AMD's stock voltages are always whack: ultra safe and high. I undervolt all my AMD GPUs and CPUs. Makes quite the difference.
TL;DR on the problem is that small differences in resistance between conductors and contacts in the cables can result in different currents in each wire. Previous cards would split the pins up into pairs and current-balance each pair (same current drawn through each). The 40/50 series shorts all of the pins together, which doesn't allow for any current balancing on the GPU side.
Additionally, previous cards would refuse to turn on if any of those shorted pairs were not present - something the new cards can hardly detect. This could allow all but one high-side wire to be cut, and the GPU might not realize it and try to draw all 50+ A through a single conductor that is rated for <10 A.
The 9070 XT that Steve at Gamers Nexus looked at, which had the 12VHPWR connector, also has this flawed design, but the power draw on the 9070 XT is much, much lower, so you are less likely to see a problem on those cards.
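A minimal sketch of the imbalance mechanism described above, assuming six 12 V wires tied to a single plane on the GPU side so each wire's share of the current is set purely by its own resistance (the resistance values are made up for illustration, not measured):

```python
# Six 12 V wires shorted together at both ends behave like parallel resistors:
# each carries a share of the total current proportional to its conductance (1/R).
# Resistance values below are illustrative only.
TOTAL_CURRENT_A = 50.0  # ~600 W at 12 V

def wire_currents(resistances_ohm):
    conductances = [1.0 / r for r in resistances_ohm]
    total_g = sum(conductances)
    return [TOTAL_CURRENT_A * g / total_g for g in conductances]

# Healthy connector: near-identical wire + contact resistances -> even split.
healthy = [0.010, 0.010, 0.011, 0.010, 0.010, 0.011]
# Degraded connector: a few pins with much higher contact resistance.
degraded = [0.010, 0.050, 0.060, 0.012, 0.055, 0.060]

for label, rs in (("healthy", healthy), ("degraded", degraded)):
    amps = wire_currents(rs)
    print(label, [f"{a:.1f} A" for a in amps], f"-> worst pin {max(amps):.1f} A")
```

With per-pair balancing the card could throttle or shut down when one pin drifts high; with everything shorted into one plane, the worst pin in the degraded case quietly ends up carrying roughly double a typical per-pin rating.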
Maybe they should build hardware around these Graphics Cards, purpose built for power and heat distribution, say, a console of some type
The KFC gaming airfryer is finally feasible.