PSUs are waaaaay more efficient when operating closer to their rated capacity. Pulling 200W through a 1kW power supply is like making a marathon runner breathe through a straw.
The sweet spot is around 40-60% load.
But it doesn't make that much of a difference. The efficiency swing is maybe 10%. An 80 Plus Bronze rated PSU is certified for at least 82% efficiency at 20% and 100% load (85% at 50%), but even at the 50% load mark it won't be over 90% efficient.
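To put that swing in numbers, here's a quick back-of-the-envelope sketch; the 200W load, 4 hours/day of use, and $0.15/kWh rate are assumptions for illustration, not figures from the thread:

```python
# Back-of-the-envelope: what a ~10% efficiency swing costs per year.
# The 200 W load, 4 h/day of use, and $0.15/kWh are all assumptions.
dc_load_w = 200          # what the components actually draw
hours_per_day = 4
price_per_kwh = 0.15

for efficiency in (0.80, 0.90):
    wall_w = dc_load_w / efficiency                    # draw at the outlet
    kwh_per_year = wall_w * hours_per_day * 365 / 1000
    print(f"{efficiency:.0%}: {wall_w:.0f} W at the wall, "
          f"~${kwh_per_year * price_per_kwh:.2f}/year")
# 80%: 250 W, ~$54.75/year; 90%: 222 W, ~$48.67/year -> only ~$6/year apart
```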
The main point (to me anyway) is that it's dumb to pay more for a power supply just so you can pay more on your power bill. If your idle load is 100W and your gaming load is 300W, you've got no reason to run more than a 600W PSU.
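A minimal sketch of that sizing rule of thumb; the 50% target is the sweet spot mentioned above, and the function name is made up for illustration:

```python
def recommended_psu_watts(peak_load_w: float, target_fraction: float = 0.5) -> float:
    """Size the PSU so peak draw lands at the efficiency sweet spot."""
    return peak_load_w / target_fraction

# 300 W gaming load -> a 600 W unit puts that load right at 50%
print(recommended_psu_watts(300))  # 600.0
```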
I've got an 850W power supply, which I bought 2-3 years ago in anticipation of the RTX 4000 series. My usual load with a GTX 1080 was 150W, and now my entire system uses 520W fully loaded. Do I count? :)
I have a 4090 in my Ryzen 7700X system and a power meter; 850W is overkill for a 4090. My system never uses more than 650W. What's more important than the power rating is buying a high-tier PSU with good overcurrent protection, because the 4090 tends to have power spikes; even a good 750W PSU should be able to handle them.
If you bought a PSU certified for PCIe 5.0, then you're most likely fine. If you didn't have to use a squid adapter to plug in your GPU, then you're more than likely good to go, so long as you didn't buy a shit-tier PSU.
While true, how much would it actually save you in electricity? If you upgrade every year, wouldn't it be cheaper to just buy the bigger PSU outright and pay the extra electricity cost, so you don't have to buy another PSU when you get more power-hungry components?
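Rough numbers for that trade-off; the price premium, second-PSU cost, idle penalty, usage, and electricity rate are all assumptions for illustration:

```python
# Oversize once vs. right-size now and buy again at the next upgrade.
# Every number here is an assumption for illustration.
big_psu_premium = 40.0    # extra cost of the bigger unit today, USD
second_psu_cost = 90.0    # a new PSU at the next upgrade, USD
penalty_w = 5.0           # extra wall draw from running the big unit underloaded
hours_per_day = 4
price_per_kwh = 0.15
years_until_upgrade = 3

extra_electricity = (penalty_w * hours_per_day * 365 / 1000
                     * price_per_kwh * years_until_upgrade)
print(f"oversize once: ~${big_psu_premium + extra_electricity:.2f}  vs  "
      f"right-size then replace: ~${second_psu_cost:.2f}")
# ~$43 vs ~$90: the efficiency penalty barely moves the needle
```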