this post was submitted on 29 Oct 2023
1 points (100.0% liked)

AMD


For all things AMD; come talk about Ryzen, Radeon, Threadripper, EPYC, rumors, reviews, news and more.

founded 1 year ago

The amount of carbon this has undoubtedly put into our atmosphere is really my main concern. Yes, I know you can do hacky workarounds to fix this, but how many of their consumers did that? Roughly none. What a waste of our planet's resources.

top 26 comments
[–] [email protected] 1 points 1 year ago (1 children)

I literally haven't had this issue for months now. Whatever is wrong with yours isn't AMD's fault, because it's been solved for ages for the vast majority.

[–] [email protected] 1 points 1 year ago

Having a 60 Hz and a 144 Hz monitor is my fault? Get real.

[–] [email protected] 1 points 1 year ago

AMD doesn't seem to be able to get past their issues with memory clocks at idle in multi-monitor (especially asymmetric) display configurations.

My Vega 64 did this when I first got it; after about a year of driver updates they fixed it, and my whole system would draw about 65W at idle.

My 6900 XT has always done it (running 2x identical FreeSync 60 Hz UHD monitors and an LG UHD TV), with no difference if I unplug the TV. The combination of that, my 5950X seemingly being worse at idling than the 2700X and 3900XT CPUs I was running before, and my replacing my mobo with one with a lot more integrated hardware has left my system idling at ~150W. It sucks, but I just try not to leave it on when I'm not using it.

Funnily enough, with the 6900 XT being so much more efficient than Vega was, my system actually draws about 100W less when playing FFXIV at 4K, which TBH is what I do most of the time, although Starfield reminded me what a space heater this thing can be when it's pushed!

TBH I don't know if the grass is greener on the green side, because I haven't owned an Nvidia card since the GTX 970.

[–] [email protected] 1 points 1 year ago (1 children)

The amount of carbon emitted depends on your region's power mix. At night, when solar is unavailable, yes, it might be higher, but honestly, 100W is nothing (0.1kW). If your PC isn't put to sleep or turned off, and instead idles most of the day, those are entirely your emissions.
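For scale, a quick back-of-the-envelope sketch of what ~100W of idle draw works out to in emissions. The hours and grid-intensity figures here are assumptions, not from the thread; real grid intensity varies enormously by region and time of day (roughly 50 g/kWh in hydro-heavy grids to 700+ g/kWh in coal-heavy ones):

```python
# Rough annual CO2 from 100 W of extra idle draw.
idle_watts = 100
hours_per_day = 12            # assumed: PC left on most of the day
grid_g_co2_per_kwh = 400      # assumed average grid intensity, g CO2/kWh

kwh_per_year = idle_watts / 1000 * hours_per_day * 365
kg_co2_per_year = kwh_per_year * grid_g_co2_per_kwh / 1000
print(f"{kwh_per_year:.0f} kWh/year -> {kg_co2_per_year:.0f} kg CO2/year")
```

With those assumed inputs it comes out to a few hundred kWh a year, on the order of one short car trip per week in emissions.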

Level 3 EV chargers at dedicated stations draw up to 200kW (50/100kW vehicles are common now), so imagine when 95% of the world population has EVs. We're rapidly heading toward disaster without the necessary infrastructure and no one wants to hear it.

Humans breathe out CO2 as well (and expel methane, an even stronger GHG), and there's 8+ billion of us doing that 24/7. We're carbon-based life, so carbon is always going to be emitted. The issue is that we deforested and paved over our carbon sinks because we're shortsighted anytime money is involved.

Also, good luck solving your issue.

[–] [email protected] 1 points 1 year ago

Yeah, you put it in better perspective than anybody. You're right, it's incredibly minuscule, but I think there's value in caring about every Wh or kWh. A little bit everywhere goes a long way.

[–] [email protected] 0 points 1 year ago (1 children)

Considering my PC is on usually 12-16h a day (I work from home) and my 4090 eats about 17W at idle (1440p 240 Hz + 1080p 240 Hz, or 4K 120 Hz), I wonder if it has already worked out cheaper than if I'd gotten a 7900 XTX instead.

[–] [email protected] 0 points 1 year ago (1 children)

Doing a little calculator check, you probably only saved around 50 to 100 bucks, depending on how expensive electricity is in your area. Of course, this number will only increase over time. Looking back, I'd probably recommend a 4080 over a 7900 XTX to my brother. He can't even run it at full speed because of the massive heat it produces (400W at full load), on top of the idle power use.

[–] [email protected] 0 points 1 year ago (1 children)

Electricity in the EU can easily be $0.40 per kWh, depending on the country. If we assume an 80W difference, 14h per day, and $0.40/kWh, you get about $160 per year.
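That estimate is just watts x hours x price. A sketch of the arithmetic with the numbers above (the 80W delta, 14h/day, and $0.40/kWh are this commenter's assumptions, not measured values):

```python
# Annual cost of extra idle power draw: watts x hours x price per kWh.
extra_watts = 80          # assumed extra idle draw vs. a fixed card
hours_per_day = 14        # assumed on-time per day
price_per_kwh = 0.40      # assumed EU-ish price in USD; varies by country

kwh_per_year = extra_watts / 1000 * hours_per_day * 365
cost_per_year = kwh_per_year * price_per_kwh
print(f"{kwh_per_year:.0f} kWh/year -> ${cost_per_year:.0f}/year")
```

Halving the hours halves the cost, which is why the estimates in this thread vary so much.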

[–] [email protected] 1 points 1 year ago

But you don't sit 14h per day doing nothing, do you? Only idle time (doing literally nothing), browsing, and watching movies count toward this power draw, so it may be at most 8h if you work from home, but maybe 2h if you use your PC primarily for gaming, in which case the cost would be ~$20. It's still wasted money and it shouldn't happen, but realistically it's nowhere near that high.

[–] [email protected] 0 points 1 year ago (1 children)

OP, these posts inevitably lead to tips, "what worked for me", a restatement of the problem, and digress from there.

I for one agree with you 100%. There are NOT user-side solutions for a great many of us.

AMD also asks too much of the user. I know I must have logged well over 200 hours on the idle power problem since I bought in February.

[–] [email protected] 1 points 1 year ago (1 children)

Vast majority have reported this issue as fixed ages ago.

[–] [email protected] 1 points 1 year ago

Thx for making my point for me.

90W idle, down from 120W (vs. roughly 26W for 6000-series GPUs in the same systems), isn't fixed for a great many folks.

[–] [email protected] 0 points 1 year ago (2 children)

Mine idles at 7W with two monitors. The trick for me was enabling FreeSync on both of them.

[–] [email protected] 0 points 1 year ago (1 children)

So fuck people who don't have FreeSync, right?

My monitors randomly lose signal when FreeSync is on. 🤷

[–] [email protected] 1 points 1 year ago

Sounds like a hardware issue

[–] [email protected] 0 points 1 year ago (1 children)

Mine does 5W idle with 3x 1080p monitors.

The trick was to reduce the 165 Hz to 60 Hz.

[–] [email protected] 1 points 11 months ago

That's not a good "solution". What if someone wants to game at 120 Hz with variable refresh rate?

[–] [email protected] 0 points 1 year ago (1 children)

I also had the same issue. I could get 7-10W on dual monitors, but with 3 monitors it never went below 100W.

I changed to a 4090 and my idle is 25-30w with 3 monitors.

[–] [email protected] 0 points 1 year ago (1 children)

Like others say, Nvidia has the same issues with the 3000 series. The Nvidia placebo is real.

[–] [email protected] 0 points 1 year ago (2 children)

What triggers this high usage?

I use a 3440×1440 LCD screen at 165 Hz

Asrock Phantom Gaming OC 7900xtx

Idle power draw ranges between 10-30W

I don't get the high usage you mentioned

[–] [email protected] 0 points 1 year ago (1 children)

I believe it occurs most commonly with dual monitor setups

[–] [email protected] 1 points 11 months ago

Because a dual-monitor setup is pretty common... It sounds like it happens with any setup of more than one monitor, when at least one of them runs above 60 Hz? I dunno if resolution impacts anything, but if people have more than one monitor these days, it seems at least one is higher than 1080p.

[–] [email protected] 0 points 1 year ago (1 children)

I think it's when you use 2 monitors with a different refresh rate and/or resolution.

[–] [email protected] 0 points 1 year ago (1 children)

I've got the triple threat: 3 monitors at different resolutions and refresh rates, 1440p 144 Hz, 3440×1440 165 Hz, and 4K 120 Hz. Idle power draw is 120W. The fan doesn't shut off unless ambient temp falls below 72°F/~22°C.

[–] [email protected] 1 points 11 months ago

Did you try the latest AMD driver and enabling FreeSync?