Edit: A better title would be: What makes certain machines much more power-efficient than others?
I have the above machine, and the idle power consumption is too high for my liking.
Full specs:
- HP Z440 workstation
- Intel C612 chipset
- Xeon E5-1650V3 CPU (Haswell)
- 32 GB RAM (4 x 8 GB modules), ECC of course.
- 512 GB SSD
- Nvidia Geforce GT 710, 1 GB
- 2 fans: 1 x CPU + 1 x case. (+ one more in the PSU)
That’s it, nothing else.
It draws about 100-110 W idle. I find this a bit too much.
I understand this is a 140 W TDP CPU, but when I research this topic, every article and comment says the max TDP of the CPU doesn’t matter: it only indicates consumption at maximum load. At idle, all CPUs should in theory consume minimal power, regardless of their max TDP. Haswell and Broadwell CPUs should be especially power-efficient compared to their Sandy Bridge and Ivy Bridge predecessors.
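One way to sanity-check the "idle should be cheap" claim on Linux is to look at how much time the CPU actually spends in deep idle (C-)states; if it never reaches C6/C7, idle draw stays high. This is only a sketch, assuming the standard Linux cpuidle sysfs layout; the paths and state names vary by machine and driver.

```python
# Sketch: summarize cumulative idle-state residency for one CPU, assuming
# the standard Linux cpuidle sysfs layout. If the deepest states show ~0%,
# something (BIOS setting, driver, a device blocking package C-states)
# is keeping the CPU out of its low-power modes.
import glob
import os

def read_cstate_usage(cpu=0):
    """Return {state name: cumulative time in microseconds} from sysfs."""
    usage = {}
    for d in glob.glob(f"/sys/devices/system/cpu/cpu{cpu}/cpuidle/state*"):
        with open(os.path.join(d, "name")) as f:
            name = f.read().strip()
        with open(os.path.join(d, "time")) as f:
            usage[name] = int(f.read())  # microseconds since boot
    return usage

def residency_percent(usage):
    """Convert raw residency times to percentages of total idle time."""
    total = sum(usage.values()) or 1  # avoid division by zero
    return {name: 100.0 * t / total for name, t in usage.items()}

if __name__ == "__main__":
    for name, pct in sorted(residency_percent(read_cstate_usage()).items()):
        print(f"{name}: {pct:.1f}%")
```

On an idle machine you'd hope to see most of the time in the deepest available state; `powertop` shows the same data more conveniently.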
I’ve also seen a lot of comments that it’s mainly the other components that matter, not the CPU. But I don’t have a lot of components either… The setup is quite minimal.
It’s not just the power consumption that bothers me, but the fact that I don’t understand why! Especially considering that there are examples of servers based on similar or older architecture, consuming as little as 30 W. For example this comment. A server with an Ivy Bridge CPU, E3-1230 v2, idle power consumption 30W. Sure, that CPU is 70 W TDP, not 140 W, but…
But then the saying that “TDP does not matter” can’t be true! It would mean the CPU’s TDP does matter, and matters a lot even when it comes to idle power! And even if that’s not true, it still doesn’t add up: it doesn’t explain why my setup draws 3-4 times more power.
Or the other possibility is that it’s the motherboard. There’s something on the motherboard that servers or other machines don’t have that eats this much power. But what is it?
It just doesn’t add up! The VGA card is about 20 W max. I checked. Probably less, as it’s not under load. RAM + SSD + fans, 20 W max, but probably less. And there is nothing else. That means the CPU + motherboard consume at least 60-70 W, probably more. Why? And what is different in similar machines or servers that consume much less?
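The CPU-vs-motherboard question can actually be answered on Linux by sampling the RAPL energy counter, which reports CPU package energy directly; subtracting that from the wall reading isolates the board and peripherals. A minimal sketch, assuming the standard `intel-rapl` sysfs path exists and is readable (recent kernels may require root):

```python
# Sketch: estimate CPU package power by sampling the RAPL energy counter
# twice and dividing by the interval. The sysfs path is the standard Linux
# intel-rapl location; availability and permissions depend on the system.
import time

RAPL = "/sys/class/powercap/intel-rapl:0/energy_uj"  # package 0, microjoules

def power_from_samples(e0_uj, e1_uj, interval_s):
    """Average power in watts between two energy samples (microjoules)."""
    return (e1_uj - e0_uj) / 1e6 / interval_s

def package_power_watts(interval_s=1.0):
    with open(RAPL) as f:
        e0 = int(f.read())
    time.sleep(interval_s)
    with open(RAPL) as f:
        e1 = int(f.read())
    # Note: the counter wraps eventually; a short interval avoids that.
    return power_from_samples(e0, e1, interval_s)

if __name__ == "__main__":
    try:
        print(f"CPU package: {package_power_watts():.1f} W idle")
    except OSError:
        print("RAPL counter not available/readable on this system")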
Anyway, I'm trying to understand this whole power consumption thing too. See my other post if you're interested. My conclusion is that it just doesn't make sense to generalize. There are a lot of factors involved: nuanced things like the quality of components, firmware, etc. You can't just say old == bad power consumption, new == good consumption. Your 2012 server is a good example of that.
In my opinion, the best thing to do is to look up the specific model you intend to get and check what people's experiences are with that particular model. That is your expected power consumption.