this post was submitted on 15 Nov 2023

Homelab
Wondering if anyone has a feel for the power efficiency of older server hardware. I'm reading posts from people who say their R710 idles at 160W with 8 hard drives. So if you take the hard drives out of the equation, it's probably still around 120W. Is that just how inefficient old computers are, kind of like incandescent bulbs being less efficient than LED bulbs? How efficient is the R730 compared to the R710?

My 6-year-old desktop computer idles at 60W with a GPU, and 30W without the GPU. Seems like a huge difference. It's something like $70 more per year to run an R710 than my old desktop with a GPU. Is that correct?
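For what it's worth, the $70/year figure checks out as rough back-of-envelope arithmetic, assuming an electricity rate of about $0.08/kWh (an assumption; plug in your own tariff):

```python
HOURS_PER_YEAR = 24 * 365  # 8760 hours of 24/7 idle

def annual_cost(idle_watts, price_per_kwh=0.08):
    """Yearly electricity cost of a machine idling around the clock.

    price_per_kwh is an assumed rate -- substitute your local tariff.
    """
    kwh_per_year = idle_watts / 1000 * HOURS_PER_YEAR
    return kwh_per_year * price_per_kwh

r710 = annual_cost(160)    # reported R710 idle with 8 drives
desktop = annual_cost(60)  # 6-year-old desktop idling with GPU
print(round(r710 - desktop, 2))  # -> 70.08
```

So at that rate, a 100W difference in idle draw works out to almost exactly $70/year; at European electricity prices it would be several times that.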

[–] [email protected] 1 points 1 year ago (1 children)

Nah. My old server from 2012 did 35W. Dual core i5, 16GB, 120GB SSD, 3x 3TB HDDs.

[–] [email protected] 1 points 1 year ago

Anyway, I'm trying to understand this whole power consumption thing too. See my other post if you're interested. My conclusion is that it just doesn't make sense to generalize. There are a lot of factors involved: nuanced things like the quality of components, firmware, etc. You can't just say old == bad power consumption, new == good power consumption. Your 2012 server is a good example of that.

In my opinion, the best thing to do is to look up the specific model you intend to get and check what people's measured experiences are with that particular model. That gives you a far better estimate of expected power consumption than any rule of thumb about age.