I have an HP G3 mini and a Dell OptiPlex lying around, both similarly specced. The HP has an i5-6500T and 16 GB of DDR4 RAM, the Dell has 8 GB of DDR3L, so nothing too different.
However, the Dell draws around 15 W at idle, while the HP draws only about 5 W.
The only difference I could think of (and that is in my power to change) is the PSU. The Dell has one of those SFF PSUs rated for up to 180 W, while the HP has an external 65 W power brick with a barrel jack.
So my question is: does anyone have experience with one of those Pico PSUs? I guess they should be more efficient? I’m not planning to put anything power-hungry into the OptiPlex.
The answer to your question is ‘no’.
You’re never going to reduce power usage substantially by swapping PSUs, because there just aren’t enough efficiency gains to be had, even if a Pico PSU were more efficient, which they really aren’t.
You say the hardware is ‘nothing too different’, but you mention DDR4 vs. DDR3, which makes me think the Dell is a generation or a few older, which could easily account for 10 W of extra power draw.
Particularly in low-load scenarios there can be quite a big difference when it comes to PSU efficiency. While newer ATX PSUs have become better in terms of efficiency at low load, a Pico PSU can still be quite a bit better. Older ATX PSUs often don’t even reach 60% efficiency at 5% load (which would be a typical load for such a system at idle), and sometimes considerably less than that. At the same load a Pico PSU can easily reach 85% efficiency.
Of course, at higher loads the difference is way smaller.
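To put rough numbers on that low-load gap, here’s a minimal sketch; the 10 W idle DC load and the efficiency figures are illustrative assumptions, not measurements. Wall draw is just the DC load divided by the PSU’s efficiency at that load.

```python
# Illustrative only: wall draw = DC load / PSU efficiency at that load.
def wall_draw(dc_load_w, efficiency):
    return dc_load_w / efficiency

idle_dc_load_w = 10.0  # assumed idle DC load of the system

print(f"old ATX PSU (60% at low load): {wall_draw(idle_dc_load_w, 0.60):.1f} W at the wall")
print(f"Pico PSU    (85% at low load): {wall_draw(idle_dc_load_w, 0.85):.1f} W at the wall")
# -> 16.7 W vs. 11.8 W, i.e. roughly a 5 W difference at idle
```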
That’s fair; I wasn’t really considering how poorly PSUs perform at extremely low loads, despite knowing that they do.
Odd that a random brick would be substantially better than a same-era actual PSU, but I suppose it’s hard to say without more specifics.
Switching power supplies (“bricks”) are generally more efficient than linear power supplies because they lose less energy as heat; that’s where the difference comes from. (Of course they have drawbacks as well, like increased noise.)
True, but at that point 60% vs. 85% efficiency on a load of, say, 10 W is a difference of about 5 W drawn from the wall. If you live somewhere with high electricity rates it might be worth it, but otherwise at the usual $0.10/kWh that’s about $4 per year.
The other situation where it makes sense is off-grid setups, where wasted power is a big deal.
OP mentioned $0.40/kWh, so that would be about $17 per year with a 5 W difference.
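For anyone who wants to double-check the yearly cost arithmetic, here’s a quick sketch; it assumes the 5 W difference is constant and the machine runs 24/7.

```python
# Annual cost of a constant extra draw, at the two electricity rates mentioned.
HOURS_PER_YEAR = 24 * 365

def annual_cost(extra_watts, rate_per_kwh):
    kwh_per_year = extra_watts * HOURS_PER_YEAR / 1000
    return kwh_per_year * rate_per_kwh

for rate in (0.10, 0.40):
    print(f"5 W extra at ${rate:.2f}/kWh ≈ ${annual_cost(5, rate):.2f}/year")
# -> about $4.38/year at $0.10/kWh and $17.52/year at $0.40/kWh
```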
They both have (almost) the same CPU, just one of them is the T variant.
The T variant is the low-power variant: lower clocked (2.5 GHz vs. 3.2 GHz base) with almost half the TDP (35 W vs. 65 W); kinda the whole point is that it’s going to use less power.
Yes, but ⅓ of the power? That seems like a lot.
The T variant is for thermally constrained systems. It may not actually use less power overall, for example if a non-T chip completes a task faster and goes back to idle sooner.