Posted by Sir Four at 2:45pm Jun 26 '11
Power supply companies have created the impression that power supplies (PSUs) with higher wattage ratings are superior/desirable, maybe even necessary for a modern build. But the truth is that unless you are an intense gamer maxing out on the beefiest graphics cards, you don't need the high wattage rating--in fact you downright do not want it!
I had never paid attention to those "80 Plus" logos that some PSUs advertise, until recently when I learned that they signify the efficiency of the unit. There are five levels of 80 Plus: basic 80 Plus, then Bronze, Silver, Gold, and Platinum. To meet these standards, a PSU must meet or exceed certain efficiency levels at 20%, 50%, and 80% load. For example, an 80 Plus Bronze unit must be 82% efficient at 20% load; the other 18% of what it pulls from the wall is wasted as heat. Now, one problem here is that outside that 20-80% range, efficiency can really drop. You may have 82% efficiency at 20% load, but only 70% efficiency at 5% load.
Assume you have a 750W PSU, and your computer at idle needs 50W. That's under 7% load, well outside the sweet spot, so you're not going to get the advertised efficiency. Say you only get 70% efficiency at that draw: your PSU would pull about 71W from the wall to satisfy your comp's 50W idle needs. If instead your idle load were in the sweet spot, at 82% efficiency, it'd pull only 61W from the wall--a savings of 10W, not to mention less heat generated.
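If you want to run the numbers yourself, here's a quick Python sketch of that arithmetic (the 70% and 82% efficiencies are just the assumed figures from above, not measurements of any particular unit):

def wall_draw(dc_load_w, efficiency):
    # Watts pulled from the wall to deliver dc_load_w to the components
    return dc_load_w / efficiency

idle = 50.0  # watts the components need at idle (assumed figure from above)

low = wall_draw(idle, 0.70)   # ~71.4W from the wall at 70% efficiency
high = wall_draw(idle, 0.82)  # ~61.0W from the wall at 82% efficiency

print(f"70% eff: {low:.1f}W from wall, {low - idle:.1f}W lost as heat")
print(f"82% eff: {high:.1f}W from wall, {high - idle:.1f}W lost as heat")
print(f"savings: {low - high:.1f}W")  # ~10.4W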
Last year, Tom's Hardware did an interesting challenge: build a computer that idles at 25W. They found that by swapping out their 750W PSU for a 220W PSU, their build went from drawing 33W at idle to 26W.
If you're building a computer, are you expecting it to be such a low-power sipper? No, probably not. But don't assume the latest hardware is power-hungry, either. In this video, NewEgg measures the power consumption of a build that uses the latest Intel chip (i5 2500K) and a mid-range graphics card. In the test, the system idled at around 50W and maxed out under load at 163W. Watch:
So: do we really need 750W PSUs? Most people do not! They'd be better off with a much lower wattage unit. I think even a 400W unit is more than enough for anyone but the most demanding gamer; 350W even seems generous.
An i5 2500K processor can theoretically draw up to 95W. The card in the video above can draw up to 116W. If both simultaneously hit their theoretical maximums, you'd be a bit above 200W. Nothing else in a typical system draws much more than 10W. So I really see no reason why anyone would need or want a PSU rated above 400W (except to run multiple high-end graphics cards). You are simply paying extra for the privilege of wasting electricity unnecessarily. =)
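If you want to sanity-check your own build the same way, the sizing arithmetic is trivial. Here's a rough Python sketch (the 40W "everything else" allowance is my own guess, covering a few drives, fans, RAM, and the motherboard):

# Worst-case draw, using the figures above
cpu_max = 95   # i5 2500K theoretical max (W)
gpu_max = 116  # the graphics card from the video (W)
other = 40     # assumed allowance for drives, fans, RAM, board

total = cpu_max + gpu_max + other  # ~251W worst case

for rating in (350, 400, 750):
    print(f"{rating}W PSU: {100 * total / rating:.0f}% load at worst case")
# 350W -> ~72% load (inside the 20-80% sweet spot)
# 400W -> ~63% load
# 750W -> ~33% load even flat out, and only ~7% load at a 50W idle

Even the absolute worst case keeps a 350-400W unit comfortably inside its efficient range, while the 750W unit never gets there.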
added on 3:04pm Jun 26 '11:
"wasting electricity unnecessarily"lol...I suppose that's redundant.
added on 3:08pm Jun 26 '11:
"The card in the video above"I should have said, "the graphics card in the video above"