We’ve had some recent activity around commercial refrigeration performance here at Powerhouse Dynamics, which got me thinking about how our personal refrigerators are performing. Similar to a recent post I did on my dehumidifier, I decided to take a look at “real-world” energy consumption data of several refrigerators versus manufacturers’ specifications.
In a previous post on my least favorite appliance in my home – my basement dehumidifier – I discovered the unit I own was consuming substantially more energy than the manufacturer’s specifications. Also, I could find no reasonable explanation for why this was so.
My only thought since writing the post is that the standards manufacturers are allowed to use for specifying energy consumption are the equivalent of measuring a car’s mileage using a test track with a downward slope and a consistent tail wind.
However, it’s easy to verify your car’s mileage yourself, causing the “downhill test track with tailwind” methodology to fall out of fashion in favor of tracks with driving conditions that seem to mimic the real world in a reasonable way. As a consequence, all the cars I have owned in the past 20 years have been quite capable of achieving their advertised mileage – and have consistently demonstrated it – assuming I’m not suffering from a case of lead-foot.
Are refrigerator manufacturers using the “downhill test track with tail wind” methodology to specify energy consumption, or are they using conditions that approximate the real world?
I have three data points that provide a small window of insight, courtesy of SiteSage:
- Samsung 26 Cu Ft Side by Side, LED lights, with dual air, yr 2011
- GE Profile Series side by side with in-door ice and water dispenser, yr 2005
- Sub-Zero model 650, yr 2000
One fridge is owned by our VP of Engineering, one by our Sales Manager, and one by our CEO. (Side challenge: match the fridge to the Powerhouse Dynamics employee!)
Why don’t I have my own fridge here for comparison? It’s a long story, but what’s the saying about the cobbler’s children?
Each refrigerator has been tracked by SiteSage – the Samsung and Sub-Zero for over 1 year each. The GE unit has been tracked for three months, so its energy consumption has been annualized (x4).
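The annualization step above is simple scaling: twelve months divided by the months actually tracked. A minimal sketch in Python (the kWh figure below is a made-up placeholder, not the GE unit's actual reading):

```python
def annualize(measured_kwh: float, months_tracked: int) -> float:
    """Scale a partial-year kWh total up to a 12-month estimate."""
    return measured_kwh * (12 / months_tracked)

# Three months of data scaled by x4, as done for the GE unit.
# 150.0 kWh is a hypothetical example value.
print(annualize(150.0, 3))  # 600.0
```

Note that this assumes the tracked months are representative of the whole year; a fridge measured only in summer would likely over-estimate.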
In a nutshell, none of the three refrigerators matched or beat their manufacturer’s specifications. On average, these units were 18% higher than specified by their manufacturer.
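The "percent over spec" comparison works like this; the readings below are illustrative placeholders, not the actual measurements from the three units:

```python
def percent_over_spec(actual_kwh: float, spec_kwh: float) -> float:
    """Percent by which measured annual consumption exceeds the spec."""
    return (actual_kwh / spec_kwh - 1) * 100

# Hypothetical (actual, spec) pairs in kWh/yr; the average is
# computed the same way as the 18% figure in the post.
readings = [(590, 500), (472, 400), (708, 600)]
overages = [percent_over_spec(a, s) for a, s in readings]
avg = sum(overages) / len(overages)
print(round(avg, 1))  # 18.0
```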
This isn’t nearly as bad as I had feared it would be, although I’m disappointed that none of the three units beat spec.
Here is the data:
[Table: Specification vs. Actual annual energy consumption (kWh/yr) for the Samsung Side by Side, GE Profile Series*, and Sub-Zero 650; the measured values did not survive in this copy of the post]

*estimated based on year 2012 specs, less ~2%/yr. avg. efficiency gains for refrigeration
The DOE has published their refrigerator test method guidelines, entitled “Energy Conservation Program for Consumer Products: Test Procedures for Refrigerators, Refrigerator-Freezers, and Freezers”. It’s an incredibly detailed document, and it’s clear there is a lot of thought and analysis behind it. While I know that three measly data points is hardly an indictment of an entire process, it points to a potential issue.
My personal theory is that because checking the “mileage” of a fridge is so much more difficult for consumers than checking the mileage of a car (that is, until the eMonitor!), the process agreed to by the manufacturers and DOE has been skewed to test conditions that, while maybe not scientifically unreasonable, are favorable to less energy consumption and not necessarily typical of real-world use.
SiteSage customers: want to contribute to the discussion? Send me your refrigerator make, model, and year. We can gather your data and include it in our survey.
Anyone want to wager their system meets or beats the manufacturer’s spec?