Trakar
...Everyone knows it takes more energy in the winter to get up to 67F than it does in the summer to get down to 67. Why? Because the cold is much colder than the hot is hotter.
Science.
I thought you would have been able to see from my example it takes more electricity (the money) in the winter to heat than in the summer to cool.
Surely you jest?
The primary issue is first to concentrate on building codes, to ensure properly constructed and insulated homes; then to look at the heating and cooling systems in use; and only then to compare the times when environmental conditions require heating or cooling.
On a per-unit basis, looking simply at a one-degree change, the amount of heat that must be moved is the same whether we are warming or cooling. Additionally, there are personal "comfort" differences.
(Most people I know don't turn the air-conditioning on until inside temps exceed 80F/27C, and usually set the thermostat at 77F/25C. In the winter, heating generally isn't activated until inside temps drop into the mid-to-low 50sF/10-13C, and the thermostat is generally set at 65F/18C. For my part of the planet, this means our heating and cooling periods are roughly equal, at about 2-2.5 months each out of the year.)
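To make the per-degree symmetry concrete, here's a quick back-of-envelope Python sketch. The house volume and air properties are assumed round numbers, not measurements:

```python
# Back-of-envelope sketch (assumed numbers, not measured data): the
# sensible heat needed to shift indoor air by one degree is symmetric,
# Q = m*c*dT, regardless of direction. Setpoints and outdoor temps
# are what determine the seasonal totals.
AIR_DENSITY = 1.2  # kg/m^3, roughly, near sea level
AIR_CP = 1005.0    # J/(kg*K), specific heat of air

def sensible_heat_joules(volume_m3: float, delta_c: float) -> float:
    """Heat that must be added or removed to change the air temp by delta_c."""
    mass = AIR_DENSITY * volume_m3
    return mass * AIR_CP * abs(delta_c)

house = 350.0  # m^3, a modest house (assumed)
print(sensible_heat_joules(house, 1.0))   # warming by 1 C
print(sensible_heat_joules(house, -1.0))  # cooling by 1 C -- same magnitude
```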
The biggest difference comes from the equipment used to produce the heating or cooling, and from how well insulated and designed the structure being conditioned is.
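A similarly rough sketch shows why the envelope matters: steady-state loss or gain scales as U*A*dT, so halving the U-value (better insulation) halves the load regardless of what equipment is attached. The area and U-values below are illustrative assumptions:

```python
# Rough steady-state envelope load: Q = U * A * dT. Better insulation
# (lower U) cuts the load proportionally, whatever the equipment.
def envelope_load_watts(u_value: float, area_m2: float, delta_c: float) -> float:
    """Heat flow through the envelope in watts (a load on the HVAC)."""
    return u_value * area_m2 * abs(delta_c)

area = 300.0  # m^2 of wall/roof/window area (assumed)
for u in (1.0, 0.5, 0.25):  # W/(m^2*K): poor -> code-built -> well insulated
    print(u, envelope_load_watts(u, area, 20.0), "W at a 20 C difference")
```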
Average U.S. temperature increases by 0.5 degrees F
New 1981-2010 'normals' to be released this week
http://www.noaanews.noaa.gov/stories2011/20110629_newnormals.html
Trading heating for cooling is rarely an even trade. I've never seen a situation where, on a per-unit basis, it was cheaper and easier to cool from a given temperature than to heat by the same increment, mainly because there are relatively cheap means of heating, while cooling tends to be expensive and much more complicated.
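For a rough sense of the per-unit costs, here's a hedged comparison using assumed fuel prices (plug in your own utility rates). With these numbers, a 90%-efficient gas furnace delivers heat for about a penny per MJ, electric resistance heat costs roughly three times that, and an air conditioner at COP 3 lands in between; and the AC is by far the more complicated machine:

```python
# Hedged cost comparison per MJ of heat delivered or removed, using
# *assumed* prices -- swap in your own utility rates and equipment specs.
GAS_PER_THERM = 1.00   # $ per therm of natural gas (assumed)
ELEC_PER_KWH = 0.12    # $ per kWh of electricity (assumed)
THERM_MJ = 105.5       # MJ per therm
KWH_MJ = 3.6           # MJ per kWh

gas_furnace = GAS_PER_THERM / (THERM_MJ * 0.90)  # 90%-efficient furnace
resistance = ELEC_PER_KWH / (KWH_MJ * 1.00)      # electric baseboard heat
ac_cop3 = ELEC_PER_KWH / (KWH_MJ * 3.0)          # AC moving heat at COP 3

print(f"gas furnace : ${gas_furnace:.4f}/MJ")
print(f"resistance  : ${resistance:.4f}/MJ")
print(f"AC (COP 3)  : ${ac_cop3:.4f}/MJ")
```

The point isn't the exact figures; it's that the per-unit comparison swings with fuel prices and equipment, which is exactly why the trade is rarely even.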
