
a greenhouse effect question

sol invictus
I have a simple question about the greenhouse effect. It may or may not have a simple answer. I apologize in advance if this has been discussed here before; if so, a link to point me to the right thread would be greatly appreciated.

My cartoonish understanding of the greenhouse effect is the following: certain gases (such as CO2 and water vapor) absorb more in the IR than they do in the visible. So when sunlight comes to the earth a little gets absorbed and re-radiated into space, but most gets through and strikes the earth. It is then re-radiated in the IR (at the much lower temperature of the earth), which means much more of it is absorbed and re-radiated back to the earth by the atmosphere than was reflected when it originally came in from the sun.

So in other words the atmosphere is more opaque to radiation coming from the earth than it is to radiation coming from the sun. It acts very roughly like a barrier made of one-way glass, and so if you increase the greenhouse gas concentration and make that effect stronger, the earth warms. Simple enough.
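
For what it's worth, that "one-way glass" picture can be put in numbers with a toy one-layer model (a rough Python sketch; the solar constant, albedo and absorptivity values below are assumed round numbers, not anything from this thread):

    # Toy one-layer greenhouse: the atmosphere is transparent to sunlight but
    # absorbs a fraction eps of the IR the surface emits, and radiates half of
    # what it absorbs back down.  A sketch only, not real radiative transfer.
    sigma = 5.67e-8                     # Stefan-Boltzmann constant, W m^-2 K^-4
    S0, albedo = 1361.0, 0.3            # solar constant and planetary albedo (assumed)
    absorbed = S0 * (1 - albedo) / 4.0  # ~238 W/m^2, averaged over the sphere

    def surface_temp(eps):
        """Surface temperature (K) for IR absorptivity eps; eps = 0 means no greenhouse."""
        return (absorbed / (sigma * (1.0 - eps / 2.0))) ** 0.25

    for eps in (0.0, 0.5, 0.78, 1.0):
        print(eps, round(surface_temp(eps), 1))
    # eps = 0.0  -> ~255 K (no IR-absorbing atmosphere)
    # eps = 0.78 -> ~288 K (roughly today's mean surface temperature)
    # eps = 1.0  -> ~303 K (a single fully opaque layer)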

But here's where I get stuck. Gases absorb only in certain frequency bands. The most they can possibly do is be completely opaque in a certain band. As the concentration of the gas increases, the opacity at that band increases too, but it saturates at 1. So cranking up the amount of CO2 in the atmosphere (by this simple logic) will have less and less effect as the concentration gets higher and higher.
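
To make that intuition concrete, here is a toy Beer-Lambert sketch at a single frequency (the absorption strength is a made-up number, nothing to do with real CO2):

    # Absorbed fraction at the band centre as the amount of gas doubles:
    # once the centre is nearly opaque, further doubling changes almost nothing.
    from math import exp
    k = 1.0                              # assumed absorption strength per unit amount
    for amount in (0.5, 1, 2, 4, 8, 16):
        print(amount, round(1.0 - exp(-k * amount), 4))
    # 0.5 -> 0.3935, 1 -> 0.6321, 2 -> 0.8647, 4 -> 0.9817, 8 -> 0.9997, 16 -> 1.0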

Now of course these absorption bands are broadened by Doppler effects, but those fall off very rapidly with frequency, so increasing the concentration at fixed temperature really doesn't seem to do much even taking that into account (past a certain point at least). And it seems (based on the current concentration of CO2) that we should already be close to that point - in other words that the absorption is already quite significant at current CO2 levels.

Is this wrong somehow? Is it just that non-linear feedback effects in the climate more than compensate for this... maybe because water vapor is more important than CO2? Or am I missing something basic?
 
The explanation is rather simple, though you might not prefer to hear it from me.

They're using a logarithmic scale so a doubling of CO2 concentration gives a certain amount of temperature rise. We're nowhere near complete absorption in the particular band of most interest, which is, IIRC, the two bend modes of the CO2 molecule, at a wavenumber of around 667 cm⁻¹. That frequency is particularly important because it's a hole in the spectrum of water vapor, so saturation there comes only from the CO2; in addition, that hole happens to be near the peak of blackbody radiation for Earth's temperature, so it's quite important to the heat budget.
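
A quick sanity check on the "near the peak" part (a Python sketch; the 1.96 cm⁻¹ per kelvin figure is Wien's displacement law for the Planck curve expressed per unit wavenumber):

    # Wavenumber of the Planck-curve peak (per-wavenumber form of Wien's law)
    # for Earth's effective and surface temperatures.
    for T in (255.0, 288.0):
        print(T, round(1.96 * T), "cm^-1")
    # Both come out in the 500-570 cm^-1 range, so the 667 cm^-1 CO2 band sits
    # close to the peak of Earth's thermal emission, right where blocking
    # outgoing IR matters most.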
 
I have a simple question about the greenhouse effect. It may or may not have a simple answer. I apologize in advance if this has been discussed here before; if so, a link to point me to the right thread would be greatly appreciated.

http://www.realclimate.org/index.php/archives/2007/06/a-saturated-gassy-argument/langswitch_lang/in

But here's where I get stuck. Gases absorb only in certain frequency bands. The most they can possibly do is be completely opaque in a certain band. As the concentration of the gas increases, the opacity at that band increases too, but it saturates at 1. So cranking up the amount of CO2 in the atmosphere (by this simple logic) will have less and less effect as the concentration gets higher and higher.

Your informed layman's understanding is fine, but as it turns out we're far from saturation. From the above link:

"The breakthroughs that finally set the field back on the right track came from research during the 1940s. Military officers lavishly funded research on the high layers of the air where their bombers operated, layers traversed by the infrared radiation they might use to detect enemies. Theoretical analysis of absorption leaped forward, with results confirmed by laboratory studies using techniques orders of magnitude better than Ångström could deploy. The resulting developments stimulated new and clearer thinking about atmospheric radiation.


Among other things, the new studies showed that in the frigid and rarified upper atmosphere where the crucial infrared absorption takes place, the nature of the absorption is different from what scientists had assumed from the old sea-level measurements. Take a single molecule of CO2 or H2O. It will absorb light only in a set of specific wavelengths, which show up as thin dark lines in a spectrum. In a gas at sea-level temperature and pressure, the countless molecules colliding with one another at different velocities each absorb at slightly different wavelengths, so the lines are broadened and overlap to a considerable extent. Even at sea level pressure, the absorption is concentrated into discrete spikes, but the gaps between the spikes are fairly narrow and the "valleys" between the spikes are not terribly deep. (see Part II) None of this was known a century ago. With the primitive infrared instruments available in the early 20th century, scientists saw the absorption smeared out into wide bands. And they had no theory to suggest anything different."

As another informed layman, I found that educational :).
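
To see that point in miniature, here is a toy curve-of-growth sketch (a single Lorentzian line with arbitrary made-up numbers, not real CO2 spectroscopy): the line centre goes black early, but the optically thin wings keep soaking up more radiation as the amount of gas rises.

    # Band-averaged absorbed fraction over a spectral window around one
    # Lorentzian line, for increasing amounts of absorber.  The centre
    # saturates quickly, yet the total keeps climbing because the wings
    # are still optically thin.
    from math import exp, pi

    gamma = 1.0                                      # assumed line half-width
    nus = [-200.0 + 0.1 * i for i in range(4001)]    # frequency offsets from centre

    def band_absorption(amount):
        total = 0.0
        for nu in nus:
            tau = amount * (gamma / pi) / (nu * nu + gamma * gamma)
            total += 1.0 - exp(-tau)
        return total / len(nus)

    for amount in (1, 10, 100, 1000, 10000):
        print(amount, round(band_absorption(amount), 3))
    # The absorbed fraction keeps growing with each factor of ten, long after
    # the line centre itself has gone completely black.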

Now of course these absorption bands are broadened by Doppler effects, but those fall off very rapidly with frequency, so increasing the concentration at fixed temperature really doesn't seem to do much even taking that into account (past a certain point at least). And it seems (based on the current concentration of CO2) that we should already be close to that point - in other words that the absorption is already quite significant at current CO2 levels.

It is significant; it's way more comfortable down here than on the Moon, after all. It's about a 30 Kelvin effect. Another three Kelvin on top of that - just 10% - is enough to have a significant impact on society as it's currently arranged, all 6+ billion of us.
 
They're using a logarithmic scale so a doubling of CO2 concentration gives a certain amount of temperature rise. We're nowhere near complete absorption in the particular band of most interest, which is, IIRC, the two bend modes of the CO2 molecule, at a wavenumber of around 667 cm⁻¹. That frequency is particularly important because it's a hole in the spectrum of water vapor, so saturation there comes only from the CO2; in addition, that hole happens to be near the peak of blackbody radiation for Earth's temperature, so it's quite important to the heat budget.

And what Schneibster said.

Awesome.

667 or so? You're not just kidding around?


(666i, the Number of the Beast's BMW, really cracks me up :).)
 
Sure, no problem.

Very far. The current concentration of CO2 in the atmosphere is a few hundred parts per million. Before we got started with industrial processes, it was around 280 ppm. It's been as high as seven thousand or so ppm in the history of the Earth; that was hot enough to decrease the oxygen-carrying capacity of the ocean sufficiently to kill off most everything that lived in the shallows, and down a ways toward the continental shelves. One of the major extinction events is currently theoretically linked to that, and it's a big one. 95%+ of everything that lived in the ocean died.

If we keep up what we're doing, we'll have the concentration up above five hundred ppm by the end of the century - getting on for a doubling from before we started burning a lot of coal and oil. According to current models, that implies the heat retained will increase the temperature by several degrees, and that's a lot. In addition, lots of latent heat of melting from the ice will be absorbed, but once the ice is gone, the temperature will climb more quickly.
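
For what that implies in rough numbers (a sketch: the 5.35 W/m² coefficient is the standard logarithmic fit for CO2 forcing, and the sensitivity figure is an assumed round mid-range value, not something from this thread):

    # Back-of-the-envelope "doubling gives several degrees".
    from math import log
    forcing = 5.35 * log(560.0 / 280.0)   # ~3.7 W/m^2 for doubled CO2
    sensitivity = 0.8                     # assumed K of warming per W/m^2, feedbacks included
    print(round(forcing, 2), round(sensitivity * forcing, 1))   # ~3.7 W/m^2, ~3 K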

It's worth noting that it takes a while for the heat to work; there's the latent heat of ice, there's the ocean absorbing heat, and other effects that delay the temperature response, plus the fact that it takes a while for the radiation balance between surface and atmosphere to even out through absorption and re-emission. So even if we stop making CO2 right now, there'll still be a lag (decades?) before it stops getting hotter.
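
A crude way to see why the lag is measured in decades (a sketch treating the ocean mixed layer as a slab that has to warm before the surface can settle at its new balance; the depth and feedback values are assumed round numbers):

    # e-folding time of the surface temperature response, mixed layer only.
    depth = 100.0                              # m of well-mixed ocean (assumed)
    heat_capacity = 1000.0 * 4186.0 * depth    # J per m^2 per K (density x specific heat x depth)
    feedback = 1.25                            # W/m^2 of extra outgoing radiation per K (assumed)
    years = heat_capacity / feedback / 3.15e7  # seconds per year ~3.15e7
    print(round(years, 1))                     # ~10 years; deep-ocean mixing stretches it to decades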

Various effects of this sort, including absorption by the ocean, mitigate the temperature increase in various ways, or we'd already be in serious trouble. There is, in other words, a certain amount of homeostasis in the system. There are, however, limits to these effects: there are limits to how much heat the ocean can absorb and to how much latent heat can be used up melting ice, for example. The nightmare is that we surpass one of these limits and the rate at which rising CO2 raises the temperature goes up. We have no idea where these limits might lie. Most likely they're quite a ways up, but it's becoming apparent that there is one limit we're very close to, in terms of latent heat of melting in the Arctic, since pretty soon there won't be much if any ice left there in the summer. Any heat absorbed beyond that isn't as easily lost again when the ice freezes in the winter.

To give you a realistic idea of what we're talking about here, that extinction event I talked about was the Permian extinction, of which you might have heard here and there, and the CO2 got taken back out of the atmosphere when the plants invaded the land, during the Carboniferous. That's when most of the coal deposits we know about got laid down, and the carbon in that coal is the carbon that caused the Permian extinction. What do you suppose conditions on Earth's surface will be like if we burn even a significant fraction of that?

I seriously doubt we'll be in very bad trouble before the end of the century, but if we're still doing then what we're doing now, it's not going to be much beyond that before we are, and it will by that time be inevitable. It might be inevitable by mid-century. And that's only a few decades away. We've already screwed off for nearly three decades, and it took a century before we even found out what was happening. We probably ought to get on this pretty soon.
 
...if you increase the greenhouse gas concentration and make that effect stronger, the earth warms. Simple enough.

...maybe because water vapor is more important than CO2? Or am I missing something basic?

just to add a bit to the main points already made by Schneibster and CapelDodger: there is a basic complication on the earth (not on the moon) that we live on the surface. thinking of the moon as a black body is a bit more effective than thinking of the earth as one, since the moon has only "one" surface: the solid surface is effectively the location of absorption, radiation and interest.

firstly, on the earth there is also energy transport that is NOT due to radiation, but by latent heat (water plays several roles, by evaporating at the surface and then condensing at altitude, by absorbing as a vapour, and by reflecting both up (solar) and down (IR) as cloud). as vapour, it can transport energy from equator towards pole.

secondly viewed from space, the earth does NOT radiate (only) from its surface but from various levels in the atmosphere as well. even if one imagines keeping the effective (black body) temperature of the earth fixed, changing the effective altitude at which radiation "escapes to space" there can still be a nontrivial impact on the place we happen to care most about (the solid surface).

in practice, of course, ideas based on "effective altitude" are limited by frequency dependence of absorption at higher levels; my point is only to illustrate that both the temperature structure and the inhomogeneous composition of the atmosphere introduce thermodynamics that complicate the radiation picture. easily neglected on the moon, something basic on the earth.
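
a toy version of the effective-altitude picture, with assumed round numbers (the 6.5 K/km lapse rate is a typical tropospheric value, and the 5 km emission altitude is just an illustration):

    # surface temperature = effective radiating temperature + lapse rate x emission altitude
    T_eff = 255.0     # K, set by the sunlight the planet absorbs
    lapse = 6.5       # K of cooling per km of altitude (assumed typical value)
    z_emit = 5.0      # km, assumed effective emission altitude
    print(T_eff + lapse * z_emit)            # ~288 K at the surface
    # more greenhouse gas pushes the emission level into higher, colder air;
    # lifting it by ~0.15 km leaves the surface ~1 K warmer for the same T_eff
    print(T_eff + lapse * (z_emit + 0.15))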
 
secondly viewed from space, the earth does NOT radiate (only) from its surface but from various levels in the atmosphere as well. even if one imagines keeping the effective (black body) temperature of the earth fixed, changing the effective altitude at which radiation "escapes to space" there can still be a nontrivial impact on the place we happen to care most about (the solid surface).

in practice, of course, ideas based on "effective altitude" are limited by frequency dependence of absorption at higher levels

The higher the altitude, would not a larger percentage of re-radiated energy escape to space? It should be on the order of 0.1-0.2% more energy escaping than going back to the planet from an altitude of 5 km.
 
