This week, the American Geophysical Union is holding its annual fall meeting, which always includes a whole slew of fascinating scientific presentations. You can read handy summaries of some of the talks over at the AGU's blog—here's a session on what more carbon dioxide in the air will mean for plant life; here's a more frivolous session on whether the discovery of aliens would shake humanity's faith in religion (according to surveys, the answer appears to be no).
But one of the highlights—at least from a climate-science perspective—appears to have been Richard Alley's lecture on the role of carbon dioxide in explaining historical changes to the Earth's climate. Here's a terrific write-up of the talk, and it's really worth your time if you're interested in the subject. Alley explains how we can measure what CO2 levels were millions of years ago, examines some of the "mysteries" in our planet's geological record (like the "snowball Earth" debate), and makes the case for why carbon dioxide has always been the dominant driver of major climate shifts. (The difference, of course, is that changes in CO2 levels have historically been natural and transpired over millions of years—this time around, humans are setting things in motion much more quickly.)
Alley also discusses the IPCC's estimate that we'll see a roughly 3°C (5.4°F) average rise in temperatures for every doubling of carbon dioxide in the air. Many people are under the impression that this forecast is based solely on wonky computer models. But as Alley points out, you can check this against the historical record. And, for the last 425 million years, Earth's "climate sensitivity" has been remarkably consistent—global average temperature rises about 3°C for every doubling of CO2. One of the big uncertainties is that occasionally the Earth's climate appears even more sensitive than that—and that's where much of the current research and debate is focused.
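To make the "per doubling" arithmetic concrete, here's a minimal sketch of how that relationship is usually expressed: warming scales with the number of CO2 doublings, not with the raw increase in concentration. The function name, the 3°C sensitivity value, and the ppm figures below are illustrative assumptions, not numbers taken from Alley's lecture.

```python
import math

def expected_warming(co2_ppm, baseline_ppm, sensitivity_per_doubling=3.0):
    """Rough equilibrium warming (in °C) from a change in CO2 concentration.

    Assumes the standard logarithmic relationship: warming is
    sensitivity * log2(C / C0), i.e. proportional to the number of doublings.
    """
    doublings = math.log2(co2_ppm / baseline_ppm)
    return sensitivity_per_doubling * doublings

# One full doubling (e.g. a ~280 ppm pre-industrial baseline to 560 ppm)
# gives the headline figure of about 3°C.
print(expected_warming(560, 280))              # -> 3.0

# A smaller rise, say 280 -> 400 ppm, is roughly half a doubling (~1.5°C).
print(round(expected_warming(400, 280), 2))    # -> ~1.54
```

The logarithmic form is why scientists talk about sensitivity "per doubling" rather than per ppm: each additional doubling of CO2 adds about the same increment of warming, whether it starts from 280 ppm or 560 ppm.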