
UNDERESTIMATING CLIMATE CHANGE March 27, 2012

Posted by wmmbb in co2 emissions.

Forecasting is a practical business. We are far better off listening to the weather forecast, even if it is wrong on 18 or 19 days of the year. At 95% accuracy, taking the umbrella when rain is forecast pays off far more often than not.
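
To make the umbrella arithmetic concrete, here is a minimal sketch. The cost figures are purely illustrative assumptions of mine, not anything from the forecasters; the point is only that a 95%-accurate forecast is worth acting on.

```python
# Illustrative expected-value arithmetic for trusting a 95%-accurate forecast.
# All cost figures are made-up assumptions for the sake of the example.
DAYS_PER_YEAR = 365
ACCURACY = 0.95

wrong_days = DAYS_PER_YEAR * (1 - ACCURACY)
print(f"Days per year the forecast misses: {wrong_days:.0f}")  # about 18

# Suppose carrying an umbrella costs 1 unit of inconvenience, while being
# caught without one in forecast rain costs 20 units (assumed numbers).
COST_CARRY = 1
COST_SOAKED = 20

# On a day rain is forecast: taking the umbrella costs the carrying nuisance
# regardless of outcome; ignoring the forecast risks a soaking 95% of the time.
expected_cost_take = COST_CARRY
expected_cost_ignore = ACCURACY * COST_SOAKED
print(f"Expected cost, take umbrella:   {expected_cost_take:.1f}")
print(f"Expected cost, ignore forecast: {expected_cost_ignore:.1f}")
```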

The costs of climate change when it is not anticipated, not just to infrastructure but to human welfare, are far greater. So climate change is a moral issue, not solely a scientific one.

Don Arthur at Club Troppo suggests:

The trouble is, most people don’t have time to become experts in every technical field that affects their lives. Modern society functions because we have a division of intellectual labour just as we have a division of physical labour. Attempting to live a physically self-sufficient life by growing your own food, building your own house and weaving your own cloth quickly leads to poverty. Attempting to live an intellectually self-sufficient life would have the same effect.

Don is too kind to observe that many of us do not have the cognitive ability either, and building local communal organizations that work effectively, while not impossible, would take time. It may also require a difficult cultural transition away from our individualist and materialist orientation. For some decisions, I would rather punt on the weather forecast. Yet since we live in a democracy, each of us has some responsibility.

One thing we might do is to familiarize ourselves with the science. Writing in Nature, John P. Smol observes that the effects of climate change can now be seen in animal life and agriculture, as well as in the more obvious shrinking of the Arctic ice cap. Providing useful background, at least for me, he writes:

Understanding climate change requires knowledge of the gases in the air. More than 99% of the air is nitrogen and oxygen, and the remaining 1% is a mixture of other gases, including carbon dioxide. For every million molecules in the air, there are fewer than 400 carbon dioxide molecules, stated as 400 parts per million (p.p.m.). Yet carbon dioxide has garnered a great deal of attention from scientists studying climate change because it is a ‘greenhouse gas’. Greenhouse gases absorb the heat emitted by Earth, warming the atmosphere and the planet.

Methane, ozone, nitrous oxide and chlorofluorocarbons are also greenhouse gases. They are less abundant than carbon dioxide but some are even more powerful. Methane, which is emitted by industry, bogs, rice paddies and belching cows, makes up just 1.8 p.p.m. of the atmosphere but is about 20 times more powerful as a greenhouse gas than carbon dioxide.

Despite their small contribution to the composition of the atmosphere, greenhouse gases have a huge effect on climate. Without them, the average temperature on Earth would be around −18 °C instead of a comfortable 14 °C. Because of their power, even slight changes in their concentration can have large effects on temperature.
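
The −18 °C figure quoted above can be recovered from a standard back-of-the-envelope radiation balance. The sketch below is a textbook calculation, not part of Smol’s article: balancing absorbed sunlight against blackbody emission, with no greenhouse effect, gives an effective temperature of roughly 255 K, about −18 °C.

```python
# Back-of-the-envelope effective temperature of Earth without a greenhouse effect.
# Standard textbook values, used here purely for illustration.
SOLAR_CONSTANT = 1361.0   # W/m^2, sunlight arriving at the top of the atmosphere
ALBEDO = 0.3              # fraction of sunlight reflected straight back to space
SIGMA = 5.670e-8          # Stefan-Boltzmann constant, W/(m^2 K^4)

# Absorbed sunlight, averaged over the whole sphere (the factor of 4 is geometry),
# must equal the blackbody emission sigma * T^4.
absorbed = SOLAR_CONSTANT * (1 - ALBEDO) / 4
t_effective = (absorbed / SIGMA) ** 0.25

print(f"Effective temperature: {t_effective:.0f} K "
      f"= {t_effective - 273.15:.0f} degrees C")   # roughly 255 K, i.e. about -18 C
```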

In the 1950s, the scientific debate around carbon dioxide focused on whether or not it was accumulating in the atmosphere. Globally, cars, factories and other activities that burn fossil fuels emitted more than a billion tonnes of carbon into the atmosphere annually in the 1950s, but many scientists believed that oceans and plants soaked up nearly all of it.

Against the backdrop of this debate, David Keeling, a chemist at the Scripps Institution of Oceanography in La Jolla, California, sought to find out. In 1957, Keeling set up an array of newly developed gas analysers on the summit of Hawaii’s Mauna Loa volcano to measure atmospheric levels of carbon dioxide. He chose the site because of its isolation and elevation (about 3,400 metres), which avoided local sources of carbon dioxide that might amplify the readings if the observatory were lower or in an industrial or urban setting. At the beginning of the experiment, the average monthly value was 315 p.p.m. Keeling saw the values drop from May to September and then rise again into the next year. The cycle continued, down and up, down and up, decreasing in summer when plants soak up carbon dioxide and grow, and rising again in autumn and winter. Looking at this pattern was like watching the planet breathe.

After a few years, Keeling spotted another trend: carbon dioxide levels were rising from one year to the next. Industry, transport and other activities were adding more carbon dioxide to the atmosphere every year, but the oceans couldn’t keep up. We now know that, over the long term, about half the carbon dioxide we add to the atmosphere stays there; the oceans absorb about 25% and plants soak up the remainder. By June 2011, the atmospheric carbon dioxide concentration had risen to 394 p.p.m.
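
As a purely illustrative sketch of the pattern Keeling saw, the record can be mimicked as a rising trend with a seasonal “breathing” cycle superimposed. The numbers below are rough values I have assumed only to reproduce the shape of the curve; they are not the actual Mauna Loa data.

```python
import math

# Toy imitation of the Keeling curve: a rising trend plus a seasonal cycle.
# All parameters are rough, illustrative values, not the measured record.
START_YEAR = 1958
START_PPM = 315.0          # roughly where the record began
TREND_PPM_PER_YEAR = 1.5   # crude average rise; the real rate has grown over time
SEASONAL_AMPLITUDE = 3.0   # ppm swing as northern-hemisphere plants grow and decay

def toy_co2(year, month):
    """Very rough CO2 concentration (ppm) for a given year and month."""
    years_elapsed = (year - START_YEAR) + (month - 1) / 12
    trend = START_PPM + TREND_PPM_PER_YEAR * years_elapsed
    # Peak around May, dipping through the growing season; the real cycle is
    # not a pure sinusoid (it bottoms out around September/October).
    seasonal = SEASONAL_AMPLITUDE * math.cos(2 * math.pi * (month - 5) / 12)
    return trend + seasonal

for month in (1, 5, 9):
    print(f"1960-{month:02d}: {toy_co2(1960, month):.1f} ppm")
```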

The current carbon dioxide level far exceeds its natural fluctuation (180–300 p.p.m.) over the past 800,000 years. Scientists know what the historical range was from studying the planet’s natural archives, such as tree rings, the sediments of lakes and oceans, and ice cores. These archives are known as proxy records.

The ice cores extracted from the ice sheets of Greenland and Antarctica provide scientists with climate data going back more than 800,000 years. As snow accumulates on the ground, it traps air bubbles, volcanic ash, dust and other substances in chronological order. The air bubbles provide a record of what gases were in the atmosphere at different points in time.

Scientists can also estimate the air temperature when the snow fell by measuring the ratio of two oxygen isotopes, oxygen-18 and oxygen-16. Generally, ice containing a lot of oxygen-18 was formed in warmer temperatures, with higher oxygen-16 levels indicating colder temperatures. Plotting a graph of temperature versus carbon dioxide concentration reveals a correlation between the two.
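
For readers curious how such isotope ratios are usually reported, the convention (not spelled out in the quoted passage) is the “delta” notation: a sample’s oxygen-18 to oxygen-16 ratio is expressed as a per-mil deviation from a reference standard. The sample ratio below is a made-up number for illustration.

```python
def delta_o18(r_sample, r_standard):
    """Delta-18O notation: per-mil deviation of a sample's 18O/16O ratio
    from a reference standard (e.g. an ocean-water standard such as VSMOW)."""
    return (r_sample / r_standard - 1) * 1000

# Illustrative (made-up) sample ratio: ice slightly depleted in oxygen-18
# relative to the standard gives a negative delta value, as polar ice does.
print(delta_o18(r_sample=0.0019800, r_standard=0.0020052))  # about -12.6 per mil
```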

The same proxy records show that human activities have influenced the climate for more than a century. Researchers use the term ‘anthropogenic climate change’ to describe climate change caused by human actions. Temperature records show that the planet warmed throughout the twentieth century. In 2010, the global average surface temperature was the second warmest on record, registering 0.96 °C above the twentieth-century average.

One degree may not seem like a lot, but this increase is a global average and the warming is not uniformly distributed across the planet; it is higher in some areas, including the polar regions. Small temperature changes in the Arctic are amplified through a system of positive feedbacks so a small increase in temperature leads to further warming in the Arctic.
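
The idea that positive feedbacks amplify a small initial warming can be put in a standard textbook form. This is a generic illustration of the mechanism with assumed numbers, not a calculation from Smol’s article: if a fraction f of any warming comes back as further warming, the final response is the initial response divided by (1 − f).

```python
def amplified_warming(initial_warming_c, feedback_fraction):
    """Textbook feedback gain: final warming = initial / (1 - f), for 0 <= f < 1."""
    if not 0 <= feedback_fraction < 1:
        raise ValueError("feedback fraction must be in [0, 1) for a stable response")
    return initial_warming_c / (1 - feedback_fraction)

# Illustrative numbers only: a 1 degree direct warming, with feedbacks that return
# half of any warming as additional warming, ends up doubled.
print(amplified_warming(1.0, 0.5))   # 2.0 degrees
# Stronger regional feedbacks (e.g. melting ice exposing dark ocean) mean a larger
# f, which is one way to picture why the Arctic warms faster than the average.
print(amplified_warming(1.0, 0.7))   # about 3.3 degrees
```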

And the ABC program The World Today noted that previous forecasts, including those published by the IPCC, may have underestimated the increase in mean global temperatures driven by greenhouse gases. David Mark reports:

One of the certainties about predicting climate change is uncertainty, which is why climate change professor David Frame and 26 of his colleagues from around the world have tried to narrow things down.

“We set out to look at how a large range of climate models could try to span a range of uncertainties to try to get a better handle on the sort of range of plausible climates we might see in the next half century and beyond,” said Professor Frame, who works at the Victoria University of Wellington.

“Generally people build a model and they spend a lot of resources on doing so and they try to make it as good as they can. But when everybody tries for their best-shot model, it doesn’t necessarily mean that you systematically explore all the possible uncertainties.”

When it comes to climate change, there are many variables, including cloud cover, ocean temperatures and land temperatures. So Professor Frame and his colleagues took one of the world’s best-known climate models and tweaked some of the parameters.

They then asked 10,000 people around the world to run these new models through their home computers, assuming nothing is done to reduce greenhouse gas emissions.

“If people keep emitting fossil fuels in the way we expect, with no price on carbon or no future policy initiatives, we expect a range of 1.4 to 3 degrees by 2050,” he said.

Those numbers are based on average temperatures between 1960 and 1990.

At the bottom end it is similar to the last prediction made by the IPCC, but it exceeds that group’s prediction at the higher end.

“What we’ve kind of got is just a broader sweep of that uncertainty range,” Professor Frame said.

“So it’s not just about the headline result numbers, it’s actually about the physical understanding we can get to with this sort of approach.”

He says the world is most probably somewhere in the middle range rather than at the extremes.

“But it makes me think that people who are thinking about real-world problems, farmers, wine growers in Australia, people managing river catchments for instance, might want to have a look at some of these models to think about what … might plausibly happen, what sorts of changes they might plausibly have to manage for,” he said.
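
The experiment David Mark describes is a perturbed-parameter ensemble: many copies of a model, each with its uncertain parameters nudged, run to map out a range of plausible outcomes. The sketch below is only a toy illustration of that idea with a deliberately crude stand-in relation and made-up parameter ranges; the actual study ran a full climate model on thousands of volunteers’ home computers.

```python
import random

# Toy perturbed-parameter ensemble. A crude stand-in relation and assumed
# parameter ranges are used purely to show how spreading the uncertain
# parameters spreads the projected warming; this is not the study's model.
random.seed(0)

def toy_warming_by_2050(climate_sensitivity_c, forcing_fraction):
    """Crude stand-in: warming scales with an assumed sensitivity parameter and
    with the fraction of the doubled-CO2 forcing realised by mid-century."""
    return climate_sensitivity_c * forcing_fraction

ensemble = []
for _ in range(10_000):
    # Perturb the uncertain parameters within assumed, illustrative ranges.
    sensitivity = random.uniform(2.0, 4.5)        # degrees C per CO2 doubling (assumed)
    realised_fraction = random.uniform(0.5, 0.8)  # share of that forcing felt by 2050
    ensemble.append(toy_warming_by_2050(sensitivity, realised_fraction))

ensemble.sort()
low, mid, high = ensemble[500], ensemble[5000], ensemble[9500]
print(f"5th-95th percentile warming: {low:.1f}-{high:.1f} C (median {mid:.1f} C)")
```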

Forecasting the climate is more difficult and fraught than forecasting the weather. The timelines are longer and there are more variables to consider, including the multiplying effects of feedbacks in a multivariate system. Still, even if some assumptions prove incorrect, the attempts are brave. Here is one summary:

The recent rise in temperature over the last century of around 0.75C is different to previous climatic change because it involves a new forcing: greenhouse gas emissions from human activity, which have driven atmospheric CO2 to its highest level for 15m years. Using climate models, scientists have shown that natural forcings alone can’t account for the rise in average global temperature. But the temperature rise does fit with what scientists would expect from a mixture of natural and human forcings.

The rate at which humans are releasing CO2 into the atmosphere is exceptional. Over the last 15m years atmospheric CO2 has regularly risen and fallen – due to natural processes – between 180 and 280 parts per million (ppm). Each time carbon dioxide increased by 100ppm it took between 5,000 and 20,000 years. In contrast, it has taken only 120 years since the Industrial Revolution for atmospheric carbon dioxide to increase by around 100ppm due to human activity.
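
The contrast in rates is easy to put in numbers. A quick sketch using only the figures quoted above:

```python
# Rate comparison using the figures quoted in the passage above.
natural_rise_ppm = 100
natural_duration_years = (5_000, 20_000)   # quoted range for a 100 ppm natural rise
industrial_rise_ppm = 100
industrial_duration_years = 120

industrial_rate = industrial_rise_ppm / industrial_duration_years   # ~0.83 ppm/yr
for years in natural_duration_years:
    natural_rate = natural_rise_ppm / years
    print(f"Natural rise over {years:,} years: {natural_rate:.3f} ppm/yr; "
          f"the recent rise is ~{industrial_rate / natural_rate:.0f}x faster")
```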

Postscript:

Of course, I talk with taxi drivers and other people, so I am aware that not everybody goes to the many available scientific sources on climate change. These people, much like me, get their news from the media. Elaine McKewon’s article, “Think tanks’ talking points deepen the divide over climate change”, originally published in The Conversation (via Blogotariat), suggests that “fantasy themes” were propagated by the Institute of Public Affairs. McKewon summarizes her research there. In essence, the method these propagandists apply is extremely dangerous and violent: it creates an enemy, and in doing so generates implicit violence and delegitimizes democratic debate. So when, for example, Richard Falk observes that there is no climate debate in the US, that is no coincidence. It has been engineered by the use of these methods.

Paul Norton at Larvatus Prodeo refers to the Planet Under Pressure conference in London.

