Saturday, January 14, 2012

Ozone holers expect a cooling Arctic!

What happened to all that melting that was supposed to be going on in the Arctic? Ya gotta laugh! To explain why the Montreal ban on CFCs is not stopping ozone depletion, they let global cooling in the door.
GEOPHYSICAL RESEARCH LETTERS, VOL. 38, L24814, 5 PP., 2011
doi:10.1029/2011GL049784

Arctic winter 2010/2011 at the brink of an ozone hole

B.-M. Sinnhuber et al.

Abstract

The Arctic stratospheric winter of 2010/2011 was one of the coldest on record with a large loss of stratospheric ozone.

Observations of temperature, ozone, nitric acid, water vapor, nitrous oxide, chlorine nitrate and chlorine monoxide from the Michelson Interferometer for Passive Atmospheric Sounding (MIPAS) onboard ENVISAT are compared to calculations with a chemical transport model (CTM). There is overall excellent agreement between the model calculations and MIPAS observations, indicating that the processes of denitrification, chlorine activation and catalytic ozone depletion are sufficiently well represented. Polar vortex integrated ozone loss reaches 120 Dobson Units (DU) by early April 2011. Sensitivity calculations with the CTM give an additional ozone loss of about 25 DU at the end of the winter for a further cooling of the stratosphere by 1 K, showing locally near-complete ozone depletion (remaining ozone <200 ppbv) over a large vertical extent from 16 to 19 km altitude.

In the CTM a 1 K cooling approximately counteracts a 10% reduction in stratospheric halogen loading, a halogen reduction that is expected to occur in about 13 years from now. These results indicate that severe ozone depletion like in 2010/2011 or even worse could appear for cold Arctic winters over the next decades if the observed tendency for cold Arctic winters to become colder continues into the future.

SOURCE







Soot making a comeback -- in reverse: now a warmer and not a cooler

The soot man has been saying this for a long time, but the conventional view used to be that soot had a shading effect.

Der Spiegel reports today that scientists have identified soot (black carbon) as one of the major global warmers out there.

According to the journal Science, a team of 24 experts led by NASA scientist Drew Shindell looked at 400 emission control measures and identified 14 measures targeting methane and black carbon (BC) emissions that would reduce projected global mean warming.

Recently scientists and activists have been frustrated by the slow progress and dogged reluctance by countries to cap CO2 emissions, which are thought to be causing global warming. So Shindell looked for alternative ways to avert warming. Suddenly, lo and behold, soot (BC) and methane have emerged as major global warming factors. The amount they admit soot and methane contribute to warming is in my view astonishing. The abstract states:

"We considered ~400 emission control measures to reduce these pollutants by using current technology and experience. We identified 14 measures targeting methane and BC emissions that reduce projected global mean warming ~0.5°C by 2050.”

This equals the total amount of warming we’ve seen in the last 40 years!

Now scientists are telling us that soot and methane will have the same effect that CO2 is claimed to have had over the last 40 years? Whatever happened to the assertion that man-made CO2 has caused 95% of the warming over the last decades? Obviously CO2 as a driver is seriously getting cut down to size. Throw in the emerging solar effects and there isn’t much left for poor old CO2.

The abstract continues:

"This strategy avoids 0.7 to 4.7 million annual premature deaths from outdoor air pollution and increases annual crop yields by 30 to 135 million metric tons due to ozone reductions in 2030 and beyond. Benefits of methane emissions reductions are valued at $700 to $5000 per metric ton, which is well above typical marginal abatement costs (less than $250). The selected controls target different sources and influence climate on shorter time scales than those of carbon dioxide–reduction measures. Implementing both substantially reduces the risks of crossing the 2°C threshold.”

No need to worry any longer about a doubling of CO2 concentrations. Indeed CO2 as a driver and its hypothesized positive feedbacks simply aren’t materializing. We haven’t seen any warming in 15 years. Now scientists are realizing that soot is a big league player.

Der Spiegel writes:

"About 3 billion people prepare their meals over open fires that burn wood, dung or coal, and thus emit huge amounts of soot. However attempts to get people in Africa and Asia to get interested in other cooking devices have often proven to be difficult."

Of course it has been difficult. When idiot bureaucrats attempt (and are successful) to slow down progress, people remain poor and all they have left to burn is wood. But if they promote growth, free markets and development so that poor countries can attain western standards of living, then they will be able to afford to burn cleaner fuels like gas and oil. And if someday they should get really rich, they too will be able to afford wind and solar energy.

SOURCE

Full abstract:
Simultaneously Mitigating Near-Term Climate Change and Improving Human Health and Food Security

Drew Shindell et al

Tropospheric ozone and black carbon (BC) contribute to both degraded air quality and global warming. We considered ~400 emission control measures to reduce these pollutants by using current technology and experience. We identified 14 measures targeting methane and BC emissions that reduce projected global mean warming ~0.5°C by 2050. This strategy avoids 0.7 to 4.7 million annual premature deaths from outdoor air pollution and increases annual crop yields by 30 to 135 million metric tons due to ozone reductions in 2030 and beyond. Benefits of methane emissions reductions are valued at $700 to $5000 per metric ton, which is well above typical marginal abatement costs (less than $250). The selected controls target different sources and influence climate on shorter time scales than those of carbon dioxide–reduction measures. Implementing both substantially reduces the risks of crossing the 2°C threshold.

SOURCE








New paper finds no change in Antarctic snowmelt since measurements began in 1979

Since 91% of the earth's glacial ice is held in Antarctica, this dynamites Warmist scares about rising sea levels

A paper published today in Geophysical Research Letters finds no significant change in Antarctic snowmelt over the entire 31 year period of satellite observations 1979-2010. The paper actually shows a declining trend in snowmelt over the past 31 years, although not statistically significant. Of note, the abstract states,

"other than atmospheric processes likely determine long-term ice shelf stability." Translation: increased CO2 and other 'greenhouse gases' do not threaten stability of the Antarctic ice shelf.


Meltwater volume for the Antarctic continent (top graph) shows a declining (statistically insignificant) trend since satellite observations began in 1979

SOURCE

Full abstract below:
GEOPHYSICAL RESEARCH LETTERS, VOL. 39, L01501, 5 PP., 2012

Insignificant change in Antarctic snowmelt volume since 1979


By P. Kuipers Munneke et al.

Surface snowmelt is widespread in coastal Antarctica. Satellite-based microwave sensors have been observing melt area and duration for over three decades. However, these observations do not reveal the total volume of meltwater produced on the ice sheet. Here we present an Antarctic melt volume climatology for the period 1979–2010, obtained using a regional climate model equipped with realistic snow physics. We find that mean continent-wide meltwater volume (1979–2010) amounts to 89 Gt y−1 with large interannual variability (σ = 41 Gt y−1). Of this amount, 57 Gt y−1 (64%) is produced on the floating ice shelves extending from the grounded ice sheet, and 71 Gt y−1 in West-Antarctica, including the Antarctic Peninsula. We find no statistically significant trend in either continent-wide or regional meltwater volume for the 31-year period 1979–2010.

SOURCE







Winning A Climate Bet

The test of any scientific theory is its ability to make consistently accurate predictions. Below we see that it was a skeptic, not a Warmist, who made the accurate prediction

Dr. David Whitehouse

Predictions, Niels Bohr once said, are difficult, especially about the future. They are even more interesting, however, when there is money at stake.

In December 2007 I wrote what I thought was quite a straightforward article for the New Statesman pointing out that it was curious that when so many voices were telling us that global warming was out of control, and that the global warming effect dwarfed natural fluctuations, the global annual average temperature hadn’t increased for many years. I wasn’t promoting any particular point of view just describing the data. The New Statesman jumped at it.

It caused quite a storm, drawing a record number of comments for the site, the large majority of them complimentary, although there were some less than supportive remarks. It evidently also caused quite a fuss in the offices of the New Statesman. Realclimate.com responded with, in my view, an unsatisfactory knock-down of my piece based on trend lines, which I had expected. Trend lines, especially of indeterminate length in the presence of noise, can tell you almost anything, and nothing.

The New Statesman environment correspondent Mark Lynas chipped in eventually with, “I’ll be blunt. Whitehouse got it wrong – completely wrong,” after saying he was initially reluctant to comment. He reproduced Realclimate.com’s trendlines argument and accused me of deliberately or otherwise setting out to deceive. His was a scientifically ignorant article, as subsequent events and the peer-reviewed literature have emphasised. Moreover, when I asked the New Statesman for redress against such an unnecessary and, in my view, unprofessional insult, they declined, and stopped answering my emails. In doing so they missed out on an important, though perhaps inconvenient, scientific story.

More or Less

To my surprise interest in my article was worldwide, and eventually the BBC’s radio programme “More or Less” got in touch. The programme is about numbers and statistics and they set up a series of interviews. You can hear the programme here.

Almost at the last minute the programme-makers came up with the idea of a bet. The bet, for £100, was that, using the HadCrut3 data set, there would be no new record set by 2011. It was made between climatologist James Annan and myself. His work involves analysing climatic data and validating climate models. He accepted enthusiastically, as he has a penchant for taking on 'sceptics.' The presenter said that if the global temperature didn’t go up in the next few years, “there would be some explaining to do.”

Later today, January 13th, “More or Less” returns to the bet, which I am pleased to say I won, though I note that this bet, or its conclusion, is not yet mentioned on Annan’s Wikipedia entry despite his other climate bet being discussed.

Writing shortly after the wager was placed, James Annan said he believed it was a fairly safe bet, though not certain, as the trend since the current warming spell began, around 1980, was upward (those same trendlines again!). He drew a straight line from 1980 to 2007 and projected it forward, concluding that sometime over the next few years HadCrut3 would rise above its highest point, set in 1998 (a strong El Nino year).

The problem with this approach is that it destroys all information in the dataset save the gradient of the straight line. In climate terms 30 years is usually held to be the shortest period from which to deduce trends (though shorter periods are often used if the trend deduced is deemed acceptable), but that is not to say there is no important information on shorter timescales, such as volcanic depressions, El Nino rises and La Nina dips. Then there are the so-called decadal variations, which are poorly understood.

My view was that the information in the dataset was important, especially if projecting it forward just a few years when natural variations were clearly dominant. Looking at HadCrut3 it is clear that there isn’t much of an increase in the 1980s, more of an increase in the 1990s, then there is the big 1998 El Nino, followed by no increase in the past decade or so. It therefore seemed far more likely that the temperature would continue what it had been doing in the recent past than revert to an upward trend, in the next few years at least.

My approach was to listen to the data. The approach taken by James Annan was flawed because he didn’t. He imposed a straight line on the data due to theoretical considerations. I always wonder about the wisdom of the approach that uses straight lines in climatic data. Why should such a complex system follow a straight line? Indeed, the rise of HadCrut3 is not a straight line, but the past ten years is, and that in my view is very curious, and highly significant.

Why, I wonder, start the linear increase in 1980? Obviously the temperature starts rising then, but why not start the straight line in 1970? The answer is that the temperature is flat between 1970 and 1980. It seems illogical to take notice of flat data at the start of a dataset but totally ignore it at the end!
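The start-year sensitivity described above is easy to demonstrate. A minimal sketch (synthetic numbers chosen to mimic a flat-then-rising series; these are not HadCrut3 values): fitting an ordinary least-squares line from 1980 gives a noticeably steeper slope than fitting from 1970, even though the only difference is a flat decade at the start.

```python
import numpy as np

# Synthetic annual anomalies (hypothetical, NOT HadCrut3):
# flat through the 1970s, rising ~0.017 C/yr from 1980, plus noise.
rng = np.random.default_rng(0)
years = np.arange(1970, 2008)
signal = np.where(years < 1980, 0.0, 0.017 * (years - 1980))
anoms = signal + rng.normal(0.0, 0.02, size=years.size)

def ols_slope(x, y):
    """Ordinary least-squares slope (deg C per year)."""
    return np.polyfit(x, y, 1)[0]

slope_from_1970 = ols_slope(years, anoms)
mask = years >= 1980
slope_from_1980 = ols_slope(years[mask], anoms[mask])
print(f"from 1970: {slope_from_1970:.4f}  from 1980: {slope_from_1980:.4f}")
```

The same data yield a different "trend" depending on an arbitrary choice of start year, which is exactly the sense in which a single straight line destroys the information in the series.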

When a record is not a record

During the recent interview for “More or Less” James Annan said that had other temperature databases been used he would have won. This is a moot point that also strongly reaffirms my stance. In NasaGiss 2010 is the warmest year, with a temperature anomaly of 0.63 deg C, only one hundredth of a degree warmer than 2005, and within a whisker of 2007, 2006, 2002, 2001 and 1998. Given the 0.1 deg C errors even Nasa did not claim 2010 as a record. Technically speaking 2010 was slightly hotter because of a strong El Nino. Otherwise, NasaGiss shows hardly any increase in the past decade.

During the “More or Less” interview the question arose of extending the bet to “double or quits” for the next five years. I was game for it, with a proviso. Betting against a record over ten years carries a higher chance of a statistical fluke than betting over five. Because of this I would like to see two consecutive annual datapoints, each more than one sigma above the 2001-to-date mean level. After all, that is the minimum statistical evidence one should accept as an indication of warming. James Annan did not commit to such a bet during the programme.
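The proposed criterion (two consecutive annual values more than one sigma above the mean) can be written down directly. A sketch with hypothetical anomaly figures, illustrative only and not real HadCrut3 data:

```python
import numpy as np

def warming_signal(anoms):
    """Criterion as described in the text: True if two CONSECUTIVE
    annual anomalies each exceed the series mean by more than one
    standard deviation. Hypothetical implementation, for illustration."""
    anoms = np.asarray(anoms, dtype=float)
    mu, sigma = anoms.mean(), anoms.std(ddof=1)
    above = anoms > mu + sigma
    return any(above[i] and above[i + 1] for i in range(len(above) - 1))

# Hypothetical anomaly series (deg C), not real values:
flat = [0.40, 0.42, 0.39, 0.41, 0.40, 0.43, 0.41, 0.40]
jump = flat + [0.55, 0.56]          # two consecutive high years appended
print(warming_signal(flat), warming_signal(jump))
```

A single warm outlier does not trigger the criterion; only two high years in a row do, which is what makes it a stricter test than a one-off record.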

It just has to start getting warmer soon.

Back in 2007 many commentators, activists and scientists, such as Lynas, said the halt in global temperatures wasn’t real. It is interesting that the Climategate emails showed that the certainty some scientists expressed about this issue in public was not mirrored in private. Indeed, one intemperate activist, determined to shoot my New Statesman article down but unable to muster the simple statistics required to tackle the statistical properties of only 30 data points, asked the University of East Anglia’s Climatic Research Unit and the Met Office to provide reasons why I was wrong, which they couldn’t.

What was true in 2007 is even more so in 2012. Since 2007 the reality of the temperature standstill has been accepted, and many explanations have been offered for it, more than can possibly all be true! We have seen predictions that half of the years between 2009 and 2014 would be HadCrut3 records (a prediction that now cannot possibly come to pass), later modified to half of the years between 2010 and 2015 (likewise). The Met Office predicts that 2012-16 will be on average 0.54 deg C above the HadCrut3 baseline level, and 2017-21 some 0.76 deg C higher. Temperatures must go up, and quickly.

So how long must this standstill go on before bigger questions are asked about the rate of global warming? When asked if he would be worried by no increase over the next five years, James Annan would say only that it would indicate a lower rate of warming! Some say that 15 years is the period after which serious questions must be asked.

We are already there

In a now famous (though even at the time obvious) interview in 2010, Prof Phil Jones of the University of East Anglia confirmed that there had been no statistically significant warming since 1995. There was an upward trend, but it was statistically insignificant, which in scientific parlance equates to no trend at all. In 2011 Prof Jones told the BBC that, with the inclusion of the warmish 2010, there was now a statistically significant increase between 1995 and 2010. Since 2011 was cool, it doesn’t take complicated statistics to show that the post-1995 trend, by that method of calculation, is now back to insignificant, though I don’t expect the BBC to update its story.
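"Statistically significant" here refers to a standard test on the fitted slope: the trend is insignificant when its confidence interval includes zero. A minimal sketch of that calculation (synthetic numbers, not Jones's actual analysis, and ignoring the autocorrelation a real temperature series would require):

```python
import math
import numpy as np

def slope_with_ci(x, y):
    """OLS slope and its approximate 95% confidence half-width (t ~ 2).
    The trend is 'statistically insignificant' when the interval
    [slope - half, slope + half] includes zero. Illustrative only."""
    x, y = np.asarray(x, float), np.asarray(y, float)
    n = x.size
    b, a = np.polyfit(x, y, 1)                  # slope, intercept
    resid = y - (a + b * x)
    se = math.sqrt((resid @ resid) / (n - 2) / ((x - x.mean()) ** 2).sum())
    return b, 2.0 * se

# Hypothetical anomalies: a small trend buried in year-to-year noise.
years = np.arange(1995, 2011)
rng = np.random.default_rng(1)
anoms = 0.005 * (years - 1995) + rng.normal(0.0, 0.1, years.size)
slope, half = slope_with_ci(years, anoms)
print(f"slope = {slope:+.4f} +/- {half:.4f} deg C/yr")
```

With only 16 noisy points, the confidence half-width is typically larger than a slope of half a hundredth of a degree per year, which is why adding or dropping a single warm or cool year can flip such a short trend between "significant" and "insignificant."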

The lesson is that for the recent warming spell, the one that begins about 1980, the years of standstill now exceed those with a year-on-year increase. It is the standstill, not the increase, that is now this warm period’s defining characteristic.

The nature of the anthropogenic global warming signal is that, unlike natural fluctuations, it is always additive. Sooner or later, it is argued, it will emerge unambiguously, perhaps at different times in different parts of the world, but it must emerge. Some argue that by the time it does it will already be too late, but that is another debate.

James Annan is keen on a “money markets” approach to forecasting global warming, and bemoans the reticence of so-called climate sceptics to put their money where their mouths are! I hope that his early-stage financial loss won’t be too much of a setback, or a deterrent to potential investors; not that I will be among them.

Now that I am joining the ranks of those who have made money out of global warming (or rather the lack of it) I wonder where the smart money will be placed in the future.

SOURCE





Governance of the science

A speech to the House of Lords by Lord Turnbull, on 12th January 2012

The governing narrative for our climate change framework can be summarised as follows. Our planet is not just warming—this is not in dispute—but the rate of warming is projected to accelerate sharply: rather than the increase we have witnessed of less than 1 degree centigrade per century, by the end of this century the planet is projected to be around 3 degrees centigrade hotter, taking the centre of the range. Some time during this century we will pass a 2 degree centigrade threshold, which is portrayed as a tipping point beyond which serious harm to the planet will occur. The main driver of this is man-made CO2 and the principal response must be the almost complete decarbonisation of the economies of the industrial world less than 40 years from now.

This narrative is largely based on the work of the Intergovernmental Panel on Climate Change, so the competence and integrity of the IPCC are of huge importance if it is to drive the massive social and economic changes being advocated. The reliance that one can put on the report of the noble Lord, Lord Stern, is also at issue, since it adopted large parts of the IPCC framework.

Over the last two years, there have been three separate reports on the IPCC. They are: the report by the InterAcademy Council, a collective of the world’s leading scientific academies; the report written by Professor Ross McKitrick, a Canadian professor of economics who for a time served as an expert reviewer for the IPCC’s fourth assessment report; and a book, The Delinquent Teenager Who Was Mistaken for the World’s Top Climate Expert, written by Donna Laframboise, a Canadian journalist. Although they write from three different perspectives, in different styles, the message is the same: there are serious flaws in the competence, operations and governance of the IPCC.

The reality is a long way from the way that the IPCC describes itself. The IPCC claims that it employs the top scientists in the field; it uses only peer-reviewed material; its staff are independent and impartial; its operations are transparent; its procedures for review are rigorous and free of conflicts of interest; and its role is to present objective scientific advice to policymakers, not to advocate policy responses. None of these claims is true.

There are many instances where it has not employed the top practitioners in the field, and worse, many instances where it has employed researchers who have barely completed their PhDs—and in some cases not even that. There has been substantial use of “grey”—that is, non-peer-reviewed—literature. The IPCC has been extensively infiltrated by scientists from organisations like Greenpeace and WWF. There is no transparency about how its lead authors and reviewers are selected and what their expertise is. It has been obstructive to outsiders seeking information on data sets and working methods. It is resistant to input from those who do not share the house view. It was specifically criticised by the IAC for not giving sufficient weight to alternative views.

Its review procedures are flawed, allowing too much latitude to lead authors in choosing which of its reviewers’ comments to accept or reject. It has allowed lead authors to introduce new material after the review phase has been completed. Its policies on conflict of interest are inadequate. It blatantly adopts an advocacy role rather than confining itself to scientific advice. Its Summary for Policymakers is a serious misnomer. The scientists prepare a draft but this is redrafted in a conclave of representatives from the member Governments, mostly officials from environment departments fighting to get their Ministers’ views reflected. In short, it is a Summary by Policymakers not for Policymakers.

In a pamphlet I wrote last year for the Global Warming Policy Foundation, chaired by the noble Lord, Lord Lawson, I said:

“In my opinion, the IPCC and its current leadership no longer carry the credibility which politicians need if they are going to persuade their citizens to swallow some unpleasant medicine. It is therefore regrettable that the UK Government has taken no steps to find an alternative and more credible source of advice”.

I see no signs that serious reform of the IPCC is on the agenda for the fifth assessment. The IAC specifically recommended that the chair should serve only for one cycle. Meanwhile, Chairman Pachauri doggedly clings on.

In the field of governance, things are not a great deal better in the UK. We have seen a second instalment of the CRU “Climategate” e-mails, which tell us little new but confirm the culture of shiftiness, obstruction and the stifling of debate seen in the first instalment. We still hear from time to time the mantra of, “The science is settled, the debate is over” from politicians and even from some scientists.

Therefore, I was very heartened to hear Professor Brian Cox, the pin-up boy of British science, and his colleague Professor Jeff Forshaw on the “Today” programme recently. Professor Cox said:

“Science is an improvement in our understanding of nature ... There are no absolute truths in science. It’s the only human endeavour where that level of modesty applies”.

Professor Forshaw said:

“We are always trying to improve on the theories we have got ... And we always expect that they are going to be just temporary structures and that they are going to be replaced at some point”.

So let us have no more “the science is settled/the debate is over” nonsense, particularly in the field of climate science, which is so complex and so young.

My view on the Durban conference is that while many of the participants came away disappointed, it was a sensible conclusion—in the words of the noble Lord, Lord Prescott, to “stop the clock” on the emissions issue for a decade—while the science improves and the evidence accumulates, an approach I have heard suggested by the noble Lord, Lord Rees of Ludlow. However, there is good news to report. The Chancellor of the Exchequer has drawn the UK back from its extreme unilateralism, for which he should be congratulated rather than criticised.

Finally, I have a few personal observations. In my pamphlet I wrote that,

“if a technology exists only by virtue of subsidy we only impoverish ourselves by trying to build jobs on such shaky foundations”.

The debacle in the solar sector was, therefore, entirely predictable. My second observation is that if a debate with the same title as today’s had taken place 15 years ago when I became Permanent Secretary at the old Department of the Environment—where I had a very happy year working for the noble Lord, Lord Prescott—it would not have been so dominated by decarbonisation but would have been much more about those aspects of the environment people care deeply about: air and water quality, habitats, birds, forests and the countryside. How sad that the issues have been pushed so far down the agenda, accelerated by the misconceived transfer of climate change from Defra to DECC.

In 40 years engaged on public policy, I have come across a number of cases where there was a strong international consensus among political elites, but for which the intellectual underpinning proved to be weak, as those elites were slow to acknowledge. The first was the so-called Washington Consensus which came to be seen as promoting globalisation with the maximum liberalisation of trade and finance and the minimum of regulation, but it turned out to overestimate the efficiency of markets. I confess that I swallowed that one pretty much whole. The second is the euro, where the European political elite pressed on despite warnings about the internal contradictions of the project and even now, it has yet to acknowledge the full extent of the problem. I never bought into the euro from the start.

Climate change—or more accurately, the current decarbonisation project—is in my view the third. Originally I bought in to the IPCC narrative on the science and its impacts while remaining critical of the policy responses. However, the intellectual certainty is beginning to crumble. In the next 10 years I believe we will see the current narrative replaced by something more sophisticated—perhaps drawing extensively on the work of the noble Lord, Lord Hunt of Chesterton, who will speak shortly—more eclectic, less alarmist and, in Professor Cox’s words, more “modest” in its claims.

SOURCE




EPA regulation of fuel economy: Congressional intent or climate coup?

In May 2010, the Environmental Protection Agency (EPA) issued a rule setting standards for motor vehicle greenhouse gas emissions. By creating these standards, EPA is implicitly regulating fuel economy. Because the rule also obligates EPA to regulate greenhouse gases from stationary sources, the agency is now determining national policy on climate change. EPA has asserted that it is simply implementing the Clean Air Act. But the Clean Air Act was neither designed nor intended to regulate greenhouse gases, and it provides no authority to regulate fuel economy.

Last year, Congress declined to give EPA explicit authority to regulate greenhouse gases when Senate leaders abandoned cap-and-trade legislation. A key selling point for the Waxman-Markey cap-and-trade bill was that it would exempt greenhouse gases from regulation under several Clean Air Act programs. If instead of introducing a cap-and-trade bill, Reps. Waxman and Markey had introduced legislation authorizing EPA to do exactly what it is doing now—regulate greenhouse gases through the Clean Air Act as it sees fit—the bill would have been rejected. The notion that Congress gave EPA such authority in 1970, almost two decades before global warming emerged as a public concern, and five years before Congress enacted the first fuel economy statute, defies common sense.

SOURCE

***************************************

For more postings from me, see DISSECTING LEFTISM, TONGUE-TIED, EDUCATION WATCH INTERNATIONAL, POLITICAL CORRECTNESS WATCH, FOOD & HEALTH SKEPTIC, GUN WATCH, AUSTRALIAN POLITICS, IMMIGRATION WATCH INTERNATIONAL and EYE ON BRITAIN. My Home Pages are here or here or here. Email me (John Ray) here. For readers in China or for times when blogger.com is playing up, there are mirrors of this site here and here

*****************************************
