Thursday, March 06, 2008

Lunar Eclipse Prompts Climate Change Debate

Dust-free atmosphere may be responsible for up to a 0.36 F rise in global temps. One more thing that is not properly accounted for in the "models".

Anyone who saw the lunar eclipse last month likely noted that it was relatively bright, with the darkened moon illuminated by ghostly red light. Now that same light is leading some scientists to question recent climate change data, according to New Scientist.

A relatively bright eclipse means that the Earth's atmosphere is comparatively free of volcanic dust and that large amounts of sunlight are being refracted through it. That eclipse was rated a 3 on a scale of 0 to 4, meaning that it was very bright indeed. Nor is this the first time - for the last dozen years, eclipses have been relatively luminous, as a result of few dust-spewing eruptions.

Scientists at the University of Colorado, Boulder are drawing some controversial conclusions from those bright lunar surfaces, however. They say that the lack of observed dust in the atmosphere over the past 12 years could be responsible for as much as a 0.1 to 0.2 degree Celsius rise (about 0.18 to 0.36 degrees Fahrenheit) in the average temperature on Earth since the 1960s. That certainly wouldn't account for the entire range of observed temperature shifts (the average temperature in that time has risen by about 0.6 degrees Celsius, or 1.08 degrees Fahrenheit) - but if true, it could complicate global climate change analyses.
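For readers who want to check the conversion and the proportions quoted above, here is a minimal back-of-the-envelope sketch in Python. The numbers are simply the ones given in the article; the only added assumption is the standard rule that a temperature difference converts at 1 C = 1.8 F, with no 32-degree offset.

    # Back-of-the-envelope check of the figures quoted above.
    def c_delta_to_f(delta_c):
        """Convert a temperature change in degrees Celsius to degrees Fahrenheit."""
        return delta_c * 9.0 / 5.0

    dust_effect_c = (0.1, 0.2)   # claimed dust-related contribution since the 1960s
    total_warming_c = 0.6        # observed rise over the same period

    for delta in dust_effect_c:
        share = delta / total_warming_c
        print(f"{delta} C = {c_delta_to_f(delta):.2f} F "
              f"({share:.0%} of the observed {total_warming_c} C rise)")

In other words, the dust effect, if real, would amount to roughly a sixth to a third of the observed post-1960s warming.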

Other scientists say this idea doesn't hold water. New Scientist quotes one of the scientists on the UN-affiliated Intergovernmental Panel on Climate Change (IPCC) as saying that their recent report, which attributed virtually all recent warming to man-made greenhouse gases, had taken the issue of volcanic dust into account. In fact, the IPCC scientists said, haze levels have been slightly higher over the past 40 years compared to the previous 20 years, so the overall volcano-related temperature trend should have been towards cooling.

Source





IS THERE GLOBAL WARMING?

Not all Leftists are totally credulous. Below is an excerpt from a Leftist physicist. From what he sees, he concludes that global warming is a red herring to divert attention from real problems.

Before 'climate chaos' became a cliché, many scientists advanced evidence for measured increases in the global average Earth surface temperature in the post-industrial age. These reports, taken as a whole, were the main original catalysts towards constructing the global warming myth, so it is useful to critically examine their validity.

It was no easy task to arrive at the most cited original estimated rate of increase of the mean global surface temperature of 0.5 C in 100 years. As with any evaluation of a global spatio-temporal average, it involved elaborate and unreliable grid-size-dependent averages. In addition, it involved removal of outlying data, complex corrections for historical differences in measurement methods, measurement distributions, and measurement frequencies, and complex normalisations of different data sets - for example, land-based and sea-based measurements and the use of different temperature proxies that are in turn dependent on approximate calibration models. Even for modern thermometer readings in a given year, the very real problem of defining a robust and useful global spatio-temporal average Earth-surface temperature is not solved, and is itself an active area of research.

This means that determining an average of a quantity (Earth surface temperature) that is everywhere different and continuously changing with time at every point, using measurements at discrete times and places (weather stations), is virtually impossible: the resulting number is highly sensitive to the chosen extrapolation method(s) needed to calculate (or rather approximate) the average.
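To see concretely why the result is so method-dependent, consider the following toy illustration in Python. The station latitudes and readings below are invented, not real data; the point is only that the same handful of readings yields noticeably different "global means" depending on how the gaps between stations are weighted.

    import math

    # Toy illustration: the same hypothetical station anomalies give different
    # "global means" depending on how the space between stations is weighted.
    stations = [            # (latitude in degrees, temperature anomaly in C)
        (75.0, -0.8),
        (45.0,  0.4),
        (10.0,  0.9),
        (-30.0, 0.3),
        (-65.0, -0.5),
    ]

    # Method 1: naive arithmetic mean of the station readings.
    naive = sum(t for _, t in stations) / len(stations)

    # Method 2: weight each station by the cosine of its latitude, a crude
    # proxy for the surface area each station is taken to represent.
    weights = [math.cos(math.radians(lat)) for lat, _ in stations]
    weighted = sum(w * t for w, (_, t) in zip(weights, stations)) / sum(weights)

    print(f"naive mean of readings: {naive:+.2f} C")
    print(f"area-weighted mean:     {weighted:+.2f} C")

Real analyses use far more elaborate gridding and infilling schemes, but sensitivity to exactly such choices is the issue being raised here.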

Averaging problems aside, many tenuous approximations must be made in order to arrive at any of the reported final global average temperature curves. For example, air temperature thermometers on ocean-going ships have been positioned at increasing heights as the sizes of ships have increased in recent history. Since temperature decreases with increasing altitude, this altitude effect must be corrected. The estimates are uncertain and can change the calculated global warming by as much as 0.5 C, thereby removing the originally reported effect entirely.
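As an illustration of the kind of adjustment involved, here is a minimal Python sketch. It assumes the standard atmospheric lapse rate of roughly 6.5 C per kilometre; the deck heights are invented for illustration and are not taken from the text.

    # Adjust a reading taken at some deck height down to a common reference
    # height, assuming a lapse rate of about 6.5 C per kilometre.
    LAPSE_RATE_C_PER_M = 0.0065

    def height_correction(reading_c, sensor_height_m, reference_height_m=10.0):
        """Estimate the temperature at reference_height_m from a reading taken higher up."""
        return reading_c + LAPSE_RATE_C_PER_M * (sensor_height_m - reference_height_m)

    # Size of the correction applied to a nominal 15 C reading at various deck heights:
    for height_m in (10.0, 25.0, 40.0):
        corrected = height_correction(15.0, height_m)
        print(f"deck at {height_m:4.0f} m: corrected reading {corrected:.3f} C")

Adjustments of a tenth or two of a degree per reading are of the same order of magnitude as the trend being claimed, which is precisely the author's point.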

Similarly, surface ocean temperatures were first measured by drawing water up to the ship decks in cloth buckets and later in wooden buckets. Such buckets allow heat exchange in different amounts, thereby changing the measured temperature. This must be corrected by various estimates of sizes and types of buckets. These estimates are uncertain and can again change the resulting final calculated global warming value by an amount comparable to the 0.5 C value. There are a dozen or so similar corrections that must be applied, each one able to significantly alter the outcome.

In wanting to go further back in time, the technical problems are magnified. For example, when one uses a temperature proxy, such as the most popular tree ring proxy, instead of a physical thermometer, one has the significant problem of calibrating the proxy. With tree rings from a given preferred species of tree, there are all kinds of unavoidable artefacts related to wood density, wood water content, wood petrifaction processes, season duration effects, forest fire effects, extra-temperature biotic stress effects (such as recurring insect infestations), etc. Each proxy has its own calibration and preservation problems that are not fully understood.

The reported temperature curves should therefore be seen as tentative suggestions that the authors hope will catalyze more study and debate, not reliable results that one should use in guiding management practice or in deducing actual planetary trends. In addition, the original temperature or proxy data is usually not available to other research scientists who could critically examine the data treatment methods; nor are the data treatment methods spelled out in enough detail. Instead, the same massaged data is reproduced from report to report rather than re-examined.

The most recent thermometer measurements have their own special problems, not the least of which is urban warming, due to urban sprawl, which locally affects weather station mean temperatures and wind patterns: Temperatures locally change because local surroundings change. Most weather monitoring stations are located, for example, near airports which, in turn, are near expanding cities.

As a general rule in science, if an effect is barely detectable, requires dubious data treatment methods, and is sensitive to those data treatment methods and to other approximations, then it is not worth arguing over or interpreting and should not be used in further deductions or extrapolations. The same is true in attempting to establish causal relationships.

Source






Weather Channel Founder Blasts Network; Claims It Is 'Telling Us What to Think'

TWC founder and global warming skeptic advocates suing Al Gore to expose 'the fraud of global warming.'

The Weather Channel has lost its way, according to John Coleman, who founded the channel in 1982. Coleman told an audience at the 2008 International Conference on Climate Change on March 3 in New York that he is highly critical of global warming alarmism. "The Weather Channel had great promise, and that's all gone now because they've made every mistake in the book on what they've done and how they've done it and it's very sad," Coleman said. "It's now for sale, and a new owner of The Weather Channel will be announced - several billion dollars having changed hands - in the near future. Let's hope the new owners can recapture the vision and stop reporting the traffic, telling us what to think, and start giving us useful weather information."

The Weather Channel has been an outlet for global warming alarmism. In December 2006, The Weather Channel's Heidi Cullen argued on her blog that weathercasters who had doubts about human influence on global warming should be punished with decertification by the American Meteorological Society.

Coleman also told the audience his strategy for exposing what he called "the fraud of global warming." He advocated suing those who sell carbon credits, which would force global warming alarmists to give a more honest account of the policies they propose.

"[I] have a feeling this is the opening," Coleman said. "If the lawyers will take the case - sue the people who sell carbon credits. That includes Al Gore. That lawsuit would get so much publicity, so much media attention. And as the experts went to the media stand to testify, I feel like that could become the vehicle to finally put some light on the fraud of global warming."

Earlier at the conference Lord Christopher Monckton, a policy adviser to former Prime Minister Margaret Thatcher, told an audience that the science will eventually prevail and the "scare" of global warming will go away. He also said the courts were a good avenue to show the science.

Source





The World Has Plenty of Oil

Many energy analysts view the ongoing waltz of crude prices with the mystical $100 mark -- notwithstanding the dollar's anemia -- as another sign of the beginning of the end for the oil era. "[A]t the furthest out, it will be a crisis in 2008 to 2012," declares Matthew Simmons, the most vocal voice among the "neo-peak-oil" club. Tempering this pessimism only slightly is the viewpoint gaining ground among many industry leaders, who argue that daily production of 100 million barrels by 2030 will be difficult. In fact, we are nowhere close to reaching a peak in global oil supplies.

Given a set of assumptions, forecasting the peak-oil point -- defined as the onset of global production decline -- is a relatively trivial problem. Four primary factors will pinpoint its exact timing. The trivial becomes far more complex because the four factors -- resources in place (how many barrels initially underground), recovery efficiency (what percentage is ultimately recoverable), rate of consumption, and state of depletion at peak (how empty is the global tank when decline kicks in) -- are inherently uncertain.

- What are the global resources in place? Estimates vary. But approximately six to eight trillion barrels each for conventional and unconventional oil resources (shale oil, tar sands, extra heavy oil) represent probable figures -- inclusive of future discoveries. As a matter of context, the globe has consumed only about one trillion barrels out of a grand total of 12 to 16 trillion underground.

- What percentage of global resources is ultimately recoverable? The industry recovers an average of only one out of three barrels of conventional resources underground and considerably less for the unconventional.

This benchmark, established over the past century, is poised to change upward. Modern science and unfolding technologies will, in all likelihood, double recovery efficiencies. Even a 10% gain in extraction efficiency on a global scale will unlock 1.2 to 1.6 trillion barrels of extra resources -- an additional 50-year supply at current consumption rates.
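The 50-year figure can be checked with simple arithmetic. The Python sketch below just multiplies out the numbers given above - 12 to 16 trillion barrels in place, a 10% gain in recovery efficiency, and current consumption of about 86 million barrels per day (the figure cited later in the piece):

    # Rough check: what does a 10% gain in recovery efficiency, applied to
    # 12-16 trillion barrels in place, buy at current consumption rates?
    resources_in_place = (12e12, 16e12)   # barrels underground, low and high
    daily_consumption = 86e6              # barrels per day, roughly current demand
    annual_consumption = daily_consumption * 365.0

    for total in resources_in_place:
        extra_barrels = 0.10 * total
        years_of_supply = extra_barrels / annual_consumption
        print(f"{total / 1e12:.0f} trillion barrels in place -> "
              f"{extra_barrels / 1e12:.1f} trillion extra barrels, "
              f"about {years_of_supply:.0f} years at current consumption")

The extra barrels work out to roughly 40 to 50 years of supply at today's rate, broadly in line with the author's round 50-year figure.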

The impact of modern oil extraction techniques is already evident across the globe. Abqaiq and Ghawar, two of the flagship oil fields of Saudi Arabia, are well on their way to recover at least two out of three barrels underground -- in the process raising recovery expectations for the remainder of the Kingdom's oil assets, which account for one quarter of world reserves.

Are the lessons and successes of Ghawar transferable to the countless struggling fields around the world -- most conspicuously in Venezuela, Mexico, Iran or the former Soviet Union -- where irreversible declines in production are mistakenly accepted as the norm and in fact fuel the "neo-peak-oil" alarmism? The answer is a definitive yes.

Hundred-dollar oil will provide a clear incentive for reinvigorating fields and unlocking extra barrels through the use of new technologies. The consequences for emerging oil-rich regions such as Iraq can be far more rewarding. By 2040 the country's production and reserves might potentially rival those of Saudi Arabia.

Paradoxically, high crude prices may temporarily mask the inefficiencies of others, which may still remain profitable despite continuing to use 1960-vintage production methods. But modernism will inevitably prevail: The national oil companies that hold over 90% of the earth's conventional oil endowment will be pressed to adopt new and better technologies.

- What will be the average rate of crude consumption between now and peak oil? Current daily global consumption stands around 86 million barrels, with projected annual increases ranging from 0% to 2% depending on various economic outlooks. Thus an average daily consumption of 90 to 110 million barrels represents a reasonable bracket (a short compounding sketch follows below). Any economic slowdown -- as intimated by the recent tremors in the global equity markets -- will favor the lower end of this spectrum.

This is not to suggest that global supply capacity will grow steadily unimpeded by bottlenecks -- manpower, access, resource nationalism, legacy issues, logistical constraints, etc. -- within the energy equation. However, near-term obstacles do not determine the global supply ceiling at 2030 or 2050. Market forces, given the benefit of time and the burgeoning mobility of technology and innovation across borders, will tame transitional obstacles.
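To see where the 0% to 2% growth assumptions lead, here is the short compounding sketch promised above, in Python. The horizons of 10 and 20 years are arbitrary, chosen only to illustrate how a bracket of roughly 90 to 110 million barrels per day arises:

    # Compound current daily consumption forward at 0%, 1% and 2% per year.
    current_mbd = 86.0   # million barrels per day, roughly the 2008 level

    for years_ahead in (10, 20):
        for growth in (0.00, 0.01, 0.02):
            projected = current_mbd * (1.0 + growth) ** years_ahead
            print(f"{years_ahead} years at {growth:.0%} annual growth: "
                  f"about {projected:.0f} million bbl/day")

Most of these trajectories land in or near the quoted bracket; only sustained 2% growth over two decades pushes well above it, which is why economic slowdowns favor the lower end.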

- When will peak oil arrive? This widely accepted tipping point -- 50% of ultimately recoverable resources consumed -- is largely a tribute to King Hubbert, a distinguished Shell geologist who predicted the peak-oil point for the U.S. lower 48 states. While his timing was very good (he forecast 1968; the actual peak came in 1970), he underestimated peak daily production (9.5 million barrels actual versus eight million estimated).

But modern extraction methods will undoubtedly stretch Hubbert's "50% assumption," which was based on Sputnik-era technologies. Even a modest shift -- to 55% of recoverable resources consumed -- will delay the onset by 20-25 years. Where do reasonable assumptions surrounding peak oil lead us? My view, subjective and imprecise, points to a period between 2045 and 2067 as the most likely outcome.
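For readers unfamiliar with the Hubbert framework invoked here, the sketch below implements a minimal logistic depletion model in Python. The parameters (ultimately recoverable resources, curve steepness, midpoint year) are illustrative stand-ins rather than fitted values; the point is simply that, under the logistic assumption, the production rate peaks in the year cumulative production passes 50% of ultimately recoverable resources:

    import math

    # Minimal Hubbert-style logistic depletion model. Parameters are
    # illustrative only, loosely evoking the U.S. lower-48 case.
    URR = 200.0     # ultimately recoverable resources, billion barrels
    k = 0.06        # steepness of the logistic curve, per year
    t_mid = 1970    # year at which cumulative production reaches URR / 2

    def cumulative(t):
        """Cumulative production (billion barrels) under the logistic curve."""
        return URR / (1.0 + math.exp(-k * (t - t_mid)))

    def production(t):
        """Annual production rate: the derivative of the cumulative curve."""
        q = cumulative(t)
        return k * q * (1.0 - q / URR)

    peak_year = max(range(1900, 2051), key=production)
    print(f"production peaks in {peak_year}, with "
          f"{cumulative(peak_year) / URR:.0%} of URR already produced")

Shifting the assumed tipping point from 50% to 55% of recoverable resources, as suggested above, simply moves that crossover to a later year; how much later depends on the size of the recoverable base and the rate at which it is drawn down.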

Cambridge Energy Research Associates forecasts global daily liquids production to rise to 115 million barrels by 2017 versus 86 million at present. Instead of a sharp peak per Hubbert's model, an undulating, multi-decade-long plateau production era sets in -- i.e., no sudden-death ending.

The world is not running out of oil anytime soon. A gradual transitioning on the global scale away from a fossil-based energy system may in fact happen during the 21st century. The root causes, however, will most likely have less to do with lack of supplies and far more with superior alternatives. The overused observation that "the Stone Age did not end due to a lack of stones" may in fact find its match.

The solutions to global energy needs require an intelligent integration of environmental, geopolitical and technical perspectives each with its own subsets of complexity. On one of these -- the oil supply component -- the news is positive. Sufficient liquid crude supplies do exist to sustain production rates at or near 100 million barrels per day almost to the end of this century.

Technology matters. The benefits of scientific advancement observable in the production of better mobile phones, TVs and life-extending pharmaceuticals will not, somehow, bypass the extraction of usable oil resources. To argue otherwise distracts from a focused debate on what the correct energy-policy priorities should be, both for the United States and the world community at large.

Source




Cool And The Gang

Funny thing about ice: It melts in summer and thickens in winter. And according to Gilles Langis, a senior forecaster with the Canadian Ice Service in Ottawa, this Arctic winter has been so severe that the continent's allegedly vanishing ice is 10 to 20 centimeters thicker than it was at this time a year ago.

Polar bear Knut, photographed snapping at a child at the Berlin Zoo, is no longer so cuddly. But at least the ice floes that supposedly have been stranding his fellow bears in the Arctic are thickening up this winter.

Recent satellite images, moreover, show the polar ice cap is at near-normal coverage levels, according to Josefino Comiso, a senior research scientist with the Cryospheric Sciences Branch of NASA's Goddard Space Flight Center.

This winter has been particularly severe. The U.S. National Climatic Data Center (NCDC) reports that many American cities and towns suffered record cold temperatures in January and early February. The average temperature in January "was 0.3°F cooler than the 1901-2000 average," the NCDC says.

Ontario and Quebec have experienced major snow and ice storms. In the first two weeks of February, as Canada's National Post reports, Toronto got 79 centimeters of snow, "smashing the record of 66.6 cm for the entire month set back in the pre-SUV, pre-Kyoto, pre-carbon footprint days of 1950."

This is a consequence of what we recently commented on: The sun, the greatest influence on earth's climate, seems to be entering an unusually quiet cycle of limited sunspot activity. As Kenneth Tapping of Canada's National Research Council warns, we may be in for severely cold weather if sunspot activity doesn't pick up.

Tapping oversees the operation of a 60-year-old radio telescope that he calls a "stethoscope for the sun." The last time the sun was this quiet, Earth suffered the Little Ice Age, which lasted five centuries and ended in 1850. The famous winter at Valley Forge fell within that period.

It's a good time, therefore, for some of the best climate scientists in the world to be gathering in New York City - setting for the Al Gore-promoted doomsday flick "The Day After Tomorrow" - for the 2008 International Conference on Climate Change hosted by the Heartland Institute.

More than 550 climate scientists, economists and public policy experts are at the March 2-4 event, their very presence shattering Gore's myth of a warming "consensus" and of a debate that is over. Yet because of the media's embrace of Gore's crusade, this may be one of the few places you read about the conference.

The keynoter, Dr. Patrick Michaels of the Cato Institute and the University of Virginia, debunked claims of "unprecedented" melting of Arctic ice. He showed how Arctic temperatures were warmer during the 1930s and that the vast majority of Antarctica is cooling.

President Vaclav Klaus of the Czech Republic is scheduled to speak Tuesday. Klaus, who knows what it is to live under a mindless tyranny, thinks he knows the motives of warm-mongers like Gore. He sees an eerie similarity between communism and what he calls the global warming "religion." In the June 14 Financial Times he wrote: "As someone who lived under communism for most of his life, I feel obliged to say that I see the biggest threat to freedom, democracy, the market economy and prosperity now in ambitious environmentalism, not communism. This ideology wants to replace the free and spontaneous evolution of mankind by a sort of central (now global) planning." If Marx and Lenin were alive today, they'd be environmentalists.

Source

***************************************

For more postings from me, see TONGUE-TIED, EDUCATION WATCH INTERNATIONAL, POLITICAL CORRECTNESS WATCH, FOOD & HEALTH SKEPTIC, GUN WATCH, SOCIALIZED MEDICINE, AUSTRALIAN POLITICS, DISSECTING LEFTISM, IMMIGRATION WATCH INTERNATIONAL and EYE ON BRITAIN. My Home Pages are here or here or here. Email me (John Ray) here. For times when blogger.com is playing up, there are mirrors of this site here and here.

*****************************************
