Monday, August 29, 2016



All Natural… Four New Scientific Publications Show No Detectable Sea Level Rise Effect of CO2

It is widely assumed that sea levels have been rising in recent decades largely in response to anthropogenic global warming. However, due to the inherently large contribution of natural oscillatory influences on sea level fluctuations, this assumption lacks substantiation. Instead, natural factors or internal variability override the detection of an anthropogenic signal and may instead largely explain the patterns in sea level rise in large regions of the global oceans.

Scientists who have recently attempted to detect an anthropogenic signal in regional sea level rise trends have had to admit that there is “no observable sea-level effect of anthropogenic global warming,” or that the “sea level rise pattern does not correspond to externally forced anthropogenic sea level signal,” and that sea level “trends are still within the range of long-term internal decadal variability.”

Below are highlighted summaries from four peer-reviewed scientific papers published within the last few months.

1. Hansen et al., 2016

For the convenience of the readers, our basic results are shown in Figure 1. We identified five individual oscillations (upper panel), including a sea-level amplitude of 70 mm (top–bottom [t-b]) of the 18.6-year oscillation caused by the lunar nodal oscillation (LNO) … Together with a general sea-level rise of 1.18 mm/y, the sum of these five sea-level oscillations constitutes a reconstructed or theoretical sea-level curve of the eastern North Sea to the central Baltic Sea (Figure 1, lower panel), which correlates very well with the observed sea-level changes of the 160-year period (1849–2009), from which 26 long tide gauge time series are available from the eastern North Sea to the central Baltic Sea.  Such identification of oscillators and general trends over 160 years would be of great importance for distinguishing long-term, natural developments from possible, more recent anthropogenic sea-level changes. However, we found that a possible candidate for such anthropogenic development, i.e. the large sea-level rise after 1970, is completely contained by the found small residuals, long-term oscillators, and general trend. Thus, we found that there is (yet) no observable sea-level effect of anthropogenic global warming in the world’s best recorded region.

2. Palanisamy, 2016

Building up on the relationship between thermocline and sea level in the tropical region, we show that most of the observed sea level spatial trend pattern in the tropical Pacific can be explained by the wind driven vertical thermocline movement. By performing detection and attribution study on sea level spatial trend patterns in the tropical Pacific and attempting to eliminate signal corresponding to the main internal climate mode, we further show that the remaining residual sea level trend pattern does not correspond to externally forced anthropogenic sea level signal. In addition, we also suggest that satellite altimetry measurement may not still be accurate enough to detect the anthropogenic signal in the 20-year tropical Pacific sea level trends.

3. Hadi Bordbar et al., 2016

The tropical Pacific has featured some remarkable trends during the recent decades such as an unprecedented strengthening of the Trade Winds, a strong cooling of sea surface temperatures (SST) in the eastern and central part, thereby slowing global warming and strengthening the zonal SST gradient, and highly asymmetric sea level trends with an accelerated rise relative to the global average in the western and a drop in the eastern part. These trends have been linked to an anomalously strong Pacific Walker Circulation, the major zonal atmospheric overturning cell in the tropical Pacific sector, but the origin of the strengthening is controversial. Here we address the question as to whether the recent decadal trends in the tropical Pacific atmosphere-ocean system are within the range of internal variability, as simulated in long unforced integrations of global climate models. We show that the recent trends are still within the range of long-term internal decadal variability.

4. Dangendorf et al., 2016

The observed 20th century sea level rise represents one of the major consequences of anthropogenic climate change. However, superimposed on any anthropogenic trend there are also considerable decadal to centennial signals linked to intrinsic natural variability in the climate system. … Gravitational effects and ocean dynamics further lead to regionally varying imprints of low frequency variability. In the Arctic, for instance, the causal uncertainties are even up to 8 times larger than previously thought. This result is consistent with recent findings that beside the anthropogenic signature, a non-negligible fraction of the observed 20th century sea level rise still represents a response to pre-industrial natural climate variations such as the Little Ice Age.

SOURCE





Don’t bee-lieve the latest bee-pocalypse scare

Now wild bee junk science and scare stories drive demands for anti-pesticide regulations

Paul Driessen

As stubborn facts ruin their narrative that neonicotinoid pesticides are causing a honeybee-pocalypse, environmental pressure groups are shifting to new scares to justify their demands for “neonic” bans.

Honeybee populations and colony numbers in the United States, Canada, Europe, Australia and elsewhere are growing. It is also becoming increasingly clear that the actual cause of bee die-offs and “colony collapse disorders” is not neonics, but a toxic mix of predatory mites, stomach fungi, other microscopic pests, and assorted chemicals employed by beekeepers trying to control the beehive infestations.

Naturally, anti-pesticide activists have seized on a recent study purporting to show that wild bee deaths in Britain have been correlated with neonic use in oil seed rape fields (canola is a type of OSR). In a saga that has become all too common in the environmental arena, their claims were amplified by news media outlets that share many activist beliefs and biases – and want to sell more subscriptions and advertising.

(Honeybees represent a small number of species that humans have domesticated and keep in hives, to produce honey and pollinate crops. Many are repeatedly trucked long distances, to pollinate almond and other crops as they flower. By contrast, thousands of species of native or wild bees also flourish across the continents, pollinating plants with no human assistance.)

The recent Center for Ecology and Hydrology study examined wild bee population trends over an 18-year period that ended in 2011. It concluded that there was a strong correlation between population and distribution numbers for multiple species of British wild bees and what study authors called their “measure of neonic dose” resulting from the pesticide, which is used as a seed coating for canola crops.

The study is deeply flawed, at every stage – making its analysis and conclusions meaningless. For example, bee data were collected by amateur volunteers, few of whom were likely able to distinguish among some 250 species of UK wild bees. But if even one bee of any species was identified in a 1-by-1 kilometer area during at least two of the study period’s 18 years, the area was included in the CEH study.

This patchy, inconsistent approach means the database that formed the very foundation for the entire study was neither systematic nor reliable, nor scientific. Some species may have dwindled or disappeared in certain areas due to natural causes, or volunteers may simply have missed them. We can never know.

There is no evidence that the CEH authors ever actually measured neonic levels on bees or in pollen collected from OSR fields that the British wild bees could theoretically have visited. Equally relevant, by the time neonics on seeds are absorbed into growing plant tissue, and finally expressed on flecks of pollen, the levels are extremely low: 1.3–3.0 parts per billion, the equivalent of 1–3 seconds in 33 years.
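
As a quick sanity check on that analogy (back-of-envelope arithmetic, not a figure from the study): one part per billion of 33 years works out to about one second, so 1.3–3.0 ppb corresponds to roughly 1–3 seconds:

```python
# Rough check of the ppb analogy (illustrative arithmetic only):
seconds_in_33_years = 33 * 365.25 * 24 * 3600   # ~1.04e9 seconds
for ppb in (1.3, 3.0):
    seconds = ppb * 1e-9 * seconds_in_33_years
    print(f"{ppb} ppb of 33 years = {seconds:.1f} seconds")
# -> 1.4 and 3.1 seconds: roughly "1-3 seconds in 33 years"
```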

(Coating seeds ensures that pesticides are incorporated directly into plant tissue – and target only harmful pests that feed on the crops. It reduces or eliminates the need to spray crops, which can kill birds, bats and beneficial insects that are in the fields or impacted by accidental “over-sprays.” Indeed, numerous field studies on two continents have found no adverse effects from neonics on honeybees at the hive level.)

A preliminary U.S. Environmental Protection Agency risk assessment for one common neonic sets the safe level for residues on pollen at 25 ppb. Any observable effects on honeybee colonies are unlikely below that. Perhaps wild bees are more susceptible. However, at least two wild bee species (alfalfa leaf cutters and miner bees) are thriving in areas where OSR/canola fields are widespread, and the CEH study found reduced numbers of certain wild bees that do not collect pollen from oil seed rape.

Perhaps most important, the CEH authors appear to have assumed that any declines in wild bee numbers were due to neonicotinoid pesticides in OSR fields, even at very low doses. They discounted or ignored other factors, such as bee diseases, weather and land use changes.

For instance, scientists now know that parasitic Varroa destructor mites and phorid flies severely affect honeybees; so do the Nosema ceranae gut fungus, tobacco ringspot virus and deformed wing virus. Under certain circumstances, those diseases are known to spread to bumblebees and other wild bees.

Significant land development and habitat losses occurred in many parts of Britain from 1930 to 1990, causing wild bee populations to decline dramatically. Thankfully, they have since rebounded – during the same period that neonic use was rising rapidly, replacing older insecticides that clearly are toxic to bees! The CEH team also failed to address those facts.

To compensate for these shortcomings (or perhaps to mask them), the CEH researchers created a sophisticated computer model that supposedly describes and explains the 18 years of wild bee data.

However, as any statistician or modeler knows, models and output are only as good as the assumptions behind them and data fed into them. Garbage in/Garbage out (GIGO) remains the fundamental rule. Greater sophistication simply means more refined refuse, and faster computers simply generate faulty, misleading results more rapidly. They also enable emotional fear-mongering to trump real science.

The CEH models are essentially “black boxes.” Key components of their analytical methodologies and algorithms have not been made public and thus cannot be verified by independent reviewers.

However, the flawed data gathering, unjustified assumptions about neonic impacts, and failure to consider the likely effects of multiple bee diseases and parasites make it clear that the CEH model and conclusions are essentially worthless – and should not be used to drive or justify pesticide policies and regulations.

As Prime Minister Jim Hacker quipped in the theatrical version of the British comedy series Yes, Prime Minister: “Computer models are no different from fashion models. They’re seductive, unreliable, easily corrupted, and they lead sensible people to make fools of themselves.”

And yet studies like this constantly make headlines. That’s hardly surprising. Anti-pesticide campaigners have enormous funding and marvelous PR instincts. Researchers know their influence and next grant can depend on issuing studies that garner alarmist headlines and reflect prevailing news themes and imminent government actions. The news media want to sell ads and papers, and help drive public policy-making.

The bottom line is fundamental: correlation does not equal causation. Traffic lights are present at many intersections where accidents occur; but that does not mean the lights caused most or all of the accidents. The CEH authors simply do not demonstrate that a neonic-wild bee cause-effect relationship exists.

The price to society includes not just the countless dollars invested in useless research, but tens of billions in costs inflicted by laws and regulations based on or justified by that research. Above all, it can lead to “cures” that are worse than the alleged diseases: in this case, neonic bans would cause major crop losses and force growers to resort to older pesticides that clearly are harmful to bees.

There is yet another reason why anti-pesticide forces are focusing now on wild bees. In sharp contrast to the situation with honeybees, where we have extensive data and centuries of beekeeper experience, we know very little about the thousands of wild bee species: where they live and forage, what risks they face, even how many there really are. That makes them a perfect poster child for anti-neonic activists.

They can present all kinds of apocalyptic scenarios, knowing that even far-fetched claims cannot be easily disproven, certainly not in time to calm public unease while a regulatory proposal is under discussion.

The Center for Ecology and Hydrology study involved seriously defective data gathering and analytical methodologies. More troubling, it appears to have been released in a time and manner calculated to influence a European Union decision on whether to continue or rescind a ban on neonicotinoid pesticides.

Sloppy or junk science is bad enough in and of itself. To use it deliberately, to pressure lawmakers or regulators to issue cures that may be worse than alleged diseases, is an intolerable travesty.

Via email





An update on Germany's "Energiewende"

Germany is still pursuing its goal of shutting down its nuclear plants but refuses to shut down its lignite plants. It is slashing renewable energy subsidies and replacing them with an auction/quota system. Public opposition is delaying the construction of the power lines that are needed to distribute Germany’s renewables generation efficiently. Renewables investment has fallen to levels insufficient to build enough new capacity to meet Germany’s 2020 emissions reduction target. There is  no evidence that renewables are having a detectable impact on Germany’s emissions, which have not decreased since 2009 despite a doubling of renewables penetration in the electricity sector. It now seems certain that Germany will miss its 2020 emissions reduction target, quite possibly by a wide margin. In short, the Energiewende is starting to unravel.

This post discusses the Energiewende’s main problems under five subheadings, starting with arguably the most problematic:

Germany’s emissions are not decreasing:

Electricity sector emissions decreased between 1990, the baseline year, and 1999 but have remained essentially flat since then. Emissions from other sectors decreased between 1990 and 2009 but have also flattened out since then. As a result, Germany’s emissions are about the same now as they were in 2009. The increase in renewables generation over this period has clearly not had the desired effect.

The electricity sector presently contributes only about 45% of Germany’s total emissions. Full decarbonization of the electricity sector, which is already about 45% decarbonized if we count nuclear, would therefore in theory reduce total emissions by only another 25% or so. Yet Germany’s efforts to cut emissions continue to concentrate on the electricity sector.
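
The arithmetic behind that "another 25% or so" is simple; a minimal sketch using the article's round figures (illustrative, not official statistics):

```python
# Back-of-envelope using the article's figures (not official statistics):
electricity_share = 0.45     # electricity sector's share of total emissions
already_decarbonized = 0.45  # share of electricity already carbon-free (incl. nuclear)

# Fully decarbonizing the remaining fossil-fired 55% of electricity
# would cut total emissions by at most:
further_cut = electricity_share * (1 - already_decarbonized)
print(f"Maximum further cut: ~{further_cut:.0%}")  # ~25% of total emissions
```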

The chances that Germany will meet its 2020 and 2030 emissions reduction targets do not look good.

Renewables have not reduced emissions

Since 1990 renewable energy generation has grown by a factor of over ten to the point where it now supplies 30% of Germany’s electricity. One would think that this would have had a visible impact on Germany’s electricity sector emissions, but as shown in Figure 3 it’s difficult to detect any impact at all. Despite the 20% absolute increase in renewables penetration between 1999 and 2014, electricity sector emissions have barely changed over this period, and had it not been for the 2008/9 recession they would probably have increased:

The reason renewables have had no detectable impact is that the added generation has gone towards filling increased demand and replacing nuclear generation rather than generation from gas, coal and lignite, which remains about the same as it was in 1990.

Finally, Germany will discontinue direct renewable subsidies for new projects at the beginning of 2017. It will be interesting to see what happens to retail electricity rates as a result.

Summary:

Germany is a country of contradictions, at least as far as energy is concerned. Germans are in favor of more renewable energy yet oppose building the overhead power lines that are needed to distribute it. They are in favor of deep emissions cuts but also in favor of shutting down Germany’s nuclear plants, which will make the problem of meeting emissions targets far more difficult and costly. The government continues to pursue a nuclear shutdown but is unwilling to shut down Germany’s lignite plants. As a result of these conflicting and counterproductive viewpoints and policies the Energiewende has effectively gone nowhere. Despite the expenditure of many billions of dollars it has failed to achieve any visible reduction in Germany’s emissions or to make a meaningful difference to Germany’s energy mix (renewables still supply only 14% of Germany’s total energy). Its only demonstrable impact has been skyrocketing electricity bills.

And now Germany is discontinuing the direct renewables subsidies that have driven the Energiewende since its adoption in 2000. It might be premature to declare the Energiewende a failure, but things are certainly headed in that direction.

Much more HERE  (See the original for links, graphics etc.)




The Troubling Science

Michael Hart is a Canadian academic with an impressive list of credentials. He has just put out a book – Hubris: The Troubling Science, Economics, and Politics of Climate Change.

This article covers many of the topics that have been raised here at Blackjay over the last couple of years. It is a must-read for anyone with lingering doubts about the supposed urgent need for action on climate change.

For example: Alarm over a changing climate leading to malign results is in many ways the product of the hunger for stability and direction in a post-Christian world. Humans have a deep, innate need for a transcendent authority. Having rejected the precepts of Christianity, people in the advanced economies of the West are turning to other forms of authority. Putting aside those who cynically exploit the issue for their own gain – from scientists and politicians to UN leaders and green businesses – most activists are deeply committed to a secular, statist, anti-human, earth-centric set of beliefs which drives their claims of a planet in imminent danger from human activity.

To them, a planet with fewer people is the ultimate goal, achievable only through centralized direction and control. As philosopher of science Jeffrey Foss points out, “Environmental science conceives and expresses humankind’s relationship to nature in a manner that is – as a matter of observable fact – religious.” It “prophesies an environmental apocalypse. It tells us that the reason we confront apocalypse is our own environmental sinfulness. Our sin is one of impurity. We have fouled a pure, ‘pristine’ nature with our dirty household and industrial wastes. The apocalypse will take the form of an environmental backlash, a payback for our sins. … environmental scientists tell people what they must do to be blameless before nature.”

The interview concludes: it will take a determined effort by people of faith and conscience to convince our political leaders that they have been gulled by a political movement exploiting fear of climate change to push a utopian, humanist agenda that most people would find abhorrent. As it now stands, politicians are throwing money that they do not have at a problem that does not exist in order to finance solutions that make no difference. The time has come to call a halt to this nonsense and focus on real issues that pose real dangers. In a world beset by war, terrorism, and continuing third-world poverty, there are far more important things on which political leaders need to focus.

It may be nitpicking but the one thing I disagree with is his use of the term “humanist” in the final paragraph. Humanism is a philosophical and ethical stance that emphasizes the value and agency of human beings, individually and collectively, and generally prefers critical thinking and evidence over acceptance of dogma or superstition. The utopian agenda is certainly not humanist. Any philosophy in which wilderness has greater value than community, in which humans are seen as a “scourge on the planet” a la Attenborough and which supports the dogma and pseudo-science of climate change is certainly not humanist.

But I agree with him about the rest of it.

SOURCE






The Global Effects of Global Warming on Human Mortality


Paper Reviewed: Guo, Y., Gasparrini, A., Armstrong, B., Li, S., Tawatsupa, B., Tobias, A., Lavigne, E., Coelho, M. de S.Z.S.C., Leone, M., Pan, X., Tong, S., Tian, L., Kim, H., Hashizume, M., Honda, Y., Guo, Y.-L.L., Wu, C.-F., Punnasiri, K., Yi, S.-M., Michelozzi, P., Saldiva, P.H.N. and Williams, G. 2014. Global variation in the effects of ambient temperature on mortality. Epidemiology 25: 781-789.

In a study they designed to determine the effects of daily high and low temperatures on human mortality, Guo et al. (2014) obtained daily temperature and mortality data from 306 communities located in 12 different countries (Australia, Brazil, Thailand, China, Taiwan, Korea, Japan, Italy, Spain, the United Kingdom, the United States and Canada) that fell somewhere within the time span of 1972-2011. And what did they learn from this monumental endeavor?

The 22 researchers, hailing from numerous places throughout the world, report that "to obtain an easily interpretable estimate of the effects of cold and hot temperatures on mortality," they "calculated the overall cumulative relative risks of death associated with cold temperatures (1st percentile) and with hot temperatures (99th percentile), both relative to the minimum-mortality temperature [75th percentile]" (see figure below). And despite the "widely ranging climates" they encountered, they report that "the minimum-mortality temperatures were close to the 75th percentile of temperature in all 12 countries, suggesting that people have adapted to some extent to their local climates."
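
The percentile bookkeeping described above is easy to illustrate. Below is a toy sketch of how the three reference temperatures are located (synthetic data; the actual study estimated relative risks with far more sophisticated time-series models):

```python
import numpy as np

# Toy illustration of the percentile framing (synthetic data; the study
# itself used distributed-lag nonlinear time-series models).
rng = np.random.default_rng(0)
daily_temp = rng.normal(20, 8, size=365 * 20)   # 20 years of fake daily temps

cold = np.percentile(daily_temp, 1)    # "cold" reference (1st percentile)
hot = np.percentile(daily_temp, 99)    # "hot" reference (99th percentile)
mmt = np.percentile(daily_temp, 75)    # minimum-mortality temperature proxy

print(f"cold={cold:.1f}  MMT={mmt:.1f}  hot={hot:.1f}")
# Relative risks of death are then estimated at `cold` and `hot`,
# each relative to mortality at `mmt`.
```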

Once again, therefore, it is as clear as it can possibly be made that essentially everywhere in the world, typical cold temperatures are far more likely to lead to premature human deaths than are typical warm temperatures. And because of this fact, we must be thankful for the post-Little Ice Age warming of the world, which has been predominantly experienced almost everywhere at the cold -- and deadly -- end of the planet's daily temperature spectrum.

More HERE  (See the original for links, graphics etc.)





Bats Save Billions In Pest Control

And wind turbines kill them by the millions

A secret war is waged above farmland every night.

Just after dusk, high-stakes aerial combat is fought in the darkness atop the crop canopy. Nature’s air force arrives in waves over crop fields, sometimes flying in from 30 miles away. Bat colonies blanket the air with echolocation clicks and dive toward insect prey at up to 60 mph. In games of hide-and-seek between bats and crop pests, the bats always win, and the victories are worth billions of dollars to U.S. agriculture.

Bats are a precious but unheralded friend of farmers, providing consistent crop protection. Take away the colonies of pest killers and insect control costs would explode across farmland. And just how much do bats save agriculture in pesticide use? Globally, the tally may reach a numbing $53 billion per year, according to estimates from the University of Pretoria, U.S. Geological Survey (USGS), University of Tennessee, and Boston University.

A 2006 study estimated that bats saved cotton growers $74 per acre in pesticide treatments across eight Texas counties. In 2013-2014, graduate student Josiah Maines and his advisor at Southern Illinois University Carbondale, Justin Boyles, went beyond penciled estimates and ran a concrete field trial to show the relation between bats and corn protection. Funded by Bat Conservation International, Maines’ unique test targeted density of corn earworm in southern Illinois bottomland in Alexander County.

Maines built a canopy system to prevent bats from accessing particular sections of corn at night. The controlled enclosure (65’ by 65’, and 23’ high) was braced by steel structural poles interconnected with steel cables draped by netting. Maines operated the netting like a gigantic shower curtain every day of crop season: Open in daylight and close at night. He kept the vigil over two years, sliding the big curtain at the given dusk hour from May to late September to cut off bat access to earworm moths. The results? Maines was astonished.

He found roughly 50% less earworm presence, and similarly less damage to corn ears, in open control areas where bats could forage than in the netted exclosures. Not only did bats suppress earworm larvae and direct damage to corn, they also hindered the presence of fungal species and toxic compounds. “Globally, we estimate bats save corn farmers over $1 billion annually in earworm control,” Maines says. “It’s an incredible amount when we’re only considering one pest and one crop. Bats are truly a vital economic species.”

Would producers see greater crop protection with more bat habitat? In general, researchers don’t know how many bats fly over a single acre of farmland at night. Bats are extremely difficult to count during the day. They hide incredibly well in trees, caves, holes in the ground, and buildings. “Future research should look at the tradeoff of forested bat habitat and crop protection. Safe to say, more bats could mean even greater consumption of crop pests,” Maines says.

Paul Cryan, a USGS research biologist at the Fort Collins Science Center, says of up to 45 bat species in the U.S., 41 to 42 eat nothing but insects. “Our U.S. bats are small -- 10 to 20 grams. They have voracious appetites and eat half or all their body weight each night. Pest control value to agriculture is certainly in the billions of dollars per year.”

However, pressing issues surround the future of U.S. bat populations. White nose syndrome (WNS) is a major threat to U.S. bat numbers. The fungal disease affects hibernating bats and has spread halfway across the U.S. since first appearing in New York in 2006. “WNS has killed up to 6 million bats and continues moving,” Cryan says. “I believe farmers would see an immediate impact in insect suppression if overall bat populations were seriously reduced.”

Cryan coauthored a seminal 2011 paper, Economic Importance of Bats in Agriculture, suggesting the loss of bats would cost U.S. agriculture at least $3.7 billion per year. “We’re typically scared of the dark, but bats shouldn’t be a part of that association. They’re such a beneficial and important part of the environment and farmland protection.”

Hat tip to the misunderstood bats of agriculture: phenomenal creatures patrolling farmland skies every night in the greatest show never seen.

SOURCE




Sunday, August 28, 2016



Solar activity has a direct impact on Earth's cloud cover

This new paper confirms that solar activity variation can account for a 2% variation in global cloud cover, sufficient to explain the warming of the 20th century without any consideration of CO2 "radiative forcing."

 A team of scientists from the National Space Institute at the Technical University of Denmark (DTU Space) and the Racah Institute of Physics at the Hebrew University of Jerusalem has linked large solar eruptions to changes in Earth's cloud cover in a study based on over 25 years of satellite observations.

The solar eruptions are known to shield Earth's atmosphere from cosmic rays. However the new study, published in Journal of Geophysical Research: Space Physics, shows that the global cloud cover is simultaneously reduced, supporting the idea that cosmic rays are important for cloud formation. The eruptions cause a reduction in cloud fraction of about 2 percent corresponding to roughly a billion tonnes of liquid water disappearing from the atmosphere.
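
That "billion tonnes" figure is plausible as an order of magnitude. Here is a rough check under assumed typical values; the 50 g/m² cloud liquid water path is an assumption for illustration, not a number from the paper:

```python
# Order-of-magnitude check (assumed values, not from the paper):
earth_area = 5.1e14            # m^2, Earth's surface area
delta_cloud_fraction = 0.02    # ~2 percentage points less cloud cover
liquid_water_path = 0.05       # kg/m^2, assumed typical value for clouds

water_removed_kg = earth_area * delta_cloud_fraction * liquid_water_path
print(f"~{water_removed_kg / 1e12:.1f} billion tonnes of liquid water")
# -> ~0.5 billion tonnes: the same order as the "roughly a billion tonnes" quoted
```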

Since clouds are known to affect global temperatures on longer timescales, the present investigation represents an important step in the understanding of clouds and climate variability.

"Earth is under constant bombardment by particles from space called galactic cosmic rays. Violent eruptions at the Sun's surface can blow these cosmic rays away from Earth for about a week. Our study has shown that when the cosmic rays are reduced in this way there is a corresponding reduction in Earth's cloud cover. Since clouds are an important factor in controlling the temperature on Earth our results may have implications for climate change," explains lead author on the study Jacob Svensmark of DTU.

Very energetic particles

These particles generate electrically charged molecules -- ions -- in Earth's atmosphere. Ions have been shown in the laboratory to enhance the formation of aerosols, which can serve as seeds for the formation of the cloud drops that make up a cloud. Whether this actually happens in the atmosphere, or only in the laboratory is a topic that has been investigated and debated for years.

When the large solar eruptions blow away the galactic cosmic rays before they reach Earth they cause a reduction in atmospheric ions of up to about 20 to 30 percent over the course of a week. So if ions affect cloud formation it should be possible to observe a decrease in cloud cover during events when the Sun blows away cosmic rays, and this is precisely what is done in this study.

The so-called 'Forbush decreases' of the cosmic rays have previously been linked to week-long changes in Earth's cloud cover but the effect has been debated at length in the scientific literature. The new study concludes that "there is a real impact of Forbush decreases on cloud microphysics" and that the results support the suggestion that "ions play a significant role in the life-cycle of clouds."

Arriving at that conclusion was, however, a hard endeavor: very few strong Forbush decreases occur, and their effect on cloud formation is expected to be close to the limit of detection using global atmospheric observations measured by satellites and land-based stations. It was therefore of the greatest importance to select the strongest events for study, since they would have the most easily detected effect. Determining this strength required combining data from about 130 stations with atmospheric modeling.

This new method resulted in a list of 26 events in the period of 1987-2007 ranked according to ionization. This ranked list was important for the detection of a signal, and may also shed some light on why previous studies have arrived at varied conclusions, since they have relied on events that were not necessarily ranked high on the list.

Possible long term effect

The effect from Forbush decreases on clouds is too brief to have any impact on long-term temperature changes.

However, since clouds are affected by short term changes in galactic cosmic radiation, they may well also be affected by the slower change in solar activity that happens on scales from tens to hundreds of years, and thus play a role in the radiation budget that determines the global temperature.

The Sun's contribution to past and future climate change may thus be larger than merely the direct changes in radiation, conclude the scientists behind the new study.

SOURCE






Uncovered: Incoherent, Conflicting IPCC ‘Beliefs’ on Climate Sensitivity

This is a long and complex article, but it needs to be in order to set out fully the detailed scientific claims of Warmists.  It shows that to get their alleged "catastrophic" levels of warming they rely heavily on an assumption that water vapour in the air has a large magnifying effect on the warming due to CO2 alone.  So how do they work out the size of that magnifying effect?  They don't.  They just guess it.  And the actual evidence on the size of such an effect suggests that there is no effect at all -- JR

For going on 3 decades now, Intergovernmental Panel on Climate Change (IPCC) reports have estimated that the climate’s sensitivity to the doubling of preindustrial levels of CO2 (from 280 ppm to 560 ppm) may range between 1.5°C and 4.5°C, due largely to the assumed “dangerous” warming amplification from positive water vapor feedback.  Despite years of analysis, the factor-of-three difference between the lower and higher surface temperature range thresholds has changed little.  There apparently have been no breakthroughs in understanding the “basic physics” of water vapor amplification to narrow this range further.

The theoretical conceptualization for the surface temperature change resulting from CO2 doubling alone — without the “dangerous” amplification from  water vapor feedback — has also been in use, and unchanged, for decades.  Since the 1960s it has been hypothesized that if preindustrial CO2 levels were to be doubled to 560 ppm, the surface temperature change would amount to a warming of a non-alarming 1.2°C in the absence of other feedbacks.
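
That 1.2°C figure follows from a standard back-of-envelope calculation (textbook values, not specific to any one paper cited below): a CO2 doubling adds roughly 3.7 W/m² of radiative forcing (rounded to 4 in the IPCC quote below), and the no-feedback Planck response is about 0.3°C per W/m²:

```latex
\Delta F = 5.35\,\ln\!\left(\tfrac{560}{280}\right) \approx 3.7\ \mathrm{W\,m^{-2}},
\qquad
\Delta T_{\text{no feedback}} \approx \lambda_0\,\Delta F
\approx 0.3\ \mathrm{^{\circ}C/(W\,m^{-2})} \times (3.7\text{--}4)\ \mathrm{W\,m^{-2}}
\approx 1.1\text{--}1.2\ \mathrm{^{\circ}C}.
```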

Below are brief summaries from scientific papers (and the Skeptical Science blog) confirming that the IPCC and models claim doubling CO2 only results in 1.2°C of warming.

IPCC (2001) :

“[T]he radiative forcing corresponding to a doubling of the CO2 concentration would be 4 Wm-2. To counteract this imbalance, the temperature of the surface-troposphere system would have to increase by 1.2°C (with an accuracy of ±10%), in the absence of other changes”

Skeptical Science :

“We know that if the amount of carbon dioxide (CO2) in the Earth’s atmosphere doubles from the pre-industrial level of 280 parts per million  by volume (ppmv) to 560 ppmv, this will cause an energy imbalance by trapping more outgoing thermal radiation in the atmosphere, enough to directly warm the surface approximately 1.2°C.”

Gebhart, 1967 :

“The temperature change at the earth’s surface is ΔT=+1.2°C when the present [CO2] concentration is doubled.”

Hansen et al., 1981 :

“The increase of equilibrium surface temperature for doubled atmospheric CO2 is ∼1.2°C.  This case is of special interest because it is the purely radiative-convective result, with no feedback effects.”

Lorius et al., 1990 :

“The radiative forcing resulting from doubled atmospheric CO2 would increase the surface and tropospheric temperature by  1.2°C if there were no feedbacks in the climate system.”

Torn and Harte, 2006 :

“An increase in atmospheric CO2 concentration from 275 to 550 ppm is expected to increase radiative forcing by about 4 W m-2, which would lead to a direct warming of 1.2°C in the absence of feedbacks or other responses of the climate system”

IPCC: Dangerous future warming levels (3°C and up) are caused mostly by water vapor, not CO2

As mentioned, the IPCC authors have claimed that it is primarily due to the conceptualization of positive feedback with water vapor that the surface temperature response is projected  to reach the dangerous warming levels of 3.0°C and up as CO2 doubles to 560 ppm.

IPCC (2001) :

“The so-called water vapour feedback, caused by an increase in atmospheric water vapour due to a temperature increase, is the most important feedback responsible for the amplification of the temperature increase [from CO2 alone].”

In their 4th report, the IPCC acknowledged that humans have little influence in determining water vapor levels:

IPCC (2007) :

“Water vapour is the most abundant and important greenhouse gas in the atmosphere. However, human activities have only a small direct influence on the amount of atmospheric water vapour.”

The main reason why IPCC authors have asserted that water vapor will do most of the “dangerous” projected warming, while CO2 will contribute a much smaller fraction, is apparently because the greenhouse warming effect from water vapor forcing is “two to three times greater” than that of carbon dioxide:

IPCC (2013) :

“Water vapour is the primary greenhouse gas in the Earth’s atmosphere. The contribution of water vapour to the natural greenhouse effect relative to that of carbon dioxide (CO2) depends on the accounting method, but can be considered to be approximately two to three times greater.”

Even NASA agrees that water vapor and clouds together account for 75% of the greenhouse effect, while CO2 only accounts for 20%.

NASA  :

“Carbon dioxide causes about 20 percent of Earth’s greenhouse effect; water vapor accounts for about 50 percent; and clouds account for 25 percent. The rest is caused by small particles (aerosols) and minor greenhouse gases like methane.”

IPCC: Positive water vapor feedbacks are believed to cause dangerous warming

It is curious to note that the insufficiently understood positive water vapor feedback conceptualization is rooted in . . . belief.  Literally.   In the third report (TAR), the IPCC authors actually used the word “believed” to denote how they reached the conclusion that 1.2°C will somehow morph into 1.5°C to 4.5°C of warming due to amplification from feedbacks.

IPCC (2001) :

“If the amount of carbon dioxide were doubled instantaneously, with everything else remaining the same, the outgoing infrared radiation would be reduced by about 4 Wm-2. In other words, the radiative forcing corresponding to a doubling of the CO2 concentration would be 4 Wm-2. To counteract this imbalance, the temperature of the surface-troposphere system would have to increase by 1.2°C (with an accuracy of ±10%), in the absence of other changes. In reality, due to feedbacks, the response of the climate system is much more complex. It is believed that the overall effect of the feedbacks amplifies the temperature increase to 1.5 to 4.5°C. A significant part of this uncertainty range arises from our limited knowledge of clouds and their interactions with radiation.”

IPCC climate sensitivity estimates have been based on hypotheticals, or the belief that water vapor positive feedback will cause another 1.8°C to 3.3°C of “extra” or “dangerous” warming (to reach upwards of 3.0°C to 4.5°C).  CO2 alone only causes 1.2°C of warming as it is doubled from 280 ppm to 560 ppm.  Since when are modeled beliefs about what may possibly happen to global temperatures at some point in the next 100 years . . . science?

IPCC: Water vapor increased substantially since 1970 — but didn’t cause warming

If water vapor is the primary determinant of the “extra” and “dangerous” warming we are expected to get along with the modest 1.2°C temperature increase as the CO2 concentration reaches 560 ppm, then it is natural to ask: How much of the warming since 1950 has been caused by the additional CO2, and how much has been caused by the water vapor feedback that is believed to cause the extra, “dangerous” warming?

This last question arises because, according to the IPCC, there has been a substantial increase in the potent water vapor greenhouse gas concentration in the last few decades.  Specifically, in their 4th report, the IPCC authors claim there has been “an overall increase in water vapour of order 5% over the 20th century and about 4% since 1970” (IPCC [2007]).

Considering its abundance in the atmosphere (~40,000 ppm in the tropics), if water vapor increased by 4% since 1970, that means that water vapor concentrations could potentially have increased by more than 1,500 ppm in the last few decades.  The overall magnitude of this water vapor concentration increase is therefore more than 20 times greater than the increase in atmospheric CO2 concentration (~70 ppm) since 1970.
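
The multiplication in that comparison is straightforward to verify, using the article's own round numbers (which assume the 4% rise applies to the ~40,000 ppm tropical concentration):

```python
# Using the article's round numbers (illustrative only):
water_vapor_ppm = 40_000               # ~tropical water vapor concentration
wv_increase = 0.04 * water_vapor_ppm   # 4% rise since 1970 -> 1,600 ppm
co2_increase = 70                      # ppm CO2 rise since 1970

print(f"Water vapor increase: ~{wv_increase:.0f} ppm")
print(f"Ratio vs CO2 increase: ~{wv_increase / co2_increase:.0f}x")  # ~23x
```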

But even though the IPCC claims that (a) water vapor will cause most of the “dangerous” warming in the future, (b) water vapor climate forcing is “two to three” times greater than CO2 forcing within the greenhouse effect, and (c) water vapor concentrations have increased substantially since 1970, the IPCC simultaneously claims that (d) CO2 has caused most — if not all — of the warming since the mid-20th century anyway.   In the 5th report, the IPCC’s “consensus” statement reads like this:

IPCC (2013, 2014) :

“It is extremely likely that more than half of the observed increase in global average surface temperature from 1951 to 2010 was caused by the anthropogenic increase in greenhouse gas concentrations and other anthropogenic forcings together.”

For advocates of dangerous anthropogenic global warming (DAGW) projections, the “more than half” CO2 attribution apparently isn’t quantitatively strong enough.  After all, “more than half” could be interpreted as only slightly more than 50%.   To rectify this, Gavin Schmidt  — a primary overseer of NASA temperature adjustments — has calculated that the anthropogenic impact on climate has not  just been “more than half,” but more than 100%.   In a recent RealClimate blog entry, Schmidt  claims that humans have caused 110% of the global warming since 1950 — and that IPCC analysis (found in Fig. 10.5 in IPCC AR5) also supports an anthropogenic CO2 attribution of  “near 100%”.

Real Climate :

“The best estimate of the warming due to anthropogenic forcings (ANT) is the orange bar [in Fig. 10.5] (noting the 1𝛔 uncertainties). Reading off the graph, it is 0.7±0.2ºC (5-95%) with the observed warming 0.65±0.06 (5-95%). The attribution then follows as having a mean of ~110%, with a 5-95% range of 80–130%. This easily justifies the IPCC claims of having a mean near 100%, and a very low likelihood of the attribution being less than 50% (p < 0.0001!).”
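
The attribution ratio Schmidt describes is just the quotient of the two central estimates; a quick check of the quoted numbers (uncertainty ranges omitted):

```python
# Checking the central estimates quoted above (5-95% ranges omitted):
anthropogenic = 0.70   # degC, best-estimate anthropogenic (ANT) warming
observed = 0.65        # degC, observed 1951-2010 warming

print(f"Attribution: {anthropogenic / observed:.0%}")
# -> ~108%; the quoted ~110% mean comes from the full uncertainty treatment.
```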

Conflicting IPCC climate sensitivity feedback suppositions

The IPCC believes that the climate’s overall surface temperature sensitivity to the doubling of preindustrial CO2 ranges between 1.5°C and 4.5°C, with the projected higher warming levels due primarily to amplifying water vapor feedback.  This conceptualization appears to be in conflict with other IPCC suppositions.

On one hand, the IPCC reports have claimed that (a) water vapor is much more potent than CO2 within the greenhouse effect, that (b) the bulk of the 3.0°C and up “dangerous” warming that is believed to occur in the future will be forced by positive water vapor feedback, and that (c) water vapor  levels have significantly increased in recent decades (by 4% since 1970).

On the other hand, (d) water vapor is claimed to have caused right around 0% of the warming in the last several decades.

Summarily, these conflicting explanations or suppositions about what can happen, what will happen, and what has already happened to the climate due to water vapor feedback raise the following questions:

Why hasn’t the “dangerous” water vapor warming found in models “kicked in” during the last several decades, when water vapor levels have increased (according to the IPCC)?

Since it reportedly hasn’t yet, at what point in the future will the “dangerous” water vapor warming projections found in modeling finally show up in the temperature record?

Considering how fundamental climate sensitivity estimates are to climate science, and ultimately to the direction of political policies and energy production and consumption, these questions deserve to be answered . . . with something more substantive than what the IPCC authors have long believed to be true.

SOURCE  





What Obama Is Doing to Seal His Environmental Record

Before his last day in office, President Barack Obama wants to impose new fuel-efficiency standards and establish a green energy plan for North America to top off an environmental legacy including major international agreements and a massive expansion of regulations and subsidies.

“He will be leaving office with a very strongly negative legacy,” predicted Nick Loris, research fellow on energy and environment with The Heritage Foundation, in a phone interview. “After he failed to get a ‘cap and trade’ bill through Congress, he has used unelected bureaucrats to implement and pioneer regulatory onslaught.”

Early in his presidency, Obama and liberals in Congress unsuccessfully proposed financial incentives for companies to reduce carbon emissions, saying such a “cap and trade” approach would help curb global warming.

During his weekend address Aug. 13, Obama spoke about “ambitious investments” that led to tripling the use of wind power, increasing the use of solar energy “thirtyfold,” and more energy-efficient vehicles.

“We’re not done yet. In the weeks and months ahead, we’ll release a second round of fuel-efficiency standards for heavy-duty vehicles,” Obama said of his Jan. 20 departure after eight years, adding:

We’ll take steps to meet the goal we set with Canada and Mexico to achieve 50 percent clean power across North America by 2025. And we’ll continue to protect our lands and waters so that our kids and grandkids can enjoy our most beautiful spaces for generations.

‘Little to Mitigate Global Warming’

Three days after that address, the Environmental Protection Agency and the National Highway Traffic Safety Administration formally announced they are adopting new fuel-efficiency standards for heavy-duty vehicles such as tractor-trailers and buses.

This will mark the second time the Obama administration has put new fuel-efficiency standards in place. The White House, in a press release, asserts that 20 percent of carbon pollution comes from heavy-duty vehicles.

Separately, the Energy Department created a new program to spend $140 million on research and development for “fuel-efficient truck technologies.”

This will almost certainly mean higher costs with minimal impact on global warming, Loris said.

“Trucks, buses, and garbage trucks, these are all industries that measure their fuel to a tenth of a mile because energy efficiency is key to their bottom line,” Loris said. “There is little this would do to mitigate global warming. You could shut down the entire economy and the temperature would only move a few degrees Celsius.”

Obama’s other ambitious goal before leaving office was set during the North American Leaders’ Summit in late June, where Obama met with Mexican President Enrique Peña Nieto and Canadian Prime Minister Justin Trudeau in Ottawa. The plan is to have the three countries operating on 50 percent clean energy by 2025.

Such a goal will be nearly impossible to reach in nine years, said Patrick Michaels, director of the Center for the Study of Science at the libertarian Cato Institute.

“Of course it’s not doable,” Michaels told The Daily Signal in a phone interview. “Even if you substitute nuclear power for fossil fuels, that wouldn’t be enough time to build enough nuclear plants.”

‘Legacy of Unconscionable Costs’

Sticking to the deal will be a challenge, agreed David Kreutzer, a senior research fellow for energy economics and climate change at The Heritage Foundation.

“Whatever the cost, it won’t be incurred by the Obama administration,” Kreutzer said in a phone interview. “He can take on the role of an energy reformer and his successor will have to deal with the lost jobs and high energy prices. The current government of Canada might seem inclined to sign on, but Mexico needs investment and might not want to tie itself into poverty.”

The regulatory costs of environmental regulations artificially raise energy prices, which are typically shouldered by lower-income Americans, according to an analysis by The Heritage Foundation.

A 2011 poll by the National Energy Assistance Directors Association found that 37 percent of low-income families sacrificed medical and dental coverage to pay for higher energy bills. The poll found almost one in five identified a family member who became sick because their home was too cold.

“It’s a legacy of unconscionable costs imposed with no climate impact,” Kreutzer said.


The president could have made a larger investment in cutting-edge technologies such as those associated with nuclear power, including fusion research, contends Tony Sadar, a certified consulting meteorologist and author of “In Global Warming We Trust: Too Big to Fail.”

“Progressives are looking at sunbeams and windmills,” Sadar told The Daily Signal in a phone interview. “You’re not really progressive if you’re looking at ancient technologies. Early on, the president supported research into nuclear power generation. But we’ve seen a return to the alternative energy that leaves much to be desired economically and even environmentally.”

‘He Is Doubling Down’

Michaels, of the Cato Institute, said much of the Obama legacy will be the “boondoggles” of solar and wind power along the countryside.

“His long-term legacy will be that he committed this country to sources of power that will never supply much dependable electric power,” Michaels said, adding:

The fact that solar and wind have been subsidized for years shows they are not successful. He makes no attempt to hide the fact that he believes Europe is doing so many wonderful things that we should. If he was consistent on that, he would observe that most of Europe is disengaging from these energy sources, while he is doubling down.

The Daily Signal sought comment from the Sierra Club and Greenpeace, both of which support much of Obama’s environmental agenda, but neither responded by publication time.

Courts have delivered a setback to some of Obama’s environmental agenda.

In February, the Supreme Court blocked EPA rules limiting carbon emissions from power plants. The high court ordered a stay until more than two dozen lawsuits challenging the regulations can be sorted out.

Lower federal courts halted the Interior Department from imposing stricter regulations on hydraulic fracturing, and separately stopped an EPA rule on small waterways and wetlands. The lawsuits and court rulings were based in part on executive overreach.

The United States entered an international climate agreement with 171 other countries negotiated in Paris that is intended to curb carbon emissions that government leaders say contribute to global warming. The governments hammered out the deal last year, and U.S. Secretary of State John Kerry signed the agreement in April.

Though treaties require Senate ratification, negotiators from the Obama administration and other countries worded much of the agreement to allow the measures to be handled by the executive branch.

Scaling Back Taxpayer Subsidies

The Obama administration has scaled back some taxpayer subsidies after spending hundreds of millions on loan guarantees for green energy companies that failed, Loris noted.

Solyndra, the politically connected solar panel company that went bankrupt despite a $500 million Energy Department loan, was the most publicized debacle. But dozens of other companies got taxpayer subsidies.

In congressional testimony, Loris noted the recurring themes of green energy subsidies: taxpayer money went to failed companies that couldn’t survive even with such help; to projects backed by larger companies that should be able to operate without taxpayer help; and to numerous other companies that simply benefited from taxpayer subsidies.

The government surprisingly seems to have learned something from the bad investments, Loris said.

“I haven’t seen new major loan guarantees, though there have been extended tax credits,” Loris told The Daily Signal. “There could be a recognition that government isn’t good at picking winners and losers. Folks will recognize that politicians shouldn’t invest in energy.”

SOURCE




UK: Health warning over plan to use hospital generators to avoid blackouts

In which universe might a plan to use hospital generators to avoid blackouts seem sane?

National Grid’s drive for hospitals to help keep the UK's lights on by using their back-up diesel generators is "highly questionable" because it will cause air pollution right in the vicinity of patients, a think-tank has warned.

The energy utility is encouraging NHS sites to sign up for schemes where they will be paid to use their back-up generators for electricity routinely, not just in the event of an emergency power cut.

National Grid argues that making greater use of these existing generators represents a cost-effective way of helping to meet peak UK power demand as the country builds more intermittent wind and solar, instead of building new power plants that would sit dormant much of the time.

But Policy Exchange has urged the Government to restrict the use of such diesel generators beyond genuine emergency back-up because of concerns about air quality, especially in urban areas that are already polluted.

Diesel generators emit significant amounts of nitrogen oxides and particulate matter which can be "extremely damaging to health", it warns.

"National Grid has been actively recruiting hospitals and other organisations to make back-up generators available at peak times and avoid blackouts.

"Whilst this is desirable from a security of supply point of view, it is highly questionable from an air quality point of view – particularly since hospitals are typically located in urban locations close to some of the most sensitive receptors," Richard Howard, Policy Exchange’s head of energy and environment wrote.

Mr Howard said he had even heard of "generator flues venting directly into car parks and communal areas in hospitals used by patients".

Ministers are currently considering how to curb the growth in diesel generators, dozens of which are being built around the country after becoming the unintended beneficiaries of the Government’s "capacity market" subsidy scheme, which procures power plant capacity.

The environment department is considering new emissions regulations to target diesel, which are also likely to affect existing generators.

"The regulations need to be designed so as to avoid placing undue restrictions on genuine back-up generators, but at the same time limit the extent to which these same generators can run purely for commercial reasons," Mr Howard said.

However, any such restrictions could be a setback for National Grid’s efforts to keep the lights on cost-effectively.

The company is trying to promote "demand side response" schemes where industrial or commercial users reduce their demand on the grid at times when national supplies are scarce.

To date, about 95% of the capacity procured has come from users switching to alternative sources of power such as diesel engines, rather than actually reducing the total amount of electricity they are using.

A separate review by regulator Ofgem is currently considering removing some of the financial benefits that diesel generators currently enjoy as a result of connecting directly into local distribution networks.

A spokesman for National Grid said: "Demand side measures are good for bill payers as they provide flexibility at a lower cost and help the country shift to a more low-carbon energy system.

"National Grid is obliged to be agnostic about technology and to procure the most cost-effective solutions to help us balance supply and demand. However, the Government is currently examining the regulations surrounding diesel generation. "

SOURCE




Green Fiasco: Biofuels ‘Worse Than Petrol’ For The Environment, New Study Finds

“Green” biofuels such as ethanol and biodiesel are in fact worse for the environment than petrol, a landmark new study has found.

The alternative energy source has long been praised for being carbon-neutral because the plants it is made from absorb carbon dioxide, which causes global warming, from the atmosphere while they are growing.

But new research in the US has found that the crops used for biofuel absorb only 37 per cent of the CO2 that is later released into the atmosphere when the plants are burnt, meaning the process actually increases the amount of greenhouse gas in the air.
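
The arithmetic behind that claim is simple enough to check. Here is a back-of-the-envelope tally in Python; the 0.37 uptake ratio is the study's figure, while the 100-tonne release is just a round number for illustration:

    # Net carbon balance implied by the 37 per cent uptake figure
    released = 100.0               # tonnes of CO2 out of the tailpipe (illustrative)
    uptake_ratio = 0.37            # share reabsorbed by the crops while growing (study)

    absorbed = released * uptake_ratio
    print(absorbed)                # 37.0 t captured on the farm
    print(released - absorbed)     # 63.0 t left in the atmosphere -- not carbon neutral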

The scientists behind the study have called on governments to rethink their carbon policies in light of the findings.

The use of biofuels is controversial because it means crops and farm space that could otherwise be devoted to food production are in fact used for energy.

They currently make up just under 3 per cent of global energy consumption, and use in the US grew from 4.2 billion gallons a year in 2005 to 14.6 billion gallons a year in 2013.

In the UK the Renewable Transport Fuel Obligation now means that 4.75 per cent of any supplier’s fuel must come from a renewable source, which is usually ethanol derived from crops.

Professor John DeCicco, from the University of Michigan, said his research was the first to carefully examine the carbon on farmland where biofuels are grown.

“When you look at what’s actually happening on the land, you find that not enough carbon is being removed from the atmosphere to balance what’s coming out of the tailpipe,” he said.

“When it comes to the emissions that cause global warming, it turns out that biofuels are worse than gasoline.”

Professor DeCicco said the study, which is published in the journal Climatic Change, overturns the assumption that biofuels, as renewable alternatives to fossil fuels, are inherently carbon neutral simply because the CO2 released when they are burned was originally absorbed from the atmosphere through photosynthesis.

SOURCE




How Britain will keep the lights on

Millions being spent to do what existing infrastructure could do if it was all brought back online

Eight new battery storage projects are to be built around the UK after winning contracts worth £66m to help National Grid keep power supplies stable as more wind and solar farms are built.

EDF Energy, E.On and Vattenfall were among the successful companies chosen to build new lithium ion batteries with a combined capacity of 200 megawatts (MW), under a new scheme to help National Grid balance supply and demand within seconds.

Power generation and usage on the UK grid have to be matched as closely as possible in real-time to keep electricity supplies at a safe frequency so that household electrical appliances function properly.

National Grid says that maintaining the correct frequency is becoming more challenging as more renewable generation is built, because this makes the electricity system less stable and leads to more volatile fluctuations in frequency.

As a result, it has launched a new scheme to support technologies such as batteries that can respond within less than a second to either deliver or absorb power to or from the grid, bringing the system back into balance.
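
For readers wondering what “respond within less than a second” involves, here is a minimal sketch of the control logic in Python, using a simple proportional (droop-style) response. The 50Hz target is the standard UK grid frequency; the battery rating, deadband and response slope are hypothetical, not National Grid's actual service specification:

    # Droop-style frequency response from a grid battery (illustrative values)
    NOMINAL_HZ = 50.0        # UK grid target frequency
    CAPACITY_MW = 10.0       # battery power rating (hypothetical)
    DEADBAND_HZ = 0.015      # ignore tiny wobbles (hypothetical)
    FULL_RESPONSE_HZ = 0.5   # deviation giving full output (hypothetical)

    def battery_setpoint_mw(measured_hz):
        """Positive = discharge to the grid, negative = absorb from it."""
        deviation = NOMINAL_HZ - measured_hz
        if abs(deviation) < DEADBAND_HZ:
            return 0.0
        response = CAPACITY_MW * deviation / FULL_RESPONSE_HZ
        return max(-CAPACITY_MW, min(CAPACITY_MW, response))

    print(battery_setpoint_mw(49.8))   # under-frequency: discharge 4 MW
    print(battery_setpoint_mw(50.1))   # over-frequency: absorb 2 MW

The real service has detailed performance requirements; the point here is only the shape of the logic.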

Projects with a capacity of more than 1.2 gigawatts entered a competition for contracts to provide this service to the grid.

EDF Energy’s was the biggest individual project to secure a contract, winning a £12m deal to build 49 megawatts (MW) of battery storage by its coal and gas plants at West Burton in Nottinghamshire.

Vattenfall won a contract to build 22MW of batteries next to its Pen y Cymoedd wind farm in Wales, while E.On is to build a 10MW battery by its biomass plant at Blackburn Meadows near Sheffield.

Low Carbon secured £15m of deals to build two projects, one in Kent and one in Cumbria, with a combined capacity of 50MW. The other winners were Element Power, RES and Belectric.

SOURCE  






Feds Fund Scientists Who Protect The ‘Global Warming Paradigm,’ Says Report

The Obama administration has been pumping billions of taxpayer dollars into science that’s “heavily biased in favor of the paradigm of human-induced climate change,” according to researchers.

Policy experts wanted to know if the lure of federal dollars was biasing climate science research. What they found is that the group responsible for a significant portion of government climate science funding seems more concerned with promoting the “anthropogenic global warming” (AGW) paradigm than with studying natural variability in weather patterns.

“In short there appears to be virtually no discussion of the natural variability attribution idea. In contrast there appears to be extensive coverage of AGW issues,” David Wojick, a freelance reporter and policy analyst, wrote in a blog post, referring to research he did with climate scientist Patrick Michaels of the libertarian Cato Institute.

“This bias in favor of AGW has significant implications for US climate change policy,” Wojick wrote for the blog Climate Etc., which is run by climate scientist Judith Curry.

Wojick and Michaels published a working paper in 2015, asking the question: Does federal funding bias climate science?

They conducted a “semantic” analysis of three years of budget requests for the U.S. Global Change Research Program (USGCRP), which usually gets around $2.5 billion. They found USGCRP overwhelmingly used language supporting the AGW paradigm.

“The ratio of occurrences is roughly 80 to one,” Wojick wrote. “This extreme lack of balance between considerations of the two competing paradigms certainly suggests that paradigm protection is occurring.”
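
Wojick and Michaels' exact method isn't given here, but the kind of tally they describe is easy to picture. A toy sketch in Python, with hypothetical term lists and input file; their actual analysis may well differ:

    # Toy keyword tally of budget-request language (term lists are hypothetical)
    import re

    AGW_TERMS = ["anthropogenic", "human-induced", "greenhouse gas"]
    NATURAL_TERMS = ["natural variability", "solar", "internal variability"]

    def count_terms(text, terms):
        return sum(len(re.findall(re.escape(t), text, flags=re.IGNORECASE))
                   for t in terms)

    text = open("usgcrp_budget_request.txt").read()   # hypothetical input file
    agw, nat = count_terms(text, AGW_TERMS), count_terms(text, NATURAL_TERMS)
    print(f"AGW mentions: {agw}, natural-variability mentions: {nat}")
    print(f"Ratio: {agw / max(nat, 1):.0f} to 1")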

Politicians have become more concerned with global warming in recent years, and have been willing to shell out more money for potential solutions to the problem. The Obama administration, for example, reported spending $22.2 billion on global warming efforts in 2013, including $2.5 billion to the USGCRP.

That’s a lot of money, and illustrates why Wojick and Michaels are so concerned about federal money’s influence on science.

“Present policy is based on the AGW paradigm, but if a significant fraction of global warming is natural then this policy may be wrong,” Wojick wrote. “Federal climate research should be trying to solve the attribution problem, not protecting the AGW paradigm.”

Wojick and Michaels have already weighed in on the bias in climate science towards using models, which they say “is a bad thing.”

“Climate science appears to be obsessively focused on modeling,” they wrote in May. “Modeling can be a useful tool, a way of playing with hypotheses to explore their implications or test them against observations. That is how modeling is used in most sciences.”

“But in climate change science modeling appears to have become an end in itself. In fact it seems to have become virtually the sole point of the research,” they wrote. “The modelers’ oft stated goal is to do climate forecasting, along the lines of weather forecasting, at local and regional scales.”

SOURCE  

***************************************

For more postings from me, see  DISSECTING LEFTISM, TONGUE-TIED, EDUCATION WATCH INTERNATIONAL, POLITICAL CORRECTNESS WATCH, FOOD & HEALTH SKEPTIC and AUSTRALIAN POLITICS. Home Pages are   here or   here or   here.  Email me (John Ray) here.  

Preserving the graphics:  Most graphics on this site are hotlinked from elsewhere.  But hotlinked graphics sometimes have only a short life -- as little as a week in some cases.  After that they no longer come up.  From January 2011 on, therefore, I have posted a monthly copy of everything on this blog to a separate site where I can host text and graphics together -- which should make the graphics available even if they are no longer coming up on this site.  See  here or here

*****************************************


Saturday, August 27, 2016


EPA: The Ethanol Protection Agency

Maybe the initials “EPA” should stand for the Ethanol Protection Agency, as environmental protection seems just too costly and time-consuming for the agency. The EPA recently admitted that it has not been in compliance with the 2007 law that requires the agency to study the environmental impact of the ethanol mandate and report its findings to Congress every three years. So far, envirofascist bureaucrats have met the law’s requirements only once, in 2011. The EPA’s excuse is that it just doesn’t have enough funding or time and has therefore simply ignored the law.

According to the EPA’s inspector general, because the agency hasn’t been conducting the impact study, it hasn’t been able “to identify, consider, mitigate and make policymakers aware of any adverse impacts of renewable fuels.” The EPA now says it won’t have a fully completed study until at least 2024. Well, that’s good news if you’re a corn farmer, but not such good news for the environment or even other corn-dependent industries, as the price of corn has increased.

Several recent independent environmental studies conclude that ethanol biofuels have had an overall negative effect on the environment, from increased smog in cities to the amounts of land, water and energy needed to produce ethanol compared with gasoline. Combined with worse mileage and damage to small engines, ethanol has had the exact opposite effect of what it was touted to accomplish when it was initially mandated back in 2005. Yet there is no push by either Congress or the Obama administration to repeal or even question the ethanol mandate. And the agency tasked with protecting the environment seems too busy figuring out how to clean up its own environmental messes to care. Cronyism at its best.

SOURCE  





The few, the loud, the anti-fossil fuel crowd

If you get your news from the mainstream media, you likely think the views expressed by the environmental activists represent the majority of Americans. After all, their highly visible protests against the Keystone pipeline — sit-ins in front of the White House, locking themselves to the White House fence and then being arrested for it, and parading down the National Mall carrying a huge inflated tube emblazoned with the words: “Just say no to Keystone” — were effective. Despite repeated polling that showed a majority of Americans supported the pipeline, with a small minority opposed, the loud theatrics of the anti-fossil fuel crowd eventually won out. After years of stall tactics, President Obama finally bowed to their demands and said no to the job-creating infrastructure project.

Earlier this year, the usual group of suspects, led by well-known anti-fracking activist Bill McKibben, planned a “global wave of resistance” called BreakFree2016 — scheduled to take place from May 3-15 — on six continents. The event’s website announced the various activities, including an appearance and speech by McKibben, a Vermont resident, at the Colorado rally, and promised the “largest mass mobilizations for climate action in the history of Colorado.” It confirmed that there would be “civil disobedience.”

Did you hear about it? Probably not.

A news report of the planned Colorado activities said: “And on May 14, 350 Colorado is planning a day of speeches, live music and activities protesting oil and gas developments close to neighborhoods and schools in Thornton. The goal is to draw 1,000 people to the upcoming events.” The website, post-event, states: “about 800 people joined the action throughout the day” with “about 30-40 people” still there at the end of the day for the dramatic “frack-site” invasion. Yet, as even their own Facebook page photos indicate, not even 100 were present for the big McKibben speech. Without vendors and media, he may have had no audience at all.

After flying in to Denver, and then being driven to the protest site in a limousine, McKibben jetted off to Los Angeles, California, where he was joined by the greens’ “Daddy Warbucks,” billionaire political campaign donor Tom Steyer — with much the same results: a few hundred protesting fossil fuels and, as Energy In Depth reported, “the very social and economic underpinnings of liberal democracy.” The typical anti-everything protestors were present — but only a few.

In Iowa, as I addressed last week, a meeting of the Bakken Pipeline Resistance Coalition — which according to the organizer includes those with “concerns about the impact it could have on the environment, farmers who worry about their cropland and religious groups who view expanding use of fossil fuels as a moral issue because of climate change” — expected a crowd of 200. Instead, according to the Ottumwa Courier, “only 40 or so were seated when the meeting began. Others trickled in as the meeting progressed.”

Now, Colorado is ground zero for “one of the biggest environmental fights in the country this year,” as Lauren Petrie, Rocky Mountain region director for Food and Water Watch, a Washington, D.C.-based group advocating for safety in food production and oil and gas production, called it. Two ballot initiatives, 75 and 78, have the potential to, according to Colorado regulators, “effectively halt new oil and gas development in as much as 90 percent of the state.” In order to get the initiatives on the ballot, 98,492 valid signatures needed to be turned in to the Colorado Secretary of State by August 8 — no later than 3:00 p.m.

In June, The Tribune reported that Tricia Olson, who has pumped in most of the funding for a group backing initiatives 75 and 78, hoped to “collect 160,000 signatures to account for the invalid signatures that inevitably pop up.” (Politico just announced that, according to recent campaign finance reports filed with the Colorado secretary of state, “the Sierra Club gave $150,000, making it the largest single reported contributor to the anti-fracking effort.”)

Because the Colorado Supreme Court, in a unanimous decision on May 2, declared local fracking limits “invalid and unenforceable,” as state law trumps local ordinances, Olson sees the ballot initiatives as their “last ditch effort.”

On Monday, August 8, exercising stagecraft, at 2:30 p.m., dozens of supporters emptied a U-Haul truck and delivered box after box of signatures to the Secretary of State’s office. They celebrated their “victory.” 350 Colorado, one of the groups behind the measures, proclaimed: “We did it! Over 100,000 signatures delivered on initiatives to limit fracking!” — not the 160,000 originally hoped for, and likely not enough to get on the ballot in November.

By CBS Denver’s accounting, about 105,000 signatures were turned in — most in half-empty boxes. Lynn Bartels, Colorado Secretary of State Communications Director, tweeted: “Proponents of fracking measures turned in lots of boxes with very few petitions in them.” Once the petitions were consolidated, there were roughly 50 empty boxes. Simon Lomax, an associate energy policy analyst with the conservative Independence Institute in Denver and a consultant who advises pro-business groups, said: “To make it look more impressive they added a bunch of empty boxes, or boxes with very few petitions. It just sort of shows, these groups don’t do substance, they just do deceptive publicity stunts.”

On CBS Denver, former Secretary of State Scott Gessler explained that although about 98,000 valid signatures are needed to get on the ballot, at least 30 percent of signatures are typically rejected for a variety of reasons, so proponents need to submit at least 140,000. He says that for the 105,000 signatures turned in to qualify would be “unprecedented,” something that “has never occurred in Colorado for a ballot initiative.” According to Gessler, the effort is “doomed” — though we will not know for sure until next month when the final counts are released.
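
The arithmetic behind Gessler’s verdict is easy to check. In the quick Python sanity check below, the 98,492 requirement and the roughly 105,000 submitted come from the reports above, while the 30 percent rejection rate is his rule of thumb:

    # Signature arithmetic (rejection rate per Gessler; other figures from the article)
    REQUIRED_VALID = 98_492       # valid signatures needed to make the ballot
    REJECTION_RATE = 0.30         # share of signatures typically thrown out

    submitted = 105_000           # roughly what proponents turned in
    print(submitted * (1 - REJECTION_RATE))          # 73,500 -- well short of 98,492

    print(REQUIRED_VALID / (1 - REJECTION_RATE))     # ~140,703 -- hence "at least 140,000"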

Noted election reporter and national affairs columnist for the National Review, John Fund, told me: “If there is enough public support for an issue to get the votes needed to pass, getting a surplus of signatures to get it on the ballot is an easy task.”

Many Democrats, including Governor John Hickenlooper, support hydraulic fracturing and have come out against the ballot initiatives. Politico posits that because mainstream environmentalists “fear that their movement will suffer a demoralizing defeat if the two proposals make it in front of the voters,” they “hope the ballot initiatives will die instead.”  Additionally, “A decisive referendum on oil and gas production would increase calls for [Hillary] Clinton to explicitly take a side.” She’s previously aligned with 75 and 78 — which could spoil her attempts to attract moderate Republicans she’ll need to win the state.

Despite their drama and declared “victory,” it doesn’t seem that the Colorado anti-fossil fuel crowd has enough signatures, or support, to make it onto the November ballot. They may be loud, but, alas, they are few.

SOURCE  





Gov’t Is Moving In on Your Appliances – Expect Higher Prices, Fewer Choices

The Obama administration’s Department of Energy is churning through a list of energy efficiency regulations before the next administration takes office. Just since June, the DOE has set or initiated standards for dehumidifiers, ceiling fans, battery chargers, and wine coolers.

At issue isn’t health or safety, or even unfair business practices. Through the DOE, the federal government is busying itself regulating how much energy the appliances Americans buy are allowed to use. Our recent backgrounder, “The Energy Efficiency Free Market Act: A Step Toward Real Energy Efficiency,” goes into more detail.

Take a look around your kitchen. Many of the appliances are regulated by the federal government, from the oven and refrigerator down to the standby light on the microwave. Step into other rooms, and even outdoors. TVs, showers, air conditioners and heaters, washers and dryers, backyard swimming pools, toilets—these are just some of the other things regulated by the federal government.

Energy efficiency isn’t a bad thing. In fact, it’s an important factor in many Americans’ purchasing decisions. But there are a number of reasons why the federal government should not be mandating it:

Energy efficiency regulations reduce choices. Regulations prioritize efficiency over other preferences like safety, size, performance, durability, and cost. Americans weren’t without energy-efficient appliances before. The Department of Energy is essentially trying to make “better” decisions for people by limiting their options to “acceptable,” energy-efficient ones.

Regulations have very little impact on greenhouse gas emissions. Regardless of one’s opinion on global warming, these regulations have almost no impact. The DOE’s projected benefits from reducing greenhouse gas emissions total a paltry 1 percent.

Savings benefit the rich, often at the expense of the poor. According to Sofie Miller at the George Washington University Regulatory Studies Center, DOE cost-benefit calculations best describe households making $160,844 or more. In reality, energy-efficiency costs and benefits vary widely depending on income, education, and race. Higher energy costs impact poor families the most.

Mandates hinder innovation. Announcing the Energy Efficiency Free Market Act, Rep. Michael Burgess, R-Texas, explained that “when the government sets the efficiency standard for a product, that often becomes the ceiling … when the market drives the standard, there’s no limit to how fast and how aggressive manufacturers will be when consumers demand more efficient and better made products.”

Savings promised by standards are misleading. Considering the costs and benefits, Americans are essentially paying to have their choices restricted. There have also been problems with how the DOE estimates upfront costs, payback horizons, energy savings, and future energy prices. For example, the DOE assumed in a washing machine proposed rule that households used washers 392 times per year—more than seven times per week—meaning most families would never reap the benefits of more efficient, but more expensive, washers.

Standards easily play into corporate welfare. Companies lobby for regulations and subsidies that most benefit them, in an attempt to squeeze out competition from smaller companies. If these products save customers as much as advertised, they should not be subsidized by the taxpayer.

But if the government didn’t set mandates, wouldn’t companies stop producing energy-efficient products? Refrigerators, which the DOE points to as a success story, are just one example of why there’s no reason to worry:

“The Standards Program has driven remarkable gains in the energy efficiency of household appliances and equipment, resulting in large energy bill savings. For example, today, the typical new refrigerator uses one-quarter the energy of its 1973 counterpart—despite offering 20 percent more storage capacity and being available at half the retail cost.”

One problem: The first federal efficiency standards for refrigerators did not go into effect until 1990. Refrigerator manufacturers were improving energy use and design for nearly two decades before the government got involved.

As customers, Americans place importance on energy efficiency without government nudging. And the free market is only too willing to supply it.

Over the years, the DOE—empowered by Congress through the Energy Policy Conservation Act of 1975—has quietly expanded the list of products it regulates for energy efficiency. Congress should eliminate all mandatory efficiency regulations and leave these decisions to state governments and American consumers.

SOURCE  





Supposed ‘Ground Breaking’ Study Only Proves Warming Proponents Have Jumped The Shark

They called it a “ground breaking study”; I call it rubbish. This new study claims that global warming began in 1830, just when the industrial revolution began to pick up steam (no pun intended). What they didn’t take into account is that at just around the same time the Earth was coming out of an unusually cold 40-year period caused by low sunspot activity, called the Dalton Minimum (no relation to Timothy Dalton).

An international team of scientists, led by Associate Professor Nerilie Abram from the Australian National University, have analysed detailed reconstructions of climate going back 500 years. To their surprise, they’ve found that the current global warming trend began in the 1830s, further confirming that it is an anthropogenic, or human-induced, phenomenon. The study was published today in Nature.

Co-researcher Dr Helen McGregor, an earth sciences expert from the University of Wollongong, tells SBS Science the findings have a major impact on our understanding of how climate change works. “If we know when global warming started, we know what the actual rates of warming are and we know when our climate is emerging above natural variability,” McGregor explains.

The scientists go on to explain that they created a climate model (a breed of tool that has proven very flawed – for example, none of these models has explained why the earth hasn’t warmed in over 18 years). To create this model they took into account other climate model simulations and experiments (that’s right: a flawed climate model built on data from other flawed climate models – almost like a double negative), major volcanic eruptions and, most importantly, natural markers of climate variation found in places like corals, tree rings, and ice cores obtained from glaciers.

Dr McGregor says the study provides new, independent proof that climate change is indeed caused by human activity.

“One thing that our study provides is that it’s an alternative line of evidence,” she explains. “We’re not using thermometers and satellite records, we’re using natural archives of climate, so it’s a completely independent source of information that shows that climate change and warming is occurring.

“The central tenet of climate change, that the planet is warming, doesn’t change.”

Well not necessarily, because nowhere in their analysis do the scientists take into account sunspot activity.

Note: The sun goes through a natural cycle approximately every 11 years. The greatest number of sunspots in any given solar cycle is designated as the “solar maximum” and the lowest number is referred to as the “solar minimum” phase.

What scientists have observed is that when sunspot activity is low, so are the earth’s temperatures. The period of low sunspot activity called the Maunder Minimum is also known as “The Little Ice Age” – not because glaciers covered the Earth, but because it was a long period of abnormally cold weather throughout the world. The period of low sunspot activity between 1790 and 1830 is known as the Dalton Minimum; again, the weather was colder than normal.

In the late 1950s sunspot activity peaked at a much higher level than normal, in what is called the Modern Maximum; this was reflected in the accelerating global temperature growth that fed the global warming scare.

It seems as if the scientists behind this “ground breaking study” picked the result they wanted and selected the elements that would give them that result.

Now here’s the good news they might have ignored. It seems that solar activity is slowing down: sunspot activity started to decrease toward the end of the 1990s. Similarly, the satellite temperature data show the Earth hasn’t warmed since 1998.

Vencore Inc. is a company that has worked closely with a number of government agencies on weather-related projects, including NASA, NOAA, Naval Meteorological and Oceanographic Command, the Naval Postgraduate School and the Intelligence Community. It is now suggesting that the extreme lack of sunspot activity may be an indication of a major cooling period for the Earth.

Not since cycle 14 peaked in February 1906 has there been a solar cycle with fewer sunspots. We are currently more than six years into Solar Cycle 24 and the current nearly blank sun may signal the end of the solar maximum phase. Solar cycle 24 began after an unusually deep solar minimum that lasted from 2007 to 2009 which included more spotless days on the sun compared to any minimum in almost a century.

It’s not just the smaller number of sunspots… it’s the pattern of their peaks:

The smoothed sunspot number for solar cycle 24 reached a peak of 81.9 in April 2014 and it is looking increasingly likely that this spike will be considered to be the solar maximum for this cycle. This second peak in the cycle surpassed the level of an earlier peak that reached 66.9 in February 2012. Many solar cycles are double peaked; however, this is the first one in which the second peak in sunspot number was larger than the first peak. Going back to 1755, there have been only a few solar cycles in the previous 23 that have had a lower number of sunspots during its maximum phase.
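
For those curious, the “smoothed sunspot number” quoted above is conventionally a 13-month running mean with the two end months given half weight. Here is a sketch in Python; the weighting convention is my reading of the standard definition, and the monthly values are invented purely for illustration:

    # 13-month smoothed sunspot number (standard tapered mean; assumed convention)
    def smoothed_sunspot_number(monthly, center):
        """Running mean over 13 months centred on `center`, end months half-weighted."""
        w = monthly[center - 6 : center + 7]          # 13 consecutive monthly means
        return (w[0] / 2 + sum(w[1:12]) + w[12] / 2) / 12

    monthly_means = [60, 65, 70, 72, 75, 80, 85, 82, 81, 79, 78, 76, 74]  # made up
    print(round(smoothed_sunspot_number(monthly_means, 6), 1))            # 75.8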

Now that doesn’t mean it’s definitely staying that way… but chances are it will. And here is where it gets interesting:

It is pretty well understood that solar activity has a direct impact on temperatures at very high altitudes in a part of the Earth’s atmosphere called the thermosphere. This is the biggest layer of the Earth’s atmosphere which lies directly above the mesosphere and below the exosphere. Thermospheric temperatures increase with altitude due to absorption of highly energetic solar radiation and are highly dependent on solar activity.

Finally, if history is a guide, it is safe to say that weak solar activity for a prolonged period of time can have a cooling impact on global temperatures in the troposphere which is the bottom-most layer of Earth’s atmosphere – and where we all live.

Vencore’s prediction substantiates a paper written by Russian scientists in 2013, who used sunspot activity to predict that we are heading for a “Mini Ice Age.”

The German Herald reported on March 31, 2013 regarding Russian scientist Dr Habibullo Abdussamatov from the St. Petersburg Pulkovo Astronomical Observatory, “Talking to German media the scientist who first made his prediction in 2005 said that after studying sunspots and their relationship with climate change on Earth, we are now on an ‘unavoidable advance towards a deep temperature drop.’”

There is a simple reason the scientists who created the “ground breaking study” ignored solar activity: it would disprove their hypothesis. Like many scientists trying to push the global warming/climate change hypothesis, these scientists have jumped the shark… er, sunspots.

SOURCE  





Ross McKitrick: Wind Power Subsidies Triple Power Prices in Ontario

One of the favourite smoke-and-mirrors lines pulled by the wind industry, its parasites and spruikers is that wind power lowers power prices.

Among the ‘tiny’ little omissions in that pitch are that:

1) they’re only ever talking about spot prices when the wind is blowing; and

2) they skate over the massive subsidies that get tacked on top of the price paid by retailers for the power delivered; and

3) they run a mile from the unnecessary cost of base-load plants holding additional ‘spinning reserve’ and the insane and otherwise unnecessary cost of running highly inefficient Open Cycle Gas Turbines, that are critical to keep a grid up and running when wind power output collapses on a total and totally unpredictable basis.

That little trick lasts about as long as it takes Joe the Power Punter to open his power bill, because all of the above is helpfully collected in the staggering retail cost, as a bottom line that jumps off the page with heart-shuddering reality – crushing households and killing business, growth and employment.

The rocketing bills being dropped on power consumers in Ontario reflect the price paid for the most bizarre energy policy on the Planet – a power tax called the GA, or Global Adjustment levy, used to subsidise wind power. Here’s what it costs and why.

You may be surprised to learn that electricity is now cheaper to generate in Ontario than it has been for decades. The wholesale price, called the Hourly Ontario Electricity Price or HOEP, used to bounce around between five and eight cents per kilowatt hour (kWh), but over the last decade, thanks in large part to the shale gas revolution, it has trended down to below three cents, and on a typical day is now as low as two cents per kWh. Good news, right?

It would be, except that this is Ontario. A hidden tax on Ontario’s electricity has pushed the actual purchase price in the opposite direction, to the highest it’s ever been. The tax, called the Global Adjustment (GA), is levied on electricity purchases to cover a massive provincial slush fund for green energy, conservation programs, nuclear plant repairs and other central planning boondoggles. As these spending commitments soar, so does the GA.

In the latter part of the last decade when the HOEP was around five cents per kWh and the government had not yet begun tinkering, the GA was negligible, so it hardly affected the price. In 2009, when the Green Energy Act kicked in with massive revenue guarantees for wind and solar generators, the GA jumped to about 3.5 cents per kWh, and has been trending up since — now it is regularly above 9.5 cents. In April it even topped 11 cents, triple the average HOEP.

So while the marginal production cost for generation is the lowest in decades, electricity bills have never been higher. And the way the system is structured, costs will keep rising.

The province signed long-term contracts with a handful of lucky firms, guaranteeing them 13.5 cents per kWh for electricity produced from wind, and even more from solar. Obviously, if the wholesale price is around 2.5 cents, and the wind turbines are guaranteed 13.5 cents, someone has to kick in 11 cents to make up the difference. That’s where the GA comes in. The more the wind blows, and the more turbines get built, the bigger the losses and the higher the GA.
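
In other words, the Global Adjustment per kilowatt hour is essentially the guaranteed contract price minus the market price. A quick sanity check in Python using the article’s own figures (an illustration only, not the official settlement methodology):

    # The gap the GA has to cover, using the article's figures (cents per kWh)
    GUARANTEED_WIND = 13.5     # contracted price for wind generation
    hoep = 2.5                 # typical wholesale market price today

    print(GUARANTEED_WIND - hoep)    # 11.0 cents/kWh topped up via the GA
    print(GUARANTEED_WIND - 2.0)     # 11.5 cents/kWh if the HOEP falls to 2 cents

Note that the top-up grows as the market price falls, which is exactly the perverse dynamic described next.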

Just to make the story more exquisitely painful, if the HOEP goes down further, for instance through technological innovation, power rates won’t go down. A drop in the HOEP widens the gap between the market price and the wind farm’s guaranteed price, which means the GA has to go up to cover the losses.

Ontario’s policy disaster goes many layers further. If people conserve power and demand drops, the GA per kWh goes up, so if everyone tries to save money by cutting usage, the price will just increase, defeating the effort. Nor do Ontarians benefit through exports. Because the renewables sector is guaranteed the sale, Ontario often ends up exporting surplus power at a loss.

The story only gets worse if you try to find any benefits from all this spending. Ontario doesn’t get more electricity than before, it gets less.

Despite the hype, all this tinkering produced no special environmental benefits. The province said it needed to close its coal-fired power plants to reduce air pollution. But prior to 2005, these plants were responsible for less than two per cent of annual fine particulate emissions in Ontario, about the same as meat packing plants, and far less than construction or agriculture.

Moreover, engineering studies showed that improvements in air quality equivalent to shutting the plants down could be obtained by simply completing the pollution control retrofit then underway, and at a fraction of the cost. Greenhouse gas emissions could have been netted to zero by purchasing carbon credits on the open market, again at a fraction of the cost. The environmental benefits exist only in provincial propaganda.

SOURCE  





Wind Power Obsession Sends South Australians Back to the Stone Age

Amidst the panic and chaos being experienced by the wind industry, its parasites and spruikers – due to the unfolding and inevitable wind power calamity in South Australia – one of the newly invented catchphrases is “transition”.

It’s a term now employed by wind spinners, dimwitted politicians and gullible journalists; and is often coupled with lines such as “interconnectors”, “rapidly improving battery technology” and “gas”. Gas, apparently, is now seen as a “transition” fuel to a … ahem … fossil-fuel-free future, and the interconnectors proposed would connect to coal-fired plant currently chugging away in Victoria and New South Wales [note to Ed: is this ‘pure irony’?]

Last time we took a peek at the climate-calamatists’ websites, gas was right up there with coal as the source of all peril and evil on earth, so we’re not sure that the Chicken Littles will buy the line about gas being anything other than a ‘spawn-of-the-Devil’ fossil fuel.

And adding ‘fuel’ to the fire, the gas destined for this “transition” isn’t going to be used in highly efficient Combined Cycle plants, but squandered in gas-thirsty and highly inefficient Open Cycle plants that emit 3-4 times the CO2 per MWh of a modern coal-fired plant.

Open Cycle Gas Turbines (OCGTs) are literally jet engines, run on gas or fuel oil (diesel) or kerosene. The initial capital outlay is low, but their operating costs are exorbitant – depending on fuel input costs (the gas dispatch price varies with demand, for example), operators need to recoup upwards of $300-400 per MWh before they will even contemplate firing them into action. For a wrap-up on “fast-start peakers”, see this paper: Peaker-Case-Histories. As to the insane cost of running them, see this article: OPEN GAS CYCLE TURBINES: Between a rock and a hard place.
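
The fuel arithmetic shows why. A one-line sketch in Python (both numbers are hypothetical, chosen to show how a demand-driven spike in the gas dispatch price lands in the cited band):

    # Marginal fuel cost of an open-cycle gas turbine (hypothetical figures)
    heat_rate = 11.0          # GJ of gas burned per MWh generated; OCGTs are thirsty
    gas_spike = 30.0          # $/GJ when demand spikes and gas is scarce

    print(heat_rate * gas_spike)   # $330/MWh on fuel alone, before capital and maintenance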

And the line about “transitioning” to a wind powered future with “rapidly improving battery technology” comes sprinkled with a fair dose of pixie dust: nowhere in the world is there an example of grid-scale electricity storage using batteries (of any description); not in Germany; not in Spain; not in Denmark; not in California; not in South Australia – or anywhere else stupid enough to attempt to run on sunshine and breezes.

Now that the mainstream press have caught up with the energy disaster that is South Australia, journos are, for the first time in their lives, starting to grapple with the tricky concept of electricity generation: terms such as “load following”; “frequency control”; and “grid balancing” are starting to find their way into the pages of the Australian Financial Review and The Australian.

These aren’t just fancy nouns and verbs of recent invention, they go right to the heart of whether customers at the thinnest end of an electricity grid get to enjoy electricity on demand, or at all.

What media hacks are starting to understand is that there is a world of difference between the quality of electricity produced by conventional generation sources and that thrown occasionally into the grid by a wholly weather-dependent source, abandoned centuries ago for pretty obvious reasons – e.g., SA’s wind farms’ efforts in April.

It’s not just a question of delivering power when and where it’s needed; frequency control is a matter that determines whether a grid functions at all (see our post here).

Where the chaos and intermittency of wind power destabilises the grid (see our post here), it’s down to conventional generation sources that can ramp up output at the press of a button to keep the grid alive: “reactive power” that allows for the 50Hz frequency of the grid to be controlled and maintained around close tolerances.

In a place like South Australia, where wind power capacity tops 40% of its entire generating capacity, every time a breeze turns to a zephyr, voltage and frequency drop, requiring an instantaneous response from coal or gas-fired generators (hydro is exceptionally good at responding in an instant). With recent efforts to rely on the chaotic delivery of wind power, those selling power for frequency control and load following now recoup a very solid premium for their service.

Remove that class of generator from the system and the wind cultist and his fellow travelers are soon left tossing chaff about the wonders of wind, while sitting freezing in the dark.

SOURCE
