Icing The Hype
Aug 28, 2012
Sea Ice News - other sources show no record low Arctic ice extent

By Anthony Watts

See this comprehensive post on Arctic ice disappearance in the mid-1900s.

Earlier today, in Part 1, I posted about the new record low claimed by NSIDC: Sea Ice News - Volume 3 Number 11, Part 1: new Arctic satellite extent record. The number given is 4.1 million square kilometers:

image

That, of course, is being trumpeted far and wide; new life has been given to Mark Serreze’s “Arctic death spiral” in the media. But here’s a curiosity: another NSIDC product, the new and improved “multi-sensor” MASIE product, shows no record low at 4.7 million square kilometers:

image

Note the label at the bottom of the image in red. NSIDC doesn’t often mention this product in their press releases. They most certainly didn’t mention it today.

Another product, NOAA’s National Ice Center Interactive Multisensor Snow and Ice Mapping System (IMS) plot, also shows no reason for claiming a record at all:

image

Their number (for 8/22) is 5.1 million square kilometers. (NOTE: NSIDC’s Dr. Walt Meier points out in comments that IMS and MASIE use the same base data, but that this IMS product updates only weekly, unlike all the other sea ice plots, which are daily. They should be in sync on the next update cycle, but right now MASIE and IMS should both be at 4.7 million sq km. -A)

Another curiosity: on the NATICE interactive maps-on-demand page (click on Arctic Daily in the pulldown menu), the numbers they give for 80% ice and marginal ice add up to an extent of 6,149,305 square kilometers.

So who to believe? It depends on the method, and on who thinks their method is most representative of reality. Measuring sea ice via satellite, especially with a single passive-sensor system that has been shown in the past to have degradation problems and outright failures (which I was told weren’t worth mentioning until they discovered I was right and pulled the plug), may be a case of putting all your eggs in one basket. I suspect that at some point we’ll see a new basket that isn’t so worn, but for now the old basket provides comfort for those who relish new records, even though those records may be virtual.
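For readers who want a feel for why the products disagree, here is a minimal sketch (with made-up numbers, not any agency’s data) of how a concentration cutoff turns a gridded ice-concentration field into an “extent” figure. The 15% cutoff matches the passive-microwave convention Dr. Meier describes below; the looser cutoff stands in for operational charts that also count sparse, marginal ice.

```python
import numpy as np

# Hypothetical grid: per-cell ice concentration (0-1) and per-cell area
# in square kilometers. Real products use satellite concentration fields
# on polar stereographic grids.
rng = np.random.default_rng(0)
concentration = rng.uniform(0.0, 1.0, size=1000)  # fraction of each cell covered by ice
cell_area_km2 = np.full(1000, 625.0)              # e.g. 25 km x 25 km cells

def extent_km2(conc, area, threshold):
    """Sea ice extent: total area of all cells at or above the cutoff.

    Passive-microwave extent uses a 15% concentration cutoff; operational
    charts may count sparser 'marginal' ice, which is one reason totals
    differ between products built from the same ocean.
    """
    return float(area[conc >= threshold].sum())

print(extent_km2(concentration, cell_area_km2, 0.15))  # 15% cutoff
print(extent_km2(concentration, cell_area_km2, 0.05))  # looser cutoff counts more ice
```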

Note that we don’t see media pronouncements like “death spiral” and “the Arctic is screaming” from NOAA’s NATICE center the way we do from NSIDC’s activist director, Mark Serreze. So I tend to take NSIDC’s number with a grain of salt, particularly since they have not actively embraced the new IMS system when it comes to reporting totals. Clearly NSIDC knows the value of the media attention when they announce new lows, and director Serreze clearly knows how to make hay from it.

But this raises the question: why not move to the new system, as NOAA’s National Ice Center has done? Well, it is a lot like our July temperature records. We have a shiny new state-of-the-art Climate Reference Network that gives a national average for July lower than the old USHCN network with all of its problems, yet NCDC doesn’t tell you about the July numbers that come from it. Those tasks were left to Dr. Roy Spencer and myself.

In fairness though, I asked Dr. Walt Meier of NSIDC what he thought about MASIE, and this is what he wrote to me today:

It can provide better detail, particularly in some regions, e.g., the Northwest Passage.

However, it’s not as useful for looking at trends or year-to-year variations because it is produced from imagery of varying quantity and quality. So the analyses done in 2007 have different imagery sources than this year. And imagery varies even day to day. If skies are clear, MODIS can be used; if it’s cloudy then MODIS is not useful. Another thing is that the imagery is then manually analyzed by ice analysts, so there is some subjectivity in the analysis - it may depend on the amount of time an analyst has in a given day.

Our data is from passive microwave imagery. It is not affected by clouds, it obtains complete data every day (except when there may be a sensor issue), and it uses only consistent, automated processes. So we have much more confidence in comparing different days, years, etc. in our passive microwave data than is possible using MASIE.

Finally, MASIE’s mandate is to try to produce the best estimate they can of where there is any sea ice. So they may include even very low concentrations of ice (<15%). In looking at visible imagery from MODIS, in the few cloud-free regions, there does appear to be some small concentration of ice where MASIE is mapping ice and our satellite data is not detecting ice. This is ice that is very sparse, likely quite thin. So it will probably melt out completely in the next week or two.

MASIE has tended to lag behind our data and then it catches up as the sparse ice that they map disappears. This year the difference between the two is a bit larger than we've seen in other years, because there is a larger area of sparse ice.

You can thank the big Arctic storm of August 4th-8th for that dispersal.

"The Great Arctic Cyclone of 2012" effect on Arctic sea ice is seen in this before and after image.

image

Figure 4. These maps of sea ice concentration from the Special Sensor Microwave Imager/Sounder (SSMIS) passive microwave sensor highlight the very rapid loss of ice in the western Arctic (northwest of Alaska) during the strong Arctic storm. Magenta and purple colors indicate ice concentration near 100%; yellow, green, and pale blue indicate 60% to 20% ice concentration. Credit: National Snow and Ice Data Center, courtesy IUP Bremen.

Trends vs. records, just like July temperatures. One system might be better at trends; another might be better at the absolute values used to determine records. In this case we have three other respected methods that show absolute values higher than those of NSIDC’s older method, in which they have high confidence. I suppose these systems are like children. In a competition, you always root for your children over the children of the other parents, so it is no surprise that NSIDC would root for their own well-known media-star “child” over NATICE’s IMS and NSIDC’s own lesser-known child, MASIE.

Oh, and then there’s Antarctica, that other neglected ice child nobody talks about, with its above normal ice amounts right now.

No matter what though, it’s all just quibbling over little more than 30 years of satellite data, and it is important to remember that. It is also important to remember that MASIE wasn’t around during the last record low in 2007, and IMS was barely out of beta testing in 2006. As measurement systems improve, we should include them in the discussion.

UPDATE: Andrew Revkin reports on the issue in his Dot Earth article. He’s a bit skeptical of the sound-bite hype coming from NSIDC, writing:


That’s one reason that, even with today’s announcement that the sea ice reached a new low extent for the satellite era, I wouldn’t bet that “the Arctic is all but certain to be virtually ice free within two decades,” as some have proposed. I’d say fifty/fifty odds, at best... But is this a situation that is appropriately described as a “death spiral”? Not by my standards.

Revkin also takes Al Gore to task on Twitter:

image

Help him out: retweet this.

UPDATE 2: Commenter Ron C. provides this useful information in comments, which helps explain some of the differences and issues:

The main point is that NIC works with images, while the others are microwave products.

“Polar orbiting satellites are the only source of a complete look at the polar areas of the earth, since their orbits cross near the poles approximately every two hours with 12 to 13 orbits a day of useful visible data. This visible imagery can then be analyzed to detect the snow and ice fields and the difference in reflectivity of the snow and ice. By analyzing these areas each day, areas of cloud cover over a particular area of snow and ice can be kept to a minimum to allow a cloud free look at these regions. This chart can then be useful as a measure of the extent of snow and ice for any day during the year and it can also be compared to previous years for climatic studies.”

“NIC charts are produced through the analyses of available in situ, remote sensing, and model data sources. They are generated primarily for mission planning and safety of navigation. NIC charts generally show more ice than do passive microwave derived sea ice concentrations, particularly in the summer when passive microwave algorithms tend to underestimate ice concentration. The record of sea ice concentration from the NIC series is believed to be more accurate than that from passive microwave sensors, especially from the mid-1990s on (see references at the end of this documentation), but it lacks the consistency of some passive microwave time series.” http://nsidc.org/data/g02172.html

Some have analyzed the underestimation by microwave products.

“We compare the ice chart data to ice concentrations from the NASA Team algorithm which, along with the Bootstrap algorithm [Comiso, 1995], has proved to be perhaps the most popular used for generating ice concentrations [Cavalieri et al.,1997]. We find a baseline difference in integrated ice concentration coverage north of 45N of 3.85% plus or minus 0.73% during November to May (ice chart concentrations are larger). In summer, the difference between the two sources of data rises to a maximum of 23% peaking in early August, equivalent to ice coverage the size of Greenland.”

From: Late twentieth century Northern Hemisphere sea-ice record from U.S. National Ice Center ice charts, Partington, Flynn, Lamb, Bertoia, and Dedrick

The differences are even greater for Canadian regions.

“More than 1380 regional Canadian weekly sea-ice charts for four Canadian regions and 839 hemispheric U.S. weekly sea-ice charts from 1979 to 1996 are compared with passive microwave sea-ice concentration estimates using the National Aeronautics and Space Administration (NASA) Team algorithm. Compared with the Canadian regional ice charts, the NASA Team algorithm underestimates the total ice-covered area by 20.4% to 33.5% during ice melt in the summer and by 7.6% to 43.5% during ice growth in the late fall.”

From: The Use of Operational Ice Charts for Evaluating Passive Microwave Ice Concentration Data, Agnew and Howell


Aug 23, 2012
Manmade Contribution to Global Warming is NOT a Planetary Emergency

by Vaclav Klaus (President, Czech Republic) August 2012

Many thanks for the invitation to attend your conference and to speak here. I appreciate that a mere politician, a former economist, has been invited to address this well-known gathering of highly respected scientists. If I understand it correctly, this year’s seminar is devoted to the discussion of the role of science and of “planetary emergencies”.

To the first topic, I want to say very clearly that I don’t see a special role for science which would be different from doing science. I have, of course, in mind “normal science”, not a “post-normal science” whose ambitions are very often connected with political activism. The role of scientists is not in speculating on the probabilities of events that cannot be directly measured and tested, nor in promoting a pseudo-scientific “precautionary principle”, nor in engaging in activities which are the proper function not of scientists but of risk managers.

To the second topic, I have to say that as a conservatively-minded person, I am unaware of any forthcoming “planetary emergency”, with the exception of those potential situations which would be the consequences of human failures - of human fanaticism, of false pride, and of lack of modesty. But these are problems of political systems and of ideologies.

This brings me to the topic of my speech. I will try to argue that current as well as realistically foreseeable global warming, and especially Man’s contribution to it, is not a planetary emergency which should bother us.

I am not a climatologist, but the IPCC and its leading spokespersons are not climatologists either. I am content to be a consumer of climatology and its related scientific disciplines. In this respect, I am located, in the economic jargon, on the demand side of climatology, not on the supply side.

There are many distinguished scientists here, and some of them are on the other side. I have no intention to break into their fields of study. By expressing my doubts about a simple causal relationship between human CO2 emissions and climate, I do not have the slightest ambition to support one or another competing scientific hypothesis concerning the factors leading to global warming (or eventually cooling).

Nevertheless, my reading both of the available data and of conflicting scientific arguments and theories allows me to argue that it is not global warming caused by human activity that is threatening us.

My views about this issue have been expressed in a number of speeches and articles in the last couple of years all over the world. The book “Blue Planet in Green Shackles” has already been published in 18 languages, last month even in Indonesian. The subtitle of the book asks, “What is Endangered: Climate or Freedom?” The real problem is not climate or global warming, but the Global Warming Doctrine and its consequences. They may eventually bring us close to a real planetary emergency. Absolutely unnecessarily, without any connection with global temperature.

This doctrine, as a set of beliefs, is an ideology, if not a religion. It lives independently of the science of climatology. Its disputes are not about temperature, but are part of the “conflict of ideologies”. Temperature is used and misused in these disputes. The politicians, the media and the public - misled by the very aggressive propaganda produced by the adherents of the global warming doctrine - do not see this. It is our task to help them to distinguish between what is science and what is ideology.

Believers in the global warming doctrine have not yet presented its authoritative text, its manifesto. One of the reasons is that no one wants to be explicitly connected with it. Another is that to put such a text together would be difficult, because this doctrine is not a monolithic concept which can be easily summarized. Its subject-matter does not belong to any single science. It presents itself as a flexible, rather inconsistent, loosely connected cascade of arguments, which is why it has quite successfully escaped the scrutiny of science. It comfortably dwells in the easy and self-protecting world of false interdisciplinarity, which is really a non-disciplinarity, an absence of discipline.

My reading of this new incarnation of environmentalism can be summarized in the following way:

1. It starts with the claim that there is an undisputed and undisputable, empirically confirmed, statistically significant, global, not local, warming;

2. It continues with the argument that the time series of global temperature exhibit a growing trend which dominates their cyclical and random components. This trend is supposed to be non-linear, perhaps exponential;

3. This trend is declared to be dangerous for the people (in the eyes of “soft” environmentalists) or for the planet (by “deep” environmentalists);

4. This temperature growth is postulated as a solely or chiefly man-made phenomenon attributable to growing emissions of CO2 from industrial activity and the use of fossil fuels;

5. The sensitivity of global temperature to even small variations in CO2 concentration in the atmosphere is supposed to be very high;

6. Exponents of the global warming doctrine promise us a solution: the ongoing temperature increase can be reversed by a radical reduction in CO2 emissions;

7. They also know how to bring about their solution: they want to organize emissions reduction by means of the institutions of “global governance”. They forget to tell us that this is not possible without undermining democracy, the independence of individual countries, human freedom, economic prosperity and a chance to eliminate poverty in the world;

8. They rely on the undefined and undefinable “precautionary principle”. Cost-benefit analysis is not relevant to them.


Aug 14, 2012
125 years later, wind power still needs a subsidy

Thomas Pyle, Washington Examiner

image

The Production Tax Credit (PTC) for wind energy is fast becoming the zombie of taxpayer nightmares. Every time you think this special-interest giveaway is dead, Sen. Chuck Grassley, R-Iowa, and his alliance of subsidy-hunting policymakers conduct a legislative seance and conjure it from the great beyond.

Just before recessing for the month of August, the Senate Finance Committee approved a plan extending tax incentives for wind and other renewables. Smarting from his recent challenge by a Tea Party-backed insurgent, Utah Sen. Orrin Hatch seemed quick to forget its lesson as he supported the PTC extension. Now that he’s safe, he’s free to go back to his big-spending ways for another six years.

Congress has been supporting wind production since at least 1978 on the premise that wind is an infant industry that needs just a few more years of mother’s milk - i.e. taxpayer handouts - to be cost-competitive with more affordable and reliable sources of energy.

But wind has to be one of the oldest infant industries on the planet. In 1882, Thomas Edison built the Pearl Street Station in New York City - a coal-fired power plant. A mere five years later, a Scottish academic named James Blyth built a wind turbine to make electricity and run the lights in his cabin. After 125 years of generating electricity, you would think that wind would be ready to stand on its own without special favors from the federal government, but apparently it is not.

Wind proponents have been telling us since at least the early 1980s that wind is almost cost-competitive with coal and natural gas. The American Wind Energy Association asserts that wind is “cost-competitive with virtually all other new electricity generation sources.” If so, why are wind proponents still asking for help through the Production Tax Credit?

The reason is plain. The PTC is a large portion of the wholesale price of electricity, giving wind producers the incentive to produce electricity even when they have to pay the electrical grid to take the power they generate.

Specifically, the PTC is a credit of 2.2 cents per kilowatt-hour of electricity produced from wind (and other specified sources). The wholesale price of electricity ranges from less than 3 cents per kilowatt-hour in some markets to about 4.5 cents per kilowatt-hour. This makes the PTC worth roughly 50 to 70 percent of the wholesale price of electricity.
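A quick back-of-envelope check of that claim, using only the figures quoted above:

```python
# Share of the wholesale power price covered by the PTC, using the
# article's figures (all in cents per kilowatt-hour).
ptc_credit = 2.2
wholesale_low, wholesale_high = 3.0, 4.5  # quoted wholesale price range

share_at_high_price = ptc_credit / wholesale_high  # ~49%
share_at_low_price = ptc_credit / wholesale_low    # ~73%
print(f"PTC covers {share_at_high_price:.0%} to {share_at_low_price:.0%} of the wholesale price")
```

That works out to about 49 to 73 percent, consistent with the 50 to 70 percent figure.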

The law as currently written provides the two things wind proponents claim they want - certainty and a phase-out of the tax credit. As the law stands, the tax credit ends at the end of the year. That is both a certain outcome and a phase-out.

Wind has a long history and it continues to be expensive, inefficient, and unsustainable. It’s about time that Congress ends the Production Tax Credit once and for all.

Thomas Pyle is president of the Institute for Energy Research.

According to the Joint Committee on Taxation (JCT), between 1992 and 2010, the cumulative cost of the PTC was approximately $7.9 billion. In the 2011-2015 budget window, the PTC is estimated to cost American taxpayers another $9.1 billion, of which about 75% will be claimed by the wind industry. These costs are in addition to the anticipated $22.6 billion in direct cash outlays under the Section 1603 grant program, which expired in 2011.


Aug 08, 2012
A New Analysis of U.S. Temperature Trends Since 1943

By Dr. Roy Spencer

August 6th, 2012

With all of the hoopla over recent temperatures, I decided to see how far back in time I could extend my U.S. surface temperature analysis based upon the NOAA archive of Integrated Surface Hourly (ISH) data.

The main difference between this dataset and the others you hear about is that those datasets’ trends are usually based upon daily maximum and minimum temperatures (Tmax and Tmin), which have the longest record of observation. Unfortunately, one major issue with those datasets is that the time of day at which the maximum or minimum temperature is recorded makes a difference, due to a double-counting effect. Since the time of observation of Tmax and Tmin has varied over the years, this potentially large effect must be adjusted for, however imperfectly.

Here I will show U.S. temperature trends since 1943 based upon 4x-per-day observations, always made at the same synoptic times: 00, 06, 12, and 18 UTC. This ends up including only about 50 stations, roughly evenly distributed throughout the U.S., but I thought it would be a worthwhile exercise nonetheless. Years before 1943 simply did not have enough stations reporting, and it wasn’t until World War II that routine weather observations started being made on a more regular and widespread basis.

The following plot shows monthly temperature departures from the 70-year (1943-2012) average, along with a 4th order polynomial fit to the data, and it supports the view that the 1960s and 1970s were unusually cool, with warmer conditions existing in the 1940s and 1950s (click for large version):

image
Enlarged
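As a rough illustration of the two steps behind the plot - departures from the 70-year mean, then a 4th-order polynomial fit - here is a minimal numpy sketch; monthly_temps is a hypothetical stand-in for the ~50-station ISH averages, not Dr. Spencer's actual data.

```python
import numpy as np

# Hypothetical stand-in for the ~50-station monthly averages (deg. C).
months = np.arange(1943, 2012.5, 1 / 12.0)
monthly_temps = np.random.default_rng(1).normal(10.0, 1.0, months.size)

# Departures from the 70-year (1943-2012) average.
anomalies = monthly_temps - monthly_temps.mean()

# 4th-order polynomial fit, as drawn through the data in the figure.
t = months - months.mean()            # centre the time axis for a stable fit
coeffs = np.polyfit(t, anomalies, 4)
fitted = np.polyval(coeffs, t)
```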

It’s too bad that only a handful of the stations extend back into the 1930s, which nearly everyone agrees were warmer in the U.S. than the ’40s and ’50s.

What About Urban Heat Island Effects?

Now, the above results have no adjustments made for possible Urban Heat Island (UHI) effects, something Anthony Watts has been spearheading a re-investigation of. But what we can do is plot the individual station temperature trends for these ~50 stations against the population density at the station location as of the year 2000, along with a simple linear regression line fit to the data:

image
Enlarged

It is fairly obvious that there is an Urban Heat Island effect in the data which went into the first plot above, with the most populous stations generally showing the most warming, and the lowest population locations showing the least warming (or even cooling) since 1943. For those statisticians out there, the standard error of the calculated regression slope is 29% of the slope value.
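For those who want to reproduce that kind of diagnostic, a sketch of the slope-and-standard-error calculation is below; pop_density and trend are hypothetical stand-ins for the ~50 station values in the scatter plot.

```python
import numpy as np
from scipy.stats import linregress

# Hypothetical stand-ins for the ~50 stations in the scatter plot.
rng = np.random.default_rng(2)
pop_density = rng.uniform(1.0, 1000.0, 50)  # persons per sq km (year 2000)
trend = 0.05 + 2e-4 * pop_density + rng.normal(0.0, 0.05, 50)  # deg. C/decade

fit = linregress(pop_density, trend)
# The "29%" figure quoted above is this ratio: the standard error of the
# regression slope divided by the slope itself.
print(fit.slope, fit.stderr, fit.stderr / fit.slope)
```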

So, returning to the first plot above, it is entirely possible that the early part of the record was just as warm as recent years, if UHI adjustments were made.

Unfortunately, it is not obvious how to make such adjustments accurately. It must be remembered that the 2nd plot above only shows the relative UHI warming of higher population stations compared to the lower population stations, and previous studies have suggested that even the lower population stations experience warming as well. In fact, published studies have shown that most of the spurious UHI warming is observed early in population growth, with less warming as population grows even larger.

Again, what is different about the above dataset is that it is based upon temperature observations made 4x/day, always at the same times, so there is no issue with a changing time of observation, as there is with the use of Tmax and Tmin data.

Of course, all of this is preliminary, and not ready for peer review. But it is interesting.

U.S. Surface Temperature Update for July, 2012: +1.11 deg. C
August 6th, 2012

The U.S. lower-48 surface temperature anomaly from my population density-adjusted (PDAT) dataset was 1.11 deg. C above the 1973-2012 average for July 2012, with a 1973-2012 linear warming trend of +0.145 deg. C/decade (click for full-size version):

image
Enlarged
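The trend figure is an ordinary least-squares slope scaled to a decade - a minimal sketch, where anomaly is a stand-in series, not the actual PDAT data:

```python
import numpy as np

# Stand-in for the 1973-2012 monthly anomaly series (deg. C).
years = np.arange(1973, 2012.5, 1 / 12.0)
anomaly = np.random.default_rng(3).normal(0.0, 0.5, years.size)

# Linear trend in deg. C per year, reported per decade as in the text.
slope_per_year, intercept = np.polyfit(years, anomaly, 1)
print(f"{slope_per_year * 10:+.3f} deg. C/decade")
```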

I could not compute the corresponding USHCN anomaly this month because it appears the last 4 years of data in the file are missing (here). Someone please correct me if I am mistaken.

Note that the 12-month period ending in July 2012 is also the warmest 12-month period in the 40-year record. I cannot compare these statistics to the (possibly warmer) 1930s because for the most part only max and min temperatures were reported back then, and my analysis depends upon 4x/day observations at specific synoptic reporting times.

There is also no guarantee that my method for UHI adjustment since 1973 has done a sufficient job of removing UHI effects. A short description of the final procedure I settled on for population density adjustment of the surface temperatures can be found here.


Aug 05, 2012
Which Causes which out of Atmospheric Temperature and CO2 content?

By Ray Tomes, Cycles Research Institute

Over very long periods of time as ice ages come and go, it has been found that temperature leads atmospheric CO2 content by about 800 years. This seems to contradict the IPCC and other views that CO2 causes change in temperature. But we are looking at very different time scales with present changes, so perhaps things happen differently. I decided to examine this question.

The temperature data used are the monthly GHCN global land-ocean temperatures, available from NOAA. The atmospheric CO2 data used are from Mauna Loa in Hawaii, the longest continuous record of CO2, also available monthly.

When trying to determine causation between two series that are both increasing over time, it is best to look at the rates of change of the variables, as these show clearly which one precedes the other. This first graph shows the rates of change of these two variables monthly over the period 1958 to 2009.

image
Rate of change of atmospheric CO2 content and land-ocean temperature (enlarged)

Both monthly series were processed in the same way. The change over a 12-month period was calculated, and a 12-month simple moving average of these values was taken to remove all seasonal effects. Each point was plotted at the centre of the 23-month window of values used in its calculation. Because the treatment was the same for both variables, they are directly comparable.
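In pandas terms, the processing described above amounts to a 12-month difference followed by a centred 12-month moving average - a minimal sketch, where s stands in for either monthly series:

```python
import pandas as pd

# 's' stands in for either monthly series (Mauna Loa CO2 or GHCN temperature).
s = pd.Series(
    range(120),
    index=pd.date_range("1958-01", periods=120, freq="MS"),
    dtype=float,
)

change_12mo = s.diff(12)                                # change over 12 months
smoothed = change_12mo.rolling(12, center=True).mean()  # 12-month moving average
# Each smoothed value draws on a roughly 23-month span of the original
# series, which is why the article plots it at the centre of that window.
```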

It can be seen that there is generally a good correlation, with nearly all peaks in one variable having similar peaks in the other. When one has a smaller peak, such as around 1975, then so does the other. When one has a larger peak, around 1973 or 1998, then so does the other. There are one or two minor variations from this.

It is also evident that the red temperature graph generally precedes the black CO2 graph on turning points.  This suggests that temperature drives CO2 and not the other way around. A comparison of the two series at different lags gives this second graph.

image
Correlation between rate of change of global temperature and rate of change of atmospheric CO2 content (enlarged)

When the two series are coincident the correlation is quite small, r=0.13, whereas when the temperature change from 6 months earlier is compared to the CO2 change there is a maximum correlation of r=0.42, which is high for short-period changes with a high noise content. There is no high correlation for any lag in which CO2 precedes temperature, the best being r=0.15 at 42 months.
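The lag comparison behind the second graph can be sketched as a simple lagged-correlation scan; temp_change and co2_change stand in for the smoothed rate-of-change arrays described above:

```python
import numpy as np

def lag_correlation(temp_change, co2_change, lag):
    """Correlation when temperature change leads CO2 change by 'lag' months.

    Positive 'lag' pairs temperature at month t with CO2 at month t+lag;
    negative 'lag' lets CO2 lead instead.
    """
    if lag > 0:
        a, b = temp_change[:-lag], co2_change[lag:]
    elif lag < 0:
        a, b = temp_change[-lag:], co2_change[:lag]
    else:
        a, b = temp_change, co2_change
    return np.corrcoef(a, b)[0, 1]

# Scanning lags from -48 to +48 months traces out a curve like the second
# graph; the location of the peak shows which series leads, e.g.:
# best = max(range(-48, 49), key=lambda k: lag_correlation(t, c, k))
```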

It seems that, contrary to popular wisdom, temperature changes are driving atmospheric CO2 content changes, with a lag time of 6 months.


Aug 02, 2012
Inhofe Exposes Another Epic Fail by Global Warming Alarmists

EPW Minority

Katie Brown Katie_Brown@epw.senate.gov (202) 224-2160
Inhofe Exposes Another Epic Fail by Global Warming Alarmists

image
Photo Posted by KFOR and Think Progress

The dumpster fire that caused the melting lights

Link to Think Progress Blog Post

Link to Watts Up With That: Alarmist fact checking - street lights don’t melt at 115F

Link to Press Release

Washington, D.C. - Today the far-left blog Think Progress posted a photo (originally posted on KFOR’s Facebook page) of street lights in Oklahoma that had melted, they claimed, because of extreme heat. Global warming alarmist Bill McKibben took to Twitter immediately to publicize what he believed to be proof of global warming, tweeting to Senator James Inhofe (Okla.), Ranking Member of the Senate Committee on Environment and Public Works, “Senator Inhofe, God may be trying to get your attention. Check out this picture.”

Not long after the picture surfaced, Oklahomans posted comments on Think Progress’ blog saying that these lights had melted due to a fire - which makes sense considering that the two front lights were melted while the two back lights remained unscathed. Once this news came to light, Think Progress immediately removed the post and provided an update that reads: “After we published this piece, we saw reports from people on the ground in Stillwater that the melting streetlights were due to a nearby fire. The person who took the photo, Patrick Hunter, described the scene: ‘Being the person that actually took this photo, I’d say that this was due to a fire semi-close by coupled with the unbelievable heat we are experiencing.’ Still an amazing photo and not fake as many are saying on here. Enjoy!”

This afternoon, KFOR confirmed that the melted lights in the photo were not caused by hot temperatures but a nearby dumpster fire.

“Poor Bill McKibben - he’s been trying to get something to melt for ages but it keeps backfiring,” Senator Inhofe said.  “These alarmists never learn their lesson.  Remember Bill McKibben was the one who was going to melt a giant ice sculpture in the shape of the word “hoax” on the national mall, but his group had to cancel because there wasn’t enough interest.  Now, after proclaiming that street lights in Oklahoma are melting because of global warming, we have confirmation that a fire caused this scene. 

“Amid the resurgence of hysteria from my friends on the left, I appreciated climatologist Dr. John Christy who testified this week before the Environment and Public Works committee saying that instead of proclaiming this summer is ‘what global warming looks like’ it is ‘scientifically more accurate to say that this is what Mother Nature looks like, since events even worse than these have happened in the past before greenhouse gases were increasing like they are today.’

“This isn’t the first time alarmists have tried these stunts and it certainly won’t be the last - when will they finally realize they’ve lost this debate?”


Aug 01, 2012
Team Obama fines oil companies for not using fantasy fuel

By Deroy Murdock

It’s been said the difference between fact and fiction is that fiction has to make sense. Nowhere is this more true than in the realm of US EPA regulation.
This article by Deroy Murdock discusses EPA’s truly bizarre policy of fining the oil companies for not using a biofuel that does not exist. You cannot possibly invent this stuff.

Thank you for posting it, quoting from it, and forwarding it to your friends and colleagues.
Paul Driessen

Team Obama fines oil companies for not using fantasy fuel

There is no such thing as “cellulosic” ethanol, but EPA fines companies for not using it

Why does America’s economy feel like an SUV that is running on fumes? The Obama Administration’s laughably rigid enforcement of a Baby Bush-era ethanol mandate typifies today’s regulatory climate. When Uncle Sam governs with a tire iron in his hand, U.S. companies wisely pull off the road and pray for new management.

The Environmental Protection Agency has slapped a $6.8 million penalty on oil refiners for not blending cellulosic ethanol into gasoline, jet fuel and other products. These dastardly petroleum mongers are being so intransigent because cellulosic ethanol does not exist. It remains a fantasy fuel. The EPA might as well mandate that Exxon hire Leprechauns.

As a screen shot of EPA’s renewable fuels website confirms, so far this year - just as in 2011 - the supply of cellulosic biofuel in gallons totals zero.

“EPA’s decision is arbitrary and capricious. We fail to understand how EPA can maintain a requirement to purchase a type of fuel that simply doesn’t exist,” stated Charles Drevna, president of American Fuel & Petrochemical Manufacturers (AFPM), the Washington-based trade association that represents the oil refining and petrochemicals industries.

“We’ll fund additional research in cutting-edge methods of producing ethanol, not just from corn but from wood chips and stalks or switch grass,” President G.W. Bush said in his 2006 State of the Union address. “Our goal is to make this new kind of ethanol practical and competitive within six years.”
So, in 2007, Bush idiotically signed the Energy Independence and Security Act. Beyond prohibiting Thomas Edison’s ground-breaking incandescent light bulb by 2014, EISA’s Renewable Fuel Standard mandated cellulosic ethanol.

Under the RFS, refiners had to blend 6.6 million gallons of cellulosic ethanol in 2011. Although this substance is not extant, EPA then demanded to see 31 percent more of it. This year’s quota is 8.65 million gallons.

EPA still expects cellulosic ethanol to leap magically from test tubes into storage tanks. While EPA has lowered its original targets as each year rolls around (for example, 2012’s “volumetric requirements” were originally set at 500 million gallons), its compulsory quantities remain enormous for the next 10 years, as the following table shows. 

Year   Mandated volume (billion gallons)   Increase over prior year (%)
2013   1.00                                --
2014   1.75                                75.0
2015   3.00                                71.4
2016   4.25                                41.7
2017   5.50                                29.4
2018   7.00                                27.3
2019   8.50                                21.4
2020   10.50                               23.5
2021   13.50                               28.6
2022   16.00                               18.5
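The third column is each year's mandated volume expressed as a percentage increase over the prior year's; a quick check of the table's arithmetic:

```python
# Mandated cellulosic volumes (billions of gallons) from the table above.
volumes = {2013: 1.00, 2014: 1.75, 2015: 3.00, 2016: 4.25, 2017: 5.50,
           2018: 7.00, 2019: 8.50, 2020: 10.50, 2021: 13.50, 2022: 16.00}

years = sorted(volumes)
for prev, cur in zip(years, years[1:]):
    pct_increase = (volumes[cur] / volumes[prev] - 1) * 100
    print(cur, f"{pct_increase:.1f}%")  # 2014 -> 75.0%, ..., 2022 -> 18.5%
```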

Presidents Bush and Obama have pumped some $1.5 billion in grants and guarantees into converting cellulosic ethanol from dream into reality. As Thomas Pyle of the Institute for Energy Research reports, Team Obama handed a $105 million loan guarantee to POET, “the world’s largest ethanol producer,” to create cellulosic fuel. Last September, Abengoa Energy scored a $134 million loan to build a Kansas cellulosic factory. Last August, Obama gave the Navy $510 million to develop biofuels for the U.S. armed forces.

Way back in 2010, some 70 percent of fantasy fuel was supposed to spring from Cello Energy in Alabama. Unfortunately, in 2009, a jury determined that Cello falsified its production capacity. Cello went silent in October 2010 when it filed for bankruptcy.

The National Academy of Sciences announced last year that “currently, no commercially viable bio-refineries exist for converting cellulosic biomass to fuel.” NAS further predicted that in 2022, EPA’s mandated cellulosic supplies will not materialize “unless innovative technologies are developed that unexpectedly improve the cellulosic biofuels production process.” In other words, if you don’t build it, they will not come.

A Wall Street Journal editorial perfectly encapsulated this fine mess.

“Congress subsidized a product that didn’t exist, mandated its purchase though it still didn’t exist, is punishing oil companies for not buying the product that doesn’t exist, and is now doubling down on the subsidies in the hope that someday it might exist.”

The oil refiners absorbed all of this and chose, at first, to play nice. AFPM and the American Petroleum Institute petitioned EPA in February 2011 and again on January 20, 2012 - the second time joined by the Western States Petroleum Association. As the administration gave labor unions and entire states waivers from ObamaCare, the refiners asked for waivers from the RFS mandate.

Fully 15 months after the first petition and four months beyond the second, EPA administrator Lisa Jackson finally rejected the refiners’ appeals, reaffirming that they must obey this regulation - never mind that they more easily could defy gravity. “We thank you for your interest in these issues,” Jackson’s May 22 letter cheerily added.

Thus, on June 11, AFPM and WSPA sued EPA in DC Circuit Court. The plaintiffs hope a federal judge will blend some sanity into a scenario that resembles the work of Salvador Dali.

Rather than focus on expanding operations and creating jobs, lawful American companies now must spend money to sue the federal government for relief from unobservable rules. This fact demonstrates how bone-headed and bull-headed Washington has become. Even worse, business people beyond the oil industry watch this charade and wonder when the regulatory tumbrels will roll by for them.

“This doesn’t help. On the margin, this spooks business people,” says economist Tom Landstreet, founder and CEO of Standard Research Corporation, a Nashville-based investment analysis firm. The former colleague of supply-side legend Arthur Laffer adds: “This is part of a pervasive cluelessness about the economy and markets. It’s just one of a thousand cuts.”

Company owners these days ask themselves, “How does one become a favored industry or business under this administration, versus being one that is vilified and demonized?” says AFPM’s Drevna. If a particular enterprise is “not part of that inner circle,” he continues, “they might go to Singapore or somewhere with a more business-friendly atmosphere.”

Drevna applauds recent comments by Governor Bobby Jindal (R-Louisiana). “I suspect that many in the Obama Administration don’t really believe in private enterprise. At best, they see business as something to be endured so that it can provide tax money for government programs,” Jindal wrote in a June 14 RedState.com op-ed. “The problem is that the private sector is so foreign to our President that he would need a passport to go there and a translator to understand what is happening.”

Washington’s unyielding, heavy-handed, and nonsensical behavior nonetheless may obscure a sliver of silver lining. The Bush-Obama Administration indeed has invented a hybrid fuel: cellulosic ethanol is one half industrial policy and one half comedy routine.

New York commentator Deroy Murdock is a nationally syndicated columnist with the Scripps Howard News Service and a media fellow with the Hoover Institution on War, Revolution and Peace at Stanford University. This article originally appeared on National Review Online, in July 2012.

image
Screen shot (enlarged) of EPA summary of RINs and volume of a fuel that exists only in its wildest dreams.


Jul 31, 2012
Cooler Heads Coalition action alert: Senate Finance Committee moving to extend the wind PTC

By Myron Ebell

We have just learned that the Senate Finance Committee may mark up a bill on Thursday that includes a one-year extension of the wind production tax credit.  A more detailed alert will follow with action items after we have learned more.  Note that the Romney campaign yesterday issued a statement opposing an extension of the wind PTC.  I have pasted that statement below.  Also note that the Finance Committee includes several Republicans who support wind subsidies.  I have pasted all members of the committee below. 

Official statement from Romney Campaign:

“President Obama’s promise to ‘easily’ create 5 million green energy jobs has become a particularly depressing punchline amidst the endless disappointments of the last four years. The President spent $90 billion in taxpayer stimulus dollars, some of which went to his donors and political allies or was sent to create jobs overseas instead of here in America.  Now we have American wind and solar energy sectors that combine to produce only one percent of our energy - and our wind industry has actually lost 10,000 jobs.”

“The President may believe that his economic plan ‘worked’ and that America wants to repeat the experience for another four years, but the facts don’t back that up.  Mitt Romney believes it is a time for a new approach to ensure our nation’s energy independence. He will allow the wind credit to expire, end the stimulus boondoggles, and create a level playing field on which all sources of energy can compete on their merits. Wind energy will thrive wherever it is economically competitive, and wherever private sector competitors with far more experience than the President believe the investment will produce results.”

Baucus, Max (MT) , Chairman
Rockefeller, John D. (WV)
Conrad, Kent (ND)
Bingaman, Jeff (NM)
Kerry, John F. (MA)
Wyden, Ron (OR)
Schumer, Charles E. (NY)
Stabenow, Debbie (MI)
Cantwell, Maria (WA)
Nelson, Bill (FL)
Menendez, Robert (NJ)
Carper, Thomas R. (DE)
Cardin, Benjamin L. (MD)

Hatch, Orrin G. (UT), Ranking Member
Grassley, Chuck (IA)
Snowe, Olympia J. (ME)
Kyl, Jon (AZ)
Crapo, Mike (ID)
Roberts, Pat (KS)
Enzi, Michael B. (WY)
Cornyn, John (TX)
Coburn, Tom (OK)
Thune, John (SD)
Burr, Richard (NC)

Contact the committee and tell them to stop wasteful spending.

