By Marlo Lewis, Pajamas Media
On Thursday, Sen. Lisa Murkowski (R-AK), ranking member of the Senate Energy and Natural Resources Committee, introduced a resolution of disapproval, under the Congressional Review Act (CRA), to overturn EPA’s endangerment finding (the agency’s official determination that greenhouse gas emissions endanger public health and welfare). Murkowski’s floor statement and a press release are available here.
The resolution has 38 co-sponsors, including three Democrats (Blanche Lincoln of Arkansas, Ben Nelson of Nebraska, and Mary Landrieu of Louisiana). If all 41 Senate Republicans vote for the measure, Sen. Murkowski will need only seven additional Democrats to vote “yes” to obtain the 51 votes required for passage. (Under Senate rules, a CRA resolution of disapproval cannot be filibustered and thus does not need 60 votes to ensure passage.)
Murkowski’s resolution of disapproval is a gutsy action intended to safeguard the U.S. economy, government’s accountability to the American people, and the separation of powers under the Constitution. Naturally, Sen. Barbara Boxer and other apostles of Gorethodoxy denounce it as an assault on the Clean Air Act, public health, science, and “the children.”
Rubbish!
At a press conference she organized on Thursday, Boxer employed an old rhetorical trick - when you can’t criticize your opponent’s proposal on the merits, liken it to something else that is plainly odious and indefensible. She said:
Imagine if in the 1980s the Senate had overturned the health finding that nicotine in cigarettes causes lung cancer. How many more people would have died already? Imagine if a senator got the votes to come to the floor to overturn the finding that lead in paint damages children’s brain development? How many children and families would have suffered? Imagine if the senator had come down to the floor and said, you know, I don’t think black lung disease is in any way connected to coal dust. Imagine!
Note that all the outrages Boxer is describing are imaginary. Murkowski is not proposing to question the link between cigarette smoke and lung cancer, etc. More to the point, she is not questioning the linkage between greenhouse gas emissions and climate change. Nor is she questioning the validity of EPA’s endangerment finding (even though there are strong scientific reasons for doing so). In fact, Sen. Murkowski supports legislation to control greenhouse gas emissions (her floor statement and legislative record leave no doubt on these points).
What Murkowski opposes is EPA dealing itself into a position to control the U.S. economy without “any input” from the people’s elected representatives. The endangerment finding compels EPA to regulate carbon dioxide (CO2) from new motor vehicles, which in turn obligates EPA to apply Clean Air Act pre-construction and operating permit requirements to millions of small businesses. The endangerment finding also establishes a precedent for economy-wide regulation of greenhouse gases under the National Ambient Air Quality Standards (NAAQS) program.
The Murkowski resolution addresses a basic conflict of interest that Sen. Boxer prefers to sweep under the rug. Under the Clean Air Act, the agency that makes the findings that trigger regulatory action is the same agency that does the regulating. Since regulatory agencies exist to regulate, they have a vested interest in reaching “scientific” conclusions that expand the scope and scale of their power.
Up to now, this ethically flawed situation has been tolerable because Congress has clearly specified the types of substances over which EPA has regulatory authority - those that degrade air quality, those that pose acute risks of toxicity, or those that deplete the ozone layer. But when Congress enacted and amended the Clean Air Act, it never intended for EPA to control greenhouse gases for climate change purposes.
Yes, it is possible, by torturing the text of the Clean Air Act as the Supreme Court did in Massachusetts v. EPA, to infer congressional authority for greenhouse gas regulation. But the fact remains that Congress did not design the Clean Air Act to be a framework for climate policy, has never voted for the Act to be used as such a framework, and has never signed off on the regulatory cascade that EPA’s endangerment finding, if allowed to stand, will ineluctably trigger.
According to the Washington Post, Boxer stated that if the public has to wait for Congress to pass legislation to control greenhouse gas emissions, “that might not happen, in a year or two, or five or six or eight or 10.” Yes, but that’s democracy. And the democratic process is more valuable than any result that EPA might obtain by doing an end run around it.
Since the Progressive Era, our country has increasingly lived under a constitutionally dubious system of regulation without representation. Regulations have the force and effect of law, and many function as implicit taxes. Article I of the Constitution vests all legislative powers, such as the power to tax, in Congress. For decades, however, Congress has enacted statutes that delegate legislative power to agencies that are not accountable to the people at the ballot box. Constitutionally, the only saving grace is that the regulations implement policies clearly authorized in the controlling statute.
But the regulatory cascade that will ensue from EPA’s endangerment finding has no clear congressional authorization. Indeed, regulations emanating from the endangerment finding are likely to be more costly and intrusive than any climate bill Congress has considered and either rejected or failed to pass.
We are on the brink of an era of runaway regulation without representation. Sen. Boxer complains that the Murkowski resolution is “unprecedented.” But that is only fitting, because the resolution addresses an unprecedented threat to our system of self-government. Read post and comments here.
By Marc Sheppard, American Thinker
Not surprisingly, the blatant corruption exposed at Britain’s premier climate institute was not contained within the nation’s borders. Just months after the Climategate scandal broke, a new study has uncovered compelling evidence that our government’s principal climate centers have also been manipulating worldwide temperature data in order to fraudulently advance the global warming political agenda.
Not only does the preliminary report [PDF] indict a broader network of conspirators, it challenges the very mechanism by which global temperatures are measured, published, and historically ranked.
Last Thursday, Certified Consulting Meteorologist Joseph D’Aleo and computer expert E. Michael Smith appeared together on KUSI TV [Video] to discuss the Climategate—American Style scandal they had discovered. This time out, the alleged perpetrators are the National Oceanic and Atmospheric Administration (NOAA) and the NASA Goddard Institute for Space Studies (GISS).
NOAA stands accused by the two researchers of strategically deleting cherry-picked cooler-reporting weather observation stations from the temperature data it provides the world through its National Climatic Data Center (NCDC). D’Aleo explained to show host and Weather Channel founder John Coleman that while the Hadley Center in the U.K. has been the subject of recent scrutiny:
“We think NOAA is complicit if not the real ground zero for the issue”
And their primary accomplices are the scientists at GISS, who put the altered data through an even more biased regimen of alterations, including intentionally replacing the dropped NOAA readings with those of stations located in much warmer locales.
As you’ll soon see, the ultimate effects of these statistical transgressions on the reports which influence climate alarm, and subsequently world energy policy, are nothing short of staggering.
NOAA - Data In / Garbage Out
Although satellite temperature measurements have been available since 1978, most global temperature analyses still rely on data captured from land-based thermometers, scattered more-or-less about the planet. It is that data which NOAA receives and disseminates - although not before performing some sleight-of-hand on it.
Smith has done much of the heavy lifting involved in analyzing the NOAA/GISS data and software, and he chronicles his often frustrating experiences at his fascinating website. There, detail-seekers will find plenty of material, divided into easily navigated sections—some designed specifically for us “geeks,” but most readily approachable by readers at all technical strata.
Perhaps the key point discovered by Smith was that by 1990, NOAA had deleted from its datasets all but 1500 of the 6000 thermometers in service around the globe. Now, 75% represents quite a drop in sampling population, particularly considering that these stations provide the readings used to compile both the Global Historical Climatology Network (GHCN) and United States Historical Climatology Network (USHCN) datasets. The same datasets, incidentally, which serve as primary sources of temperature data not only for climate researchers and universities worldwide, but also for the many international agencies using the data to create analytical temperature anomaly maps and charts.
Yet, as disturbing as the number of dropped stations was, it was the nature of NOAA’s “selection bias” that Smith found infinitely more troubling. It seems that stations placed in historically cooler, rural areas of higher latitude and elevation were scrapped from the data series in favor of more urban locales at lower latitudes and elevations. Consequently, post-1990 readings have been biased to the warm side not only by selective geographic location, but also the anthropogenic heating influence of a phenomenon known as the Urban Heat Island Effect (UHI).
For example, Canada’s reporting stations dropped from 496 in 1989 to 44 in 1991, with the percentage of stations at lower elevations tripling while the number of those at higher elevations dropped to 1. That’s right: as Smith wrote in his blog, they left “one thermometer for everything north of LAT 65.” And that one resides in a place called Eureka, which has been described as “The Garden Spot of the Arctic” due to its unusually moderate summers.
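Since the piece turns on a bit of arithmetic, a toy sketch may help. The station names and temperatures below are entirely hypothetical; the point is only the mechanism being alleged: dropping cooler stations from a raw network mean raises that mean even though no individual station warmed. (Anomaly-based methods, which compare each station to its own baseline, are designed to blunt exactly this effect.)

```python
# Toy illustration of station-dropout selection bias (all values hypothetical).
# A raw network mean shows the effect starkly; per-station anomaly methods
# mitigate it by comparing each station only against its own history.

# (station name, typical annual mean temperature in degrees C)
stations = [
    ("arctic_rural_high", -15.0),
    ("subarctic_rural", -5.0),
    ("temperate_rural", 8.0),
    ("temperate_urban", 11.0),
    ("subtropical_urban", 21.0),
]

def network_mean(stns):
    """Simple average over whatever stations remain in the network."""
    return sum(temp for _, temp in stns) / len(stns)

full = network_mean(stations)                            # all five stations
kept = network_mean([s for s in stations if s[1] > 0])   # cold stations dropped

print(f"full network mean: {full:.1f} C")                # 4.0 C
print(f"after dropping cold stations: {kept:.1f} C")     # 13.3 C
```

No thermometer reading changed, yet the network mean jumped by more than 9 °C — which is why the choice of which stations to retain matters so much.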
GISS - Garbage In / Globaloney Out
The scientists at NASA’s GISS are widely considered to be the world’s leading researchers into atmospheric and climate changes. And their Surface Temperature (GISTemp) analysis system is undoubtedly the premier source for global surface temperature anomaly reports.
In creating its widely disseminated maps and charts, the program merges station readings collected from the Scientific Committee on Antarctic Research (SCAR) with GHCN and USHCN data from NOAA. It then puts the merged data through a few “adjustments” of its own. First, it further “homogenizes” stations, supposedly adjusting for UHI by, according to NASA, changing “the long term trend of any non-rural station to match the long term trend of their rural neighbors, while retaining the short term monthly and annual variations.” Of course, the reduced number of stations will have the same effect on GISS’s UHI correction as it did on NOAA’s discontinuity homogenization - the creation of artificial warming. In fact, throughout the entire grid, cooler station data are dropped and “filled in” by temperatures extrapolated from warmer stations in a manner obviously designed to overestimate warming.
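NASA’s quoted description of the trend-matching adjustment can be sketched numerically. Everything below is a simplified illustration with synthetic data, not GISTemp’s actual code: fit and remove the non-rural station’s long-term linear trend, keep its short-term residuals, and impose the rural neighbors’ trend instead.

```python
# Simplified sketch of the described UHI homogenization step, using
# synthetic annual anomaly series (degrees C). Not GISTemp's actual code.
import numpy as np

years = np.arange(1980, 2010)
rng = np.random.default_rng(0)

# Hypothetical series: rural neighbors warm slowly, the urban station
# warms faster (e.g. due to urban heat island contamination).
rural = 0.01 * (years - years[0]) + rng.normal(0, 0.1, years.size)
urban = 0.03 * (years - years[0]) + rng.normal(0, 0.1, years.size)

r_slope = np.polyfit(years, rural, 1)[0]          # rural long-term trend
u_slope, u_int = np.polyfit(years, urban, 1)       # urban long-term trend

# Keep the urban station's short-term variations (residuals about its own
# fitted trend), but replace the long-term trend with the rural slope.
urban_residual = urban - (u_slope * years + u_int)
urban_adjusted = urban.mean() + r_slope * (years - years.mean()) + urban_residual

adj_slope = np.polyfit(years, urban_adjusted, 1)[0]
print(f"rural trend: {r_slope:.4f} C/yr, adjusted urban trend: {adj_slope:.4f} C/yr")
```

By construction the adjusted station now carries the rural slope, while its year-to-year wiggles are untouched — which is what the quoted NASA description claims the method does. The article’s complaint is that with few genuinely rural stations left, the “rural” reference trend itself may be contaminated.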
Government and Intergovernmental Agencies—Globaloney In / Green Gospel Out
Smith attributes up to 3F (more in some places) of added “warming trend” between NOAA’s data adjustment and GISTemp processing. That’s over twice last century’s reported warming. And yet, not only are NOAA’s bogus data accepted as green gospel - so are its equally bogus hysterical claims, like this one from the 2006 annual State of the Climate in 2005 report [PDF]:
“Globally averaged mean annual air temperature in 2005 slightly exceeded the previous record heat of 1998, making 2005 the warmest year on record.”
And as D’Aleo points out in the preliminary report, the recent NOAA proclamation that June 2009 was the second warmest June in 130 years will go down in the history books, despite multiple satellite assessments ranking it as the 15th coldest in 31 years.
Even when our own National Weather Service (NWS) makes its frequent announcements that a certain month or year was the hottest ever, or that five of the warmest years on record occurred last decade, they’re basing such hyperbole entirely on NOAA’s warm-biased data. And how can anyone possibly read GISS chief James Hansen’s Sunday claim that 2009 was tied with 2007 for second warmest year overall and the Southern Hemisphere’s absolute warmest in 130 years of global instrumental temperature records without laughing hysterically? Especially considering that NOAA had just released a statement claiming that very same year - 2009—to be tied with 2006 for the fifth warmest year on record.
So, how do alarmists reconcile one government center reporting 2009 as tied for second while another had it tied for fifth? If you’re WaPo’s Andrew Freedman, you simply chalk it up to “different data analysis methods” before adjudicating both NASA and NOAA innocent of any impropriety based solely on their pointless assertions that they didn’t do it.
Earth to Andrew: “Different data analysis methods?” Try replacing “analysis” with “manipulation” and ye shall find enlightenment. More importantly, has the obvious point somehow eluded you that, since the drastically divergent results of the two “methods” cannot both be right, both are immediately suspect?
But by far the most significant impact of this data fraud is that it ultimately bubbles up to the pages of the climate alarmists’ bible: The United Nations Intergovernmental Panel on Climate Change Assessment Report. And wrong data begets wrong reports which—particularly in this case—begets dreadfully wrong policy. Read much more here.
-----------------------
After Copenhagen, Banks and Investors Pulling Out of Carbon Markets
UK Guardian
Banks and investors are pulling out of the carbon market after the failure to make progress at Copenhagen on reaching new emissions targets after 2012.
Carbon financiers have already begun leaving banks in London because of the lack of activity and the drop-off in investment demand. The Guardian has been told that backers have this month pulled out of a large planned clean-energy project in the developing world because of the expected fall in emissions credits after 2012.
Anthony Hobley, partner and global head of climate change and carbon finance at law firm Norton Rose, said: “People will gradually start to leave carbon desks, we are beginning to see that already. We are seeing a freeze in banks’ recruitment plans for the carbon market. It’s not clear at what point this will turn into a cull or a rout.”
Paul Kelly, chief executive of EcoSecurities, which develops clean energy projects, said that while markets had not expected a definitive post-Kyoto agreement:
“The lack of regulatory certainty in the post 2012 world affects the market’s view of what CERs [carbon credits from clean energy projects] will be worth and subsequently will constrain financing for projects. If you had an agreement at Copenhagen with a bit more detail, people would be more willing to take risk.”
After two weeks of exhausting talks, world leaders delivered an agreement in Copenhagen that left campaigners disappointed as it failed to commit rich and poor countries to any greenhouse gas emission reductions.
Banks had been scaling back their plans to invest in carbon markets before Copenhagen. Fewer new clean energy projects need to be financed as, because of the recession, there are fewer global emissions to offset. The price of carbon credits has also fallen, while plans to introduce national trading schemes, particularly in the US and Australia, remain uncertain. Full story here.
By Roger Pielke Sr., Climate Science Weblog
There is a post on the Nature website Climate Feedback by Olive Heffernan titled “AMS2010: Data gaps and errors may have masked warming”.
This is a remarkable post in that it fails to properly assess all of the data sources for climate system heat changes. Excerpts from the post read:
“New analyses provide preliminary evidence that temperature data from the UK Met office may under-estimate recent warming. That’s the conclusion of a talk given here today by Chris Folland of the Met Office Hadley Centre. Folland says that there is a very good chance that there has been more warming over land and over the ocean in the past decade than suggested by conventional data sets, but he says that the issues with land and ocean data are entirely unrelated.”
“For land, the problem of underestimating warming stems from data gaps in the average monthly temperature data set of the Met Office Hadley Centre, known as HadCRUT3. Temperatures over the past decade were recently re-analyzed using a European climate model by Adrian Simmons of the European Centre for Medium-Range Weather Forecasts in Reading, UK and colleagues, and are soon to be published in the Journal of Geophysical Research [subscription]. Simmons and colleagues compared air temperature and humidity data collected over the past decade by the Hadley Centre with re-analyzed data for the same period. Average warming over land was larger for the fully sampled re-analyzed data than for the HadCRUT3 temperature data. The difference between the data sets is particularly notable for northeast Canada, Greenland and northern parts of Asia, areas which are warming particularly rapidly.”
If the land surface temperatures were actually warmer than have been sampled, this results in even more divergence between the surface temperature and lower tropospheric temperature trends which we quantified in
Klotzbach, P.J., R.A. Pielke Sr., R.A. Pielke Jr., J.R. Christy, and R.T. McNider, 2009: An alternative explanation for differential temperature trends at the surface and in the lower troposphere. J. Geophys. Res., 114, D21102, doi:10.1029/2009JD011841.
Chris Folland also ignored the unresolved issues and systematic biases that we identified in our paper:
Pielke Sr., R.A., C. Davey, D. Niyogi, S. Fall, J. Steinweg-Woods, K. Hubbard, X. Lin, M. Cai, Y.-K. Lim, H. Li, J. Nielsen-Gammon, K. Gallo, R. Hale, R. Mahmood, S. Foster, R.T. McNider, and P. Blanken, 2007: Unresolved issues with the assessment of multi-decadal global land surface temperature trends. J. Geophys. Res., 112, D24S08, doi:10.1029/2006JD008229.
The Heffernan weblog post further writes
“For the ocean data, it’s a different issue. John Kennedy of the Met Office and colleagues previously reported in Nature [subscription] that changes in the methods used to collect sea surface temperature (SST) data at the end of World War II caused problems in comparing pre- and post-war data. Now they have a new analysis (yet to be published) suggesting that smaller changes in data collection methods since the end of the war could also be significant.
Over the past 20 years, the primary source of SST data has changed from ships to ocean buoys. Because ships warm the water during data collection, there has been a drop in recorded SSTs since buoys, which are more accurate, became the main data source. So what could appear to be a relative cooling trend in SSTs over the past decade may actually just be due to changes in errors in the data. Scientists are confident that the buoy data are more accurate because they compare favourably with reliable satellite data.”
The upper ocean heat data shows no appreciable warming in the upper ocean since at least 2005 (and perhaps since 2003) as I discussed in my paper:
Pielke Sr., R.A., 2008: A broader view of the role of humans in the climate system. Physics Today, 61, Vol. 11, 54-55.
The satellite-monitored surface temperatures similarly show a lack of warming over this time period; the current global sea surface temperature trends can be viewed at the GISS website (see), where for the period 2003 to 2009, on the annual average, there is even a negative trend for some latitude bands (see) [see also the land and ocean temperature changes figure in the section “Annual Mean Temperature Change for Land and Ocean” here, where a divergence between the land and ocean data trends in the last 10 years is quite distinct].
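For readers wondering what such a short-window trend computation involves, here is a minimal sketch. The anomaly values are hypothetical, not GISS data; the point is only that an ordinary least-squares slope over just seven annual points can easily come out negative for a latitude band even when the multi-decadal trend is positive.

```python
# Minimal sketch of a 2003-2009 trend computation: an ordinary least-squares
# slope over seven annual-mean anomalies. Values are hypothetical.
import numpy as np

years = np.array([2003, 2004, 2005, 2006, 2007, 2008, 2009])
# hypothetical annual SST anomalies (degrees C) for one latitude band
anom = np.array([0.42, 0.39, 0.45, 0.40, 0.36, 0.31, 0.38])

slope_per_year = np.polyfit(years, anom, 1)[0]
print(f"2003-2009 trend: {slope_per_year * 10:+.3f} C/decade")  # -0.132 C/decade
```

A seven-point fit like this is dominated by year-to-year noise, which is why a flat or negative short-window slope neither confirms nor refutes the longer-term record — the very caveat Pielke makes in the next paragraph.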
While the sign of the trend from 2003 to 2009 does not by itself refute longer-term global warming (which could, of course, recommence), statements by Chris Folland and John Kennedy that can easily be shown to conflict with even a cursory examination of the data will result in a dismissal of their conclusions by objective climate scientists. See post here.
Icecap Note: Folland admits their data is in error but remarkably reaches the conclusion that it errs on the cool side when re-analyzed using a European climate model. To the alarmists, climate models are the truth, and data, if it doesn’t agree, must be made to conform to them. Count on GHCN V3, coming this year, to show more warming, as NOAA has produced in USHCNv2 in the blink charts here and here.
