Recent revelations from the Climategate emails, originating from the Climatic Research Unit at the University of East Anglia, showed how all the data centers, most notably NOAA and NASA, conspired in the manipulation of global temperature records to suggest that temperatures in the 20th century rose faster than they actually did.
This has inspired climate researchers worldwide to take a hard look at the data on offer, comparing it to the original records and to other data sources. This report compiles some of the initial alarming findings.
There has clearly been some cyclical warming in recent decades, most notably from 1979 to 1998. However, the global surface-station data is seriously compromised. First, there was a major station dropout, and an increase in missing data in the stations that remained, which occurred suddenly around 1990, about the time the global warming issue was being elevated to importance in political and environmental circles. A clear bias was found towards removing cooler higher-elevation, higher-latitude, and rural stations during this culling process, while leaving their data in the base periods from which ‘averages’ and anomalies are computed.
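To make the dropout concern concrete, here is a minimal sketch with entirely invented station values: when cooler stations stop reporting, a naive average of raw temperatures jumps upward even though no individual station warms at all. (Anomaly averaging against a fixed base period is supposed to guard against exactly this artifact; the contention in this report is that the culling and the missing data let bias leak through regardless.)

```python
import numpy as np

rng = np.random.default_rng(42)
years = np.arange(1950, 2011)

# Hypothetical network: 50 cool (high-latitude, high-elevation) stations and
# 50 warm (lower-latitude) stations, each flat over time except for noise.
cool = -5.0 + rng.normal(0, 0.5, (50, years.size))
warm = 15.0 + rng.normal(0, 0.5, (50, years.size))
temps = np.vstack([cool, warm])

# Cool stations stop reporting after 1990, mimicking the culling described above.
reporting = np.ones(temps.shape, dtype=bool)
reporting[:50, years > 1990] = False

naive = np.where(reporting, temps, np.nan)   # raw-temperature average, no anomalies
global_mean = np.nanmean(naive, axis=0)

print(round(global_mean[years <= 1990].mean(), 1))  # ~5.0 (all 100 stations present)
print(round(global_mean[years > 1990].mean(), 1))   # ~15.0: a spurious 10-degree jump
```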
The data also suffers contamination by urbanization and other local factors, such as land-use/land-cover changes and improper siting. There are also uncertainties in ocean temperatures, no small issue, as oceans cover 71% of the earth’s surface.
These factors all lead to significant uncertainty and a tendency to overestimate century-scale temperature trends. The conclusion from these findings is that the global databases are seriously flawed and can no longer be trusted to assess climate trends or rankings, or to validate model forecasts. Consequently, such surface data should be ignored for decision-making.
This SPPI-sponsored working paper by Joe D’Aleo of Icecap and Anthony Watts of Watts Up With That looks at the many issues associated with NOAA and NASA temperature data.
NOAA has already responded to the preliminary paper for John Coleman’s KUSI special through the Yale Climate Forum:
However, as Thomas Peterson and Russell Vose, the researchers who assembled much of GHCN, have explained:
“The reasons why the number of stations in GHCN drop off in recent years are because some of GHCN’s source datasets are retroactive data compilations (e.g., World Weather Records) and other data sources were created or exchanged years ago. Only three data sources are available in near-real time.
It’s common to think of temperature stations as modern Internet-linked operations that instantly report temperature readings to readily accessible databases, but that is not particularly accurate for stations outside of the United States and Western Europe. For many of the world’s stations, observations are still taken and recorded by hand, and assembling and digitizing records from thousands of stations worldwide is burdensome.
During that spike in station counts in the 1970s, those stations were not actively reporting to some central repository. Rather, those records were collected years and decades later through painstaking work by researchers. It is quite likely that, a decade or two from now, the number of stations available for the 1990s and 2000s will exceed the 6,000-station peak reached in the 1970s.”
So rest assured, we can make trillion-dollar decisions today based on imperfect, tainted data, then correct it in two decades once the missing data for the 1990s through 2010s have been painstakingly gathered. Our government at work.
See also SPPI’s December Monthly CO2 Report by Christopher Monckton here.
See this interesting analysis showing why interpolating over large distances to estimate temperatures is fraught with error here. See E.M. Smith’s “Temperatures now compared to maintained GHCN,” where he responds to NCDC claims that gathering data is hard, here.
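For a flavor of the distance problem, here is a toy sketch with invented coordinates and anomaly readings: an unsampled high-Arctic point is estimated by inverse-distance weighting of stations more than a thousand kilometers away. IDW here merely stands in for whatever scheme a given dataset actually uses; the point is that the "estimate" rests on a weighting choice, not on any nearby measurement.

```python
import math

def great_circle_km(lat1, lon1, lat2, lon2):
    """Haversine distance between two points on the earth, in kilometers."""
    r = 6371.0
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dp, dl = p2 - p1, math.radians(lon2 - lon1)
    a = math.sin(dp / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2
    return 2 * r * math.asin(math.sqrt(a))

def idw_anomaly(target, stations, power=1.0):
    """Inverse-distance-weighted anomaly at `target` from (lat, lon, anomaly) tuples."""
    num = den = 0.0
    for lat, lon, anom in stations:
        w = 1.0 / great_circle_km(target[0], target[1], lat, lon) ** power
        num += w * anom
        den += w
    return num / den

# Invented readings at three far-flung locations, each well over
# 1,000 km from the target point in the high Arctic.
stations = [(64.8, -147.7, 1.2), (69.7, 170.3, 0.8), (78.2, 15.6, 2.0)]
print(round(idw_anomaly((82.0, -40.0), stations), 2))
```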
-----------------------------------
NASA, NOAA create global warming trend with cooked data
By Kirk Myers, Seminole County Environmental Examiner
The cooks - er, “scientists” - at NASA’s Goddard Institute for Space Studies (GISS) have released their latest sky-is-falling temperature findings, and they show 2009 as the second-warmest year for the planet since modern record-keeping began in 1880, with 2009 temperatures in the Southern Hemisphere the warmest since 1880.
The head chefs at NOAA’s National Climatic Data Center (NCDC), working from a slightly different recipe, reveal a 2009 that was not as well done as NASA’s, ranking only the fifth-warmest since 1880.
Other not-so-well-done findings:
NASA cooks:
January 2000 to December 2009 was the warmest decade on record. Surface temperatures have been trending upward 0.38 degrees per decade during the last three decades (see the sketch after this list for where such per-decade figures come from). In total, average global temperatures have increased by about 1.4 degrees since 1880 (“an important number to keep in mind,” says NASA climatologist Gavin Schmidt).
NOAA chefs:
Global land and ocean annual surface temperatures for 2009 tied with 2006 as the fifth-warmest on record, at 1.01 degrees above the 20th century average. The 2000-2009 decade is the warmest on record, with an average global surface temperature 0.96 degrees above the 20th century average.
Land surface temperatures in 2009 tied with 2003 as the seventh-warmest on record, at 1.39 degrees above the 20th century average. The years 2001-2008 each rank among the 10 warmest years since 1880.
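For reference, a per-decade trend figure like the 0.38 degrees quoted above is ordinarily just an ordinary-least-squares slope fitted to annual values, as in this minimal sketch; the anomaly series below is invented for illustration and is not any agency’s actual data.

```python
import numpy as np

rng = np.random.default_rng(1)
years = np.arange(1980, 2010)

# Invented annual anomalies: a 0.03-degree/year rise plus weather noise.
anoms = 0.03 * (years - 1980) + rng.normal(0, 0.1, years.size)

slope_per_year = np.polyfit(years, anoms, 1)[0]   # ordinary least-squares fit
print(f"trend: {slope_per_year * 10:.2f} degrees per decade")
```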
How do they arrive at the numbers?
There is a major problem with the NASA and NOAA numbers, according to skeptical researchers who have dissected the data: they are inaccurate, the result of cherry-picking, computer manipulation, and “best guess” interpretation. The agency kitchens have concocted warming global temperatures using a hard-to-follow recipe of thinning reporting stations, grid-box interpolation, temperature homogenization, and algorithmic ingredients blended into a tweak-on-the-fly computer program.
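Of those ingredients, homogenization is the easiest to caricature in a few lines. The stylized sketch below, with all values invented, shows the basic idea: detect a step change in a station series relative to the mean of its neighbors, then shift the earlier segment to erase it. The operational algorithms (such as NCDC’s pairwise comparisons) are far more elaborate, which is precisely what makes their output so hard to audit.

```python
import numpy as np

rng = np.random.default_rng(7)
years = np.arange(1950, 2000)

# Five invented neighbor stations sharing the same regional climate signal.
neighbors = rng.normal(0, 0.1, (5, years.size))
reference = neighbors.mean(axis=0)

# Target station tracks its neighbors, except for a 0.6-degree step in 1975
# (e.g., a simulated station move or instrument change).
target = reference + rng.normal(0, 0.05, years.size)
target[years < 1975] -= 0.6

# Difference series against the neighbor reference isolates the step...
diff = target - reference
step = diff[years >= 1975].mean() - diff[years < 1975].mean()

# ...and the adjustment shifts the earlier segment to remove it.
adjusted = target.copy()
adjusted[years < 1975] += step
print(round(step, 2))   # ~0.6
```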
Veteran meteorologist Joe D’Aleo - a long-time critic of official global-warming statistics - says NASA and NOAA are manipulating the data, calling their actions the U.S. version of last year’s Climategate scandal. The Climategate brouhaha ensued when thousands of hacked (or whistleblower-leaked) e-mails from Britain’s Climatic Research Unit (CRU) at the University of East Anglia were uploaded to the Web last November and seen by millions of fascinated snoops. The revealing missives exposed scientific misconduct by top climate scientists and researchers, several of whom are now under investigation in Britain and the United States.
“The CRU has been ground zero for alleged scientific [mis]conduct, but other national weather centers, organizations, universities, and the U.S. global data centers at NOAA and NASA are complicit in the misrepresentation or manipulation of data to support the supposed [global warming] consensus,” says D’Aleo, who also heads ICECAP, the International Climate and Environmental Change Assessment Project.
What warming?
NASA and NOAA have several data-manipulation tricks in their global warming cookbook. But before eyeballing their recipe, here are a few inconvenient truths that failed to make their way into the NASA and NOAA press releases. According to D’Aleo: “Global temperatures peaked in the late 1990s, leveled off, and have been declining since 2001. All five official databases - NOAA’s NCDC, NASA GISS, Hadley CRU, the University of Alabama in Huntsville (UAH), and Remote Sensing Systems (RSS) - confirm the decline.”
Satellite data and land-ocean data sets are diverging. Last year, NOAA announced that June global temperatures ranked the second-warmest in 130 years, while the two satellite data sources, UAH and RSS, ranked the month’s temperatures the 15th and 14th coldest, respectively, in 31 years of record-keeping. In 2009, all regions of the United States were normal or below normal except for the Southwest and Florida, according to the NCDC.
The annual temperature in 2008 was the coolest since 1997, according to NOAA. Thirty-seven of 50 states set their all-time record temperatures in the decades prior to 1960.
According to the NCDC, 1936 experienced the hottest overall summer on record in the continental United States. In fact, out of 50 states, 24 recorded their all-time high temperature during the 1930s. The 2000s had the most benign weather, in terms of records (heat and cold), of any decade since the 1880s.
According to the Danish Meteorological Institute, Arctic temperatures are currently below minus 31.27 degrees, more than five degrees below normal and the lowest since 2004. Last year, Chicago experienced its coolest July 8 in 118 years, and only four days during the summer reached the 90s. Six states experienced their coldest July in 115 years of record-keeping, four their second-coldest, and two their third-coldest. October was the third-coldest and December the 14th-coldest in the United States in 115 years.
Global warming and cooling are cyclical. Data show it warmed from 1920 to 1940 and again from 1979 to 1998. But temperatures cooled from 1940 to the late 1970s, and have been cooling since 2001. Since World War II, CO2 has risen, even as temperatures have cooled, warmed and then cooled again, undermining the theory that CO2 is the single most important cause of climate change.
Not a single official computer model predicted the recent decline (since 2001) in global temperatures. Yet extended projections from the same models are referenced by eco-alarmists demanding draconian CO2-emission controls and the imposition of carbon taxes and cap-and-trade restrictions.
See more on the NOAA and NASA recipe and how they cook the books here. See the detailed SPPI paper here and E.M. Smith’s post here. See also John Coleman’s KUSI special covering this topic here and the interview with E.M. Smith here. See some answers to NOAA and NASA rebuttals here (fuller response coming today). Scroll down to see the point-by-point responses.