How independent is this project?
Would BEST have ever seriously published a study showing anything other than a scary warming trend?
This is emblematic of how fans of Climate Change Scares present their efforts with half-truths - lines that are technically “correct” but leave an impression that may be the opposite of the real situation.
Elizabeth Muller is listed as “Founder and Executive Director” of the Berkeley Earth Team along with her father Richard Muller. But since 2008 it appears she’s been earning money as a consultant telling governments how to implement green policies, how to reduce their carbon footprint and how to pick “the right technologies” - presumably meaning the right “Green” technologies.
Muller's daughter Elizabeth registered "GreenGov" in 2008
Richard and Elizabeth Muller. Image: Paul Sakuma/AP
She registered their website and tried to register the trademark herself.
“GreenGov is a service offered by Muller & Associates for Governments, International Organizations, non profits, and other organizations that work with Government. The aim is to provide politically-neutral counsel that is broad in scope while rooted in the hard facts of state-of-the-art science and engineering. The key is to make the right patch between the best technologies and the strengths of the government. We know that to be effective the political dimension must be integrated into the technical plan from the start.”
Muller and Associates helps investors profit from investments in alternative energy.
From her “speakers profile”:
“GreenGov provides interdisciplinary knowledge that helps clients determine the best technology for their specific need. Elizabeth has designed and implemented projects for public sector clients in the developed and in the developing world, helping them to build new policies and strategies for government reform and modernization, collaboration across government ministries and agencies, and strategies for the information society. She has developed numerous techniques for bringing government actors together to build consensus and implement action plans, and has a proven ability to deliver sustainable change in government.”
“Green can be profitable”
“Making Green ICT a Government priority”
“It’s not just about reducing the Carbon footprint for information and communication technologies - though this is also important. But the real breakthrough for Green ICT will be in helping build consensus among stakeholders, and to bring clarity and transparency to “Green” projects.”
Strangely, Elizabeth forgot to mention this in her Berkeley biography. She said she has advised governments, but not that the aim of that advice was to reduce their carbon footprint and to select the right green technology. The current organization she lists on her bio, CSTransform, is neutrally vague about its aims, except that it is obviously feeding off big government, so scientific results suggesting that governments don't need to save the world by taxing and charging people would not seem to be her first priority. Her bio on the CSTransform site does mention her green desires: Elizabeth Muller is a "leading expert in how governments can use ICT to develop a more sustainable, lower-carbon future." Evidently she has not had a skeptical conversion at any time in the last four years, but was happy to work with her Dad - presumably a very odd thing to do if he was the "skeptic" he claims he was.
Naturally Elizabeth would be delighted to discover that there was little evidence that a low carbon future was beneficial, necessary or even worth promoting and we are sure she would have overseen BEST in an utterly impartial light. /sarc
Muller and Associates repeat that they are "politically neutral" and "non-partisan", but it's obvious that they benefit from big-government policies, and the bigger the better. It would be hard to imagine them welcoming political policies aimed at smaller government. That would rather turn off the tap, eh?
Perhaps most damning of all (in terms of their judgement) is that Richard Muller and Elizabeth Muller thought they could get away with it. Did it really not occur to them that skeptics would find their alarmist comments and green companies on the world wide web? Did they really think they would escape with their credibility intact?
Bottom line: Of course, none of this personal information tells us anything about the accuracy of the BEST results, or about the global climate, but it does tell us about the accuracy of the message and PR announcements. BEST stress that they are independent and transparent and non-profit, but don’t mention that Elizabeth’s career has profited from findings that support the “climate change scare”. The BEST team are happy for the media to rave about how Richard Muller was “converted” (even though he was never really skeptical) but not too keen to say that Elizabeth has confirmed her strongly held position in spades.
From the BEST FAQ (my bolding)
“Berkeley Earth Surface Temperature aims to contribute to a clearer understanding of global warming based on a more extensive and rigorous analysis of available historical data.”
“We believe that science is nonpartisan and our interest is in getting a clear view of the pace of climate change in order to help policy makers to evaluate and implement an effective response. In choosing team members, we engage people whose primary interests are finding answers to the current issues and addressing the legitimate concerns of the critics on all sides. None of the scientists involved has taken a public political stand on global warming.”
It also tells us something about people who write off skeptical results because they are supposedly "funded by big oil", but rave about the BEST project. If they are so concerned about "vested interests", why do they only protest about one sort of "interest"?
By Jeremy A. Kaplan
Extent of surface melt over Greenland's ice sheet on July 8 (left) and July 12 (right). Measurements from three satellites showed that in just a few days, the melting had dramatically accelerated and an estimated 97 percent of the ice sheet surface had thawed by July 12. (Nicolo E. DiGirolamo, SSAI/NASA GSFC, and Jesse Allen, NASA Earth Observatory)
NASA’s claim that Greenland is experiencing “unprecedented” melting is nothing but a bunch of hot air, according to scientists who say the country’s ice sheets melt with some regularity.
A heat dome over the icy country melted a whopping 97 percent of Greenland’s ice sheet in mid-July, NASA said, calling it yet more evidence of the effect man is having on the planet.
But the unusual-seeming event had nothing to do with hot air, according to glaciologists. It was actually to be expected.
“Ice cores from Summit station [Greenland’s coldest and highest] show that melting events of this type occur about once every 150 years on average. With the last one happening in 1889, this event is right on time,” said Lora Koenig, a Goddard glaciologist and a member of the research team analyzing the satellite data.
But rather than a regular 150-year planetary cycle, the new NASA report calls the melt “unprecedented,” the result of a recent strong ridge of warm air, or a heat dome, over Greenland—one of a series that has dominated Greenland’s weather since the end of May.
“Each successive ridge has been stronger than the previous one,” said Thomas Mote, a climatologist at the University of Georgia. This latest heat dome started to move over Greenland on July 8, and then parked itself over the ice sheet about three days later. By July 16, it had begun to dissipate, along with the ice, NASA said.
Climate skeptics said the NASA report itself was the only “unprecedented” item.
“NASA should start distributing dictionaries to the authors of its press releases,” joked Patrick J. Michaels, a climatologist and the author of the World Climate Report blog.
“It’s somewhat like the rush to blame severe weather and drought on global warming,” Anthony Watts, a noted climate skeptic and the author of the Watts Up With That blog, told FoxNews.com. “Yet when you look into the past, you find precedence for what is being described today as unprecedented.”
It’s the latest hot water for the National Aeronautics and Space Administration, which critics say has shifted focus and priorities from space and aeronautics to the earth we live on—and the planet’s changing climate.
NASA chief cryospheric scientist H. Jay Zwally told FoxNews.com that the melting has been increasing as the temperatures in Greenland have been increasing.
“Climate in the Arctic has been warming about three to four times more than the global average, and Greenland surface temperatures (observed by satellite and surface instruments) have been increasing about 2 degrees Celsius per decade during about the last 20 years,” he said. (Bullshit! The purported warming of the polar regions comes from a >1500 km extrapolation of temperature by Mr Hansen. They don't know what is happening, and the present Arctic melting is also a cyclical [~50 y] event that is probably related to ocean circulation. BB)
Zwally would be in a position to know: He was lead scientist for the ICESat project, which ran from 2003 to 2010, and used satellites to measure Antarctic and Greenland ice sheets.
“This is the most extensive area of surface melting during last 40 years of satellite observations,” he said.
It may be in line with the 150-year cycles of melting, however. Mary Albert, executive director of the NSF Ice Core Drilling office, and Kaitlin Keegan, an engineering PhD student and a fellow in Dartmouth’s polar environmental change program, are working on a paper on the Greenland ice sheet melt, a school spokeswoman told FoxNews.com.
Neither was available to describe the exact findings, but in a blog posting detailing her work, Keegan noted that several cores dating back millennia have also reflected the 150-year cycle.
“In Greenland there have been many deep ice-core drilling projects which drilled ice to the bedrock,” she wrote. “In the past 10,000 years (the Holocene), there is on average a melt layer every 150 years.”
NASA ice scientist Tom Wagner told the Associated Press researchers don’t know precisely how much of Greenland’s ice had melted in this latest event, but it seems to be freezing again.
“The belief that almost any aberration in weather and climate today can be attributed to global warming is pure folly,” Watts told FoxNews.com.
David Whitehouse, GWPF
Anyone who has seen the raw temperature output from a weather station must have wondered at the marvel of averages. The output is all over the place - large fluctuations in temperature from hour to hour and between day and night. Yet from those measurements comes just one number - the monthly average - that finds its way into the climate data.
Picking meaningful information from a data set as variable as a weather station's often seems more art than science: truncated sequences, gaps, changes of equipment, changes of site, changes in the local environment, to name but a few of the factors that have to be taken into consideration - or sometimes are not.
A new analysis of some of the statistical methods used in getting something out of temperature readings from weather stations carried out by Steirou and Koutsoyiannis of the National Technical University of Athens has been gaining some publicity as its conclusions are startling. The researchers say that the statistical manipulation of the data to correct errors often introduces even greater errors, as well as exaggerating positive trends.
Such statistical pitfalls are everywhere when one manipulates data like this. Consider the recent case of Dr Joelle Gergis of the University of Melbourne whose paper on 1000 years of climate data in Australia has had to be withdrawn for rewriting when it was pointed out that the “hockey sticks” produced by the calculations were artifacts. Then there is also the original hockey stick, once the unquestioned (by some) emblem of global warming, which was also shown to be in its broad detail an artifact of data processing.
Considering the processes applied to temperature time series Steirou and Koutsoyiannis say: “It turns out that these methods are mainly statistical, not well justified by experiments and are rarely supported by metadata. In many of the cases studied the proposed corrections are not even statistically significant.”
“In total we analyzed 181 stations globally. For these stations we calculated the differences between the adjusted and non-adjusted linear 100-year trends. It was found that in the two thirds of the cases, the homogenization procedure increased the positive or decreased the negative temperature trends.”
They give an example in the figure above.
“The above results cast some doubts in the use of homogenization procedures and tend to indicate that the global temperature increase during the last century is between 0.4 deg C and 0.7 deg C, where these two values are the estimates derived from raw and adjusted data, respectively.”
If the rise in temperature really is only 0.4 deg C then that changes everything.
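As a rough illustration of how an adjustment applied to part of a record can change its century-scale trend, here is a minimal Python sketch on synthetic data. The series and the 0.15 deg step are invented placeholders, not the Steirou and Koutsoyiannis data or method:

```python
def century_trend(temps):
    """Least-squares linear trend of an annual series, scaled to deg C per 100 years.
    Assumes one value per year, in order."""
    n = len(temps)
    xbar = (n - 1) / 2
    ybar = sum(temps) / n
    sxx = sum((i - xbar) ** 2 for i in range(n))
    sxy = sum((i - xbar) * (t - ybar) for i, t in enumerate(temps))
    return (sxy / sxx) * 100

# Synthetic 100-year record warming at 0.4 deg C/century, and a hypothetical
# "homogenization" that cools the first half of the record by a constant offset.
raw = [0.004 * yr for yr in range(100)]
adjusted = [t - 0.15 if yr < 50 else t for yr, t in enumerate(raw)]

print(century_trend(raw), century_trend(adjusted))
```

Cooling the early decades by a constant offset is enough to steepen the fitted trend, which is the kind of effect the paper says homogenization procedures can introduce.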
Warmer Than Today
Another potentially highly significant paper, this time concerning the Medieval Warm Period (MWP), comes from the journal Palaeogeography, Palaeoclimatology, Palaeoecology. It is entitled "Marine climatic seasonality during early medieval times (10th to 12th centuries) based on isotopic records in Viking Age shells from Orkney, Scotland."
In the abstract the authors say: “Seasonal sea-surface temperature (SST) variability during the Medieval Climate Anomaly (MCA), which corresponds to the height of Viking exploration (800-1200 AD), was estimated using oxygen isotope ratios (δ18O) obtained from high-resolution samples micromilled from archaeological shells of the European limpet, Patella vulgata.”
“Our findings illustrate the advantage of targeting SST archives from fast-growing, short-lived molluscs that capture summer and winter seasons simultaneously. Shells from the 10th to 12th centuries (early MCA) were collected from well-stratified horizons, which accumulated in Viking shell and fish middens at Quoygrew on Westray in the archipelago of Orkney, Scotland. Their ages were constrained based on artifacts and radiocarbon dating of bone, charred cereal grain, and the shells used in this study. We used measured δ18O_water values taken from nearby Rack Wick Bay (average 0.31 ± 0.17‰ VSMOW, n = 11) to estimate SST from δ18O_shell values. The standard deviation of δ18O_water values resulted in an error in SST estimates of ± 0.7 °C.”
“The coldest winter months recorded in the shells averaged 6.0 ± 0.6 °C and the warmest summer months averaged 14.1 ± 0.7 °C. Winter and summer SST during the late 20th century (1961-1990) was 7.77 ± 0.40 °C and 12.42 ± 0.41 °C, respectively.”
“Thus, during the 10th to 12th centuries winters were colder and summers were warmer by ~2 °C and seasonality was higher relative to the late 20th century. Without the benefit of seasonal resolution, SST averaged from shell time series would be weighted toward the fast-growing summer season, resulting in the conclusion that the early MCA was warmer than the late 20th century by ~1 °C.”
“This conclusion is broadly true for the summer season, but not true for the winter season. Higher seasonality and cooler winters during early medieval times may result from a weakened North Atlantic Oscillation index.”
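The seasonal-weighting caveat in the quote above can be illustrated with a toy calculation: if shell material accumulates faster in summer, an SST average taken over the shell is pulled toward summer temperatures. The monthly values and growth weights below are invented for illustration, loosely bracketed by the quoted winter and summer means, and are not the paper's data:

```python
# Monthly SST (deg C) and relative shell growth for a hypothetical year.
months_sst = [6.0, 6.0, 8.0, 10.0, 12.0, 14.0,
              14.1, 14.0, 12.0, 10.0, 8.0, 6.5]
growth = [0.2, 0.2, 0.5, 0.8, 1.0, 1.5,
          1.5, 1.5, 1.0, 0.8, 0.5, 0.2]

# A plain annual mean vs. a mean weighted by how much shell grew each month.
unweighted = sum(months_sst) / len(months_sst)
weighted = sum(s * g for s, g in zip(months_sst, growth)) / sum(growth)
print(f"unweighted {unweighted:.1f} C, growth-weighted {weighted:.1f} C")
```

The growth-weighted mean comes out warmer than the plain annual mean, which is why a shell record without seasonal resolution would overstate the annual average.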
Two papers in well-respected, peer-reviewed scientific journals conclude that perhaps the warming observed in the past century has been overestimated, and that the MWP was substantially warmer than today. This is bound to provide food for thought.
To the Right Honorable Jean Charest, Premier of Quebec
July 16th 2012
The World Council for Nature (WCFN), considering that the installation of wind turbines in the countryside causes collateral damage to wildlife and its habitat, wishes to bring to your attention the fact that investing in windfarms will not bring any positive return to the population of Quebec or to the health of the planet. You already have more clean energy than you need and must often sell it at a loss, because your neighbors no longer really need it. It is therefore absurd to invest huge sums of public funds to produce more of it, at a price four times higher (1).
It is no secret that Hydro Quebec finds it increasingly difficult to sell its surpluses, which amount to about 8,700,000,000 kWh a year. It is also paying close to 150 million dollars each year to the thermal power station of Becancour, to produce NO electricity. This amounts to $900 million paid to date, for nothing (2). What a miscalculation! What a waste!
Yet you are presently investing considerable amounts of public money into more generating capacity, imposing on Quebeckers windfarms that they don't need. It is simply nonsensical. Windfarms won't make Quebec "greener" - on the contrary. It is already ahead of the world on that score, covering as it does its electricity needs with hydro power, which is "clean".
What you are actually doing is replacing a clean and cheap energy which has no ill effects on the health of neighbors or on birds and bats (hydro) with another which causes multiple collateral damages and costs four times as much (windfarms) (1).
Windfarms destroy landscapes and kill millions of birds and very useful bats (3) for no gain whatsoever. They emit infrasound that causes sleep deprivation in neighbors up to 10 km away for the larger models (e.g. 3 MW). Coming on top of plentiful hydro, they duplicate the ecological impacts and financial investments to satisfy the same demand for electricity. It is the opposite of ecological, and amounts to throwing away billions of dollars of public funds, as wind energy is heavily subsidized.
Windfarms in Quebec are a redundant investment, completely useless because you already have too much energy, clean and cheap to boot. They are also redundant in the rest of the world, but in the case of Quebec it is as plain as the nose on your face.
I live in Spain, a country which is technically bankrupt in part because of subsidies to renewable energies, which have increased the country’s sovereign debt by some 30 billion dollars to date, plus 8 billion more each year because the subsidies are guaranteed for 20 years. We produce the same amount of electricity as before, but it costs much more. As for our 18,000 wind turbines, they haven’t even reduced our consumption of fossil fuels, because of problems caused by their intermittency (4).
Have you considered the loss of value of properties that are or will be affected by the sight of these industrial installations, and by the infrasound they emit, which cause sleep deprivation? We are talking about cumulative losses in the billions of dollars, which will impoverish Quebec as a whole, in addition to rising electricity bills and the loss of tourism potential.
Last but not least, economists have shown that large subsidies and high energy prices both contribute to the destruction of jobs across the economy, whereas wind farms create very few permanent ones (5). Spain is a good example, with 25% unemployment.
Please, Mr Charest, do not destroy la Belle Province, its nature, and its future.
Tel : +34 693 643 736
(1) The wind-power rates are more than twice as high as nuclear, and four times those of hydro.
(2) Globe and Mail: "Did Hydro Quebec miscalculate?"
$900 million wasted with Becancour, and a surplus of 8.7 TWh
(3) Wind farms kill millions of birds and bats a year:
AUDITOR GENERAL of ONTARIO:
Green jobs kill other jobs in the economy
GREEN JOBS ARE A BURDEN, NOT A BENEFIT FOR THE ECONOMY
The Beacon Hill Institute, 25 June 2009
Schneider et al 2012, in a poster presentation to the two-day "Taking the Temperature of the Earth" conference that ends today, have the clever idea of looking at the temperatures of lakes and reservoirs around the world. They point out that in situ observations of lake surface temperatures are very rare on a global scale, but infrared imagery from space can be used to infer water surface temperatures of lakes and reservoirs.
They provide data for 169 of the largest inland water bodies worldwide using three satellite-borne instruments. Together these provide daily to near-daily data from 1981 through to the present, allowing them to calculate 25-year trends of nighttime summertime/dry-season surface temperature.
They find that the surface temperatures of the water bodies have been "rapidly warming" with an average rate of 0.35 ± 0.11 °C per decade for the period 1985-2010.
Two years ago Schneider et al published what was then described as the first global survey of lake temperatures. Then the researchers found a decadal trend of 0.45 deg C.
The researchers say the results provide a critical new independent data source on climate change that indicates lake warming in certain regions is greater than expected based on air temperature data.
Their graph of temperature anomaly looks very familiar to anyone who knows the global temperature datasets of the past thirty years. However, I don't think their regression line is a good description of the data. My preliminary calculations suggest that there is no statistically significant trend post-1997. Hence an alternative description of their findings is that the world's large bodies of water show the well-known standstill of the past decade or so seen in global temperatures.
Note: While it is possible to draw a linear regression line between 1997 and 2011 (you can draw a trendline through almost anything) that yields 0.1 deg per decade, it is statistically meaningless given the large variance of the data. The error on the trendline is several times its magnitude, and it is highly sensitive to moving the start and end points by a year or two. Conclusion: no statistically significant trend post-1997. Since 1997 the data are best represented by a straight line of mean 0.21 deg with a large standard deviation of 0.95 deg. Below is the post-1997 portion of the researchers' graph. It is easy to see that the trendline calculated from the 1985-2011 data does not fit this section of the data, in which there is no trend.
Plotted without any extra information, here is the post-1997 data.
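For readers who want to check this sort of claim themselves, here is a minimal ordinary-least-squares sketch in Python: it computes a slope and its standard error, so one can see when the uncertainty swamps the trend. The 1997-2011 series below is synthetic noise with the mean and standard deviation quoted above, not the actual lake data:

```python
import math
import random

def ols_trend(years, temps):
    """Least-squares slope and its standard error, in deg C per year."""
    n = len(years)
    xbar = sum(years) / n
    ybar = sum(temps) / n
    sxx = sum((x - xbar) ** 2 for x in years)
    sxy = sum((x - xbar) * (y - ybar) for x, y in zip(years, temps))
    slope = sxy / sxx
    # residual variance gives the standard error of the slope
    resid = [y - (ybar + slope * (x - xbar)) for x, y in zip(years, temps)]
    s2 = sum(r * r for r in resid) / (n - 2)
    return slope, math.sqrt(s2 / sxx)

# Trendless synthetic series: mean 0.21 deg, standard deviation 0.95 deg,
# as described in the note above.
random.seed(1)
years = list(range(1997, 2012))
temps = [0.21 + random.gauss(0.0, 0.95) for _ in years]

slope, se = ols_trend(years, temps)
print(f"slope = {slope:.3f} deg/yr, stderr = {se:.3f}")
```

When the standard error is comparable to or larger than the slope itself (a t-statistic near or below 2), the fitted trend is not statistically distinguishable from zero, which is the point being made about the post-1997 data.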
By Peter Foster, Financial Post
Maurice Strong sees the cratering of his Stewie Griffin-style plan to rule the world
The “failure” of Rio+20 is a cause for celebration, even if you can’t afford the champagne and foie gras that ecocrats served themselves as their hopes for “Sustainia” retreated into the policy fog. A mostly “B” list of government leaders (No Barack Obama. No David Cameron. No Stephen Harper. No Angela Merkel) was set to adopt a pablum-filled 283-point “vision” on Friday that was finalized before they arrived.
“[N]othing less than a disaster for the planet,” declared Nnimmo Bassey, Nigerian poet and chair of Friends of the Earth International. “[A]n epic failure,” claimed Kumi Naidoo, Greenpeace International executive director. “[A] colossal waste of time,” chimed in Jim Leape, international director-general of World Wildlife Fund.
An umbrella group of NGOs bemoaned the official text's lack of mention of "planetary boundaries, tipping points or planetary carrying capacity," the very shibboleths of radical environmentalism's zero-sum thinking.
Significantly, the mother and father of sustainable development, Gro Harlem Brundtland and Maurice “Chairman Mo” Strong, carped - or should that be gro-aned and mo-aned - from the Rio sidelines. Ms. Brundtland was the figurehead of the 1987 Brundtland report, which spilled sustainable development all over the policy map, while Mr. Strong orchestrated the 1992 Rio conference, which the current 50,000-strong flop is intended to commemorate.
According to Ms. Brundtland, Rio+20’s failure is due to the eurozone crisis and the power of Tea Party climate deniers.
Mr. Strong was flown in from China at UN (that is, taxpayers') expense to be regaled by a group of corporations on Monday as a "very special guest of honour." Mr. Strong is less than happy at the cratering of his Stewie Griffin-style master plan to rule the world, which has always clashed rather alarmingly with his problems in steering small companies, not to mention his implication in the UN/Iraqi oil-for-food scandal.
One wonders if these aged eco-doomsters were embarrassed by support from Iranian President Mahmoud Ahmadinejad, who called for rich countries to eschew “materialist” desires and pursue “spiritual” development. Mr. Ahmadinejad also suggested that: “The collapse of the current atheistic order is reaching its time.”
Perhaps so - the social democratic replacement for God is certainly proving to have feet of clay in Europe - but it looks more than doubtful that Gaia’s green caliphate will be taking over, even if the iconic statue of Christ the Redeemer, which looks down on Rio, was illuminated with green light for the conference.
The high priests of the new green world order crave cash, but calls for humanity to fork over for Gaia’s “services” are falling on deaf ears, and not just because of the global economy. One problem is that Gaia has no bank account. UN Secretary-General Ban Ki-Moon, while ritually bemoaning the weakness of Rio+20’s outcome, declared this week that “Nature does not negotiate with human beings.” But then neither does she speak through a green self-elect. Gaia’s service fees would wind up in the coffers of the guys and gals who brought you not just oil-for-food, but a human rights system ruled by the world’s worst rights abusers, utterly corrupted climate science and peace in Syria.
The failure of Rio does not mean disregard for "The Environment." Environmental protection is a branch of human protection. The environment has no value except for what it means to humans. The outrage that this observation will provoke serves to prove the point. The environment can no more value itself than it can express outrage. Human development inevitably involves disturbance of land and potential pollution of air and water. The issue is never people versus the environment. It is the interests of some people vs. the interests of others. The question is one of balance, and that pollution should not be suffered without compensation. A bigger question is one of entirely bogus eco scares being manufactured as a rationale for payoffs to the very kleptocrats who are responsible for global poverty.
Canada should be justly proud of being in the vanguard of this return to balance both via its withdrawal from Kyoto and the environmental provisions of Bill C-38, which do not seek to trash safeguards - as alarmists have suggested - but to eliminate duplication, bureaucratic overreach, and the potential for sheer obstructionism.
Naturally, the threat of sustainable ideology is not over. Too many bureaucrats at the UN and national level are invested in it. Too much NGO fund raising relies on it.
Significantly, the official text talks of working with NGOs, despite their lack of political legitimacy. The text also still calls for more power for the United Nations Environment Program (UNEP). At least there is no mention of a World Environmental Organization, which would have been just as useless but would have threatened endless further negotiations on purpose, membership, funding, etc. etc.
There remain calls to tie down a set of Sustainable Development Goals, which should be good for another hundred reports and a dozen conferences. An Intergovernmental Science-Policy Platform on Biodiversity and Ecosystem Services (IPBES) is also on the drawing board. This will reportedly do for biodiversity what the Intergovernmental Panel on Climate Change (IPCC) did for climate science: pervert it for political ends.
The Rio+20 text was originally sold as promoting “The Future We Want.” However, the “We” in question was always a self-selected group of UN bureaucrats, alarmist NGOs, corporate rent-seekers and main chancers whose interests were sharply at odds with those of ordinary people. Rio+20’s failure should be celebrated as The Future We Avoided.
Now it is time for us to celebrate and thumb our noses at Maurice and the phoney extremist NGOs who feign care about the environment but for whom it is all about power and control over our lives - what energy we use, what foods we eat, etc.
Dr. Ross McKitrick
When policymakers think about climate change, what matters is forecasting changes where people actually live - in local regions - rather than potential global impacts. But how good are climate models at predicting regional climate patterns? Useful climate models should get not only global trends right, but also regional patterns of change. Recent research has found that, with few exceptions, climate models not only fail to do better than random guesses - in many cases they are actually worse.
Statistical analysis using available socioeconomic data has been shown to provide more successful explanations of regional climate change than the supercomputer climate modelling systems (General Circulation Models or GCMs), which underpin Intergovernmental Panel on Climate Change (IPCC) claims about “greenhouse gas” warming and forecasts of future warming. The IPCC climate calculations all assume that “greenhouse gases” play the dominant role in climate change.
Professor Ross McKitrick (University of Guelph, Canada and an IPCC expert reviewer) and Lise Tole of Strathclyde University evaluated a range of socioeconomic data against the 22 available IPCC GCMs to find which approach best explains the regional pattern of recent temperature trends around the world. They concluded that you need both, especially socioeconomic data.
Their much simpler rival statistical model had nothing to do with “greenhouse gases” but explained regional warming patterns due to urbanisation, socioeconomic and industrial development much better than the GCMs. While both might be partly right, the polarised IPCC assumptions that socioeconomic patterns have no climate impact and that temperature changes must be due to carbon dioxide emissions have never been proven to be factual.
Their research identified that 10 of the 22 GCM models predicted climate patterns opposite to observed records. The next ten GCM models showed improved predictions, but were not better than random guesses. Only two GCMs showed significant evidence of explanatory power.
They studied whether each of the 22 GCMs does such a good job explaining climate change data that the socioeconomic data could be ignored, or vice versa. In all 22 cases there was no chance of ignoring the socioeconomic data. On the other hand, only three of the 22 GCMs provided useful outputs but with one giving results opposite to the observed patterns. All 22 cases confirmed the importance of including the urbanisation and industrial development measures which IPCC claims are irrelevant.
The researchers then evaluated combinations of both types of models which again confirmed that socioeconomic data were essential explanatory factors. From over 500 million combinations, they identified that three of the seven regional socioeconomic variables and three of the 22 GCMs were relevant for a valid model of temperature change patterns over the Earth’s surface. The three useful climate models were from China, Russia and the US NCAR while GCM models from the main US government sources (NASA and NOAA), Australia, Canada, France, Germany, Japan, Norway and the U.K., could be ignored as they failed to exhibit any explanatory power for the regional pattern of surface temperature trends in any test, alone or in any combination.
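The flavour of this comparison can be sketched with a toy correlation test. The authors' actual method is a formal regression-based encompassing test; the simple correlation stand-in below, and all of the numbers in it, are invented for illustration only:

```python
def pearson_r(x, y):
    """Pearson correlation coefficient between two equal-length series."""
    n = len(x)
    xbar, ybar = sum(x) / n, sum(y) / n
    sxy = sum((a - xbar) * (b - ybar) for a, b in zip(x, y))
    sxx = sum((a - xbar) ** 2 for a in x)
    syy = sum((b - ybar) ** 2 for b in y)
    return sxy / (sxx * syy) ** 0.5

# Synthetic warming trends (deg C/decade) across 8 hypothetical regions.
socio = [0.1, 0.5, 0.9, 0.3, 0.7, 0.2, 0.8, 0.4]   # industrialisation proxy
gcm = [0.4, 0.4, 0.5, 0.5, 0.4, 0.5, 0.4, 0.5]     # near-uniform model pattern
# "Observed" pattern: driven mostly by the socioeconomic proxy plus a wiggle.
observed = [0.8 * s + 0.05 * (-1) ** i for i, s in enumerate(socio)]

print(pearson_r(observed, socio), pearson_r(observed, gcm))
```

In this contrived setup the socioeconomic proxy tracks the observed pattern closely while the near-flat model pattern explains almost nothing of it, which is the shape of the result the paper reports for most of the 22 GCMs.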
Two important conclusions arise from their study. The significant explanatory power of their socioeconomic factors confirms that the temperature records are seriously ‘contaminated’ by urban and industrial development impacts, contrary to the 2010 U.K. Muir Russell Climategate Inquiry finding.
Second, since regional climate predictions made by contemporary climate models proved basically worthless, government policies based on those predictions will be equally useless.
Dr. Anthony Lupo
As we enter summer, the narrative is that the January to May period nationally has been the warmest on record, approximately 5.0 degrees Fahrenheit above the long-term average. This came on the heels of an incredible month of March which was 8.6 degrees Fahrenheit above the long-term average and the warmest since records began in 1895. The days were warm and spring came early. But we also heard about how this was not really a good thing, that this March was somehow unnatural, like "the weather equivalent of a baseball player on steroids". The implication is that humans are the driving force behind the warmth.
In the Missouri region, March was about 15.2 degrees Fahrenheit above the 1981-2010 normal. The year-to-date is running about 7.4 degrees Fahrenheit above the same benchmark for January to May. In fact, this year-to-date is running some 2.8 degrees ahead of the next closest January to May, which was way back in 1921! March itself shattered the previous record for that month, beating out March 1946, the previous standard, by a whopping 3.7 degrees! How unprecedented is such an occurrence?
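The anomalies quoted here are simply observed monthly means minus a 30-year climate "normal". A minimal sketch of that arithmetic, using hypothetical placeholder values rather than the actual Missouri station records:

```python
def monthly_anomaly(observed_mean_f, normal_f):
    """Temperature anomaly: the observed monthly mean minus the 30-year
    climate normal, both in degrees Fahrenheit."""
    return observed_mean_f - normal_f

# Hypothetical illustrative values, not the actual regional data:
print(round(monthly_anomaly(55.0, 39.8), 1))   # 15.2 (degrees above normal)
print(round(monthly_anomaly(10.0, 26.0), 1))   # -16.0 (degrees below normal)
```

The same subtraction works for cold anomalies; the sign simply flips, which is why the warm departures of 2012 and the cold departures of 1977, 1960 and 1983 are directly comparable in magnitude.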
While they don't happen often, monthly temperature anomalies of 10-15 degrees above normal do happen here in the middle of the continent, where weather extremes are typically greater in magnitude. Remarkable cold anomalies have occurred historically as well; for example, our region experienced temperatures that were 16.0, 15.8, and 15.2 degrees below normal during January 1977, March 1960, and December 1983, respectively.
So, was March 2012 really a new standard among warm anomalies? In our region, it was not. One can go all the way back to 1889 and find that December 1889 was an incredible 16.8 degrees above the latest climate normal for 1981-2010 (which was the same as 1891-1920 for December). During that month, only seven nights experienced temperatures below freezing and only two below 20 degrees. No daytime highs were colder than freezing, and only the last two days of the month averaged below freezing. A check of daily weather maps, such as they were in those days, reveals that the warmth was also widespread in a manner similar to that of March 2012. During that Holiday season, Jack Frost took a lot of time off from nipping at noses!
The year 1889 is not a period one typically associates with human induced global warming. Also, warm (or cold) anomalies such as this do not typically occur in isolation. One can then study the periods leading up to and following such an event, and a comparison of March 2012 with the December 1889 period can be done.
We should note that both months occurred during a year that could be classified as a weak La Niña year (although the jury may be out on whether 2011-2012 meets the exact criterion). The temperatures were cooler than normal leading up to December 1889. In the following nine months, however, six were warmer than normal, and four of those, including the two immediately following, were well above normal. March 2012 was preceded by three straight mild winter months and was followed by a normal April. May returned to warmer conditions again, and the first half of June has been only marginally warmer than normal in this region.
Examining the precipitation in the months leading up to and including each warm month, the 1889 period was drier than normal, with only about 75% of normal precipitation regionally. The March 2012 period was wetter than normal, but given the warm temperatures, there was likely more evaporation than normal. Dew point records are not available for the earlier period, and are not yet entirely available for the December 2011 - March 2012 period. It can safely be assumed that both periods were a bit dry, which would help create favorable conditions for warmer temperatures in each month.
Further, a comparative study of the atmosphere for these two periods is currently underway. From the monthly composite upper air maps for both periods, it can be seen that the coldest air was bottled up to the north in the Arctic, while the eastern two-thirds of the nation were under a fairly strong ridging event (Fig. 1). This means that both periods experienced more sunshine during the month. Overall, similar mechanisms may have been at work in producing December 1889 and March 2012, and both may have been the result of a similar confluence of atmospheric and oceanic forcing over North America.
And while North America has been warm from late 2011 and into 2012, similar circumstances are not happening globally. Will such a warm event occur again soon over North America? Given the historical record, it is not very likely. But the bottom line is that the occurrence of these events has a readily explainable cause that does not invoke excess carbon dioxide.
Figure 1. The 500 hPa maps for December 1889 (top) and March 2012 (bottom). The map for December 1889 was derived with much less data, and uses a slightly different map projection.
State of the Climate National Overview.
Borenstein, S., 2012: Start of 2012, March shatter US heat records. Associated Press.
Daily Weather Maps.
Center for Ocean and Atmospheric Prediction Studies.