Frozen in Time
Apr 14, 2013
Data versus dogma and character assassination

By Dr. Don Easterbrook

An interesting contrast in the manner of debating climate issues arose as a result of my testimony before a Washington State Senate hearing March 16, 2013.  The hearing concerned a senate bill based on five very badly flawed assertions, and I was invited to present scientific evidence related to the basis of the bill. I began my testimony with the famous quote “In God we trust, all others bring data” and then presented graphical data that can be viewed here.

The five assertions that formed the basis for the bill were:

1. Emissions of greenhouse gases from human activities are the principal cause of global warming.

2. Sea level is rising at an increasing rate because of global warming.

3. The frequency of severe storms is increasing because of global warming.

4. Mountain winter snow packs and streamflows are diminishing because of global warming.

5. Ocean acidification is occurring because of global warming.

The graphic data and physical evidence for rejecting these assertions may be found in the video above.

Almost immediately upon completion of my testimony, the chairman of the geology department at Western Washington University, Bernie Housen, who has never published a single paper on climate and knows nothing about any of the climate topics I discussed, issued a statement to the AP wire service that I was neither an expert in my field nor active in it.  The following Sunday, March 31, 12 members of the geology department at WWU, none of whom has ever published a single paper on climate or has any climate expertise, published a vicious personal character assassination against me in the Bellingham Herald.

They claimed that (1) my work is “filled with misrepresentations, misuse of data,” (2) every graph I showed was flawed, (3) none of my 180 publications had been peer reviewed, (4) my evidence was “not supported by any published science,” (5) my views “require the existence of a broad, decades-long conspiracy amongst literally thousands of scientists to falsify data,” and (6) they “decry the injection of such poor quality science into the public discourse.”

Needless to say, they didn’t address any of the issues that I discussed in the Senate hearing.  You can read their op-ed online at the Bellingham Herald website.

Here is my response, published in the Bellingham Herald:

“WWU faculty find overwhelming scientific evidence to support global warming.” Of course there is overwhelming evidence of global warming! Everyone agrees!

But that doesn’t prove it was caused by carbon dioxide! The authors fail to understand:

(1) Of the two periods of global warming in the past century, the first, and warmest, occurred before the rise in carbon dioxide; (2) twenty periods of global warming occurred over the past five centuries; (3) the past 10,000 years were warmer than the present; and (4) multiple periods of intense warming (20 times more intense than recent warming) occurred 10,000 to 15,000 years ago. All of these happened long before the rise in carbon dioxide, so they could not possibly have been caused by carbon dioxide.

The Bellingham Herald opinion column is a diatribe against me personally (just read the slurs and innuendos): it contains misrepresentations, offers no real data to support its contentions, and displays an abysmal ignorance of the published literature. The reason becomes apparent when you realize that not a single one of the 13 Western Washington University authors has ever published a single paper on global climate change, and none has any expertise whatsoever in climate issues.

Their claim that my publications “have not passed through rigorous peer review” is false. Virtually all of my 180 publications were peer-reviewed. The real joke here is they “fully support the 2007 IPCC report,” but Donna Laframboise in 2011 documented that 30 percent of the references used were not peer-reviewed, so using their own standard, they would be forced to reject the 2007 Intergovernmental Panel on Climate Change report!

The authors claim that “CO2 is a powerful greenhouse gas” that has a “significant and measurable impact on surface temperature.” Carbon dioxide is a greenhouse gas, but it has little impact on temperature because it makes up only 0.038 percent of the atmosphere, has changed by only 0.008 percent of the atmosphere since carbon dioxide began rising after 1945 (if you double nothing, you still have nothing), and accounts for only 3.6 percent of greenhouse warming. Carbon dioxide is incapable of changing global temperature by more than a fraction of a degree.
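
The percentage figures in this paragraph are straightforward unit conversions. A quick sketch, using assumed inputs of roughly 300 ppm of CO2 around 1945 and 380 ppm at the time of writing (the column does not state its inputs):

```python
# Parts-per-million to percent of the atmosphere: divide ppm by 10,000.
# Assumed inputs (not stated in the column): ~300 ppm CO2 in 1945, ~380 ppm now.
ppm_1945 = 300.0
ppm_now = 380.0

pct_now = ppm_now / 10_000           # 0.038 percent of the atmosphere
pct_1945 = ppm_1945 / 10_000         # 0.030 percent
change_points = pct_now - pct_1945   # 0.008 percentage points

print(f"CO2 today: {pct_now:.3f}% of the atmosphere")
print(f"Change since 1945: {change_points:.3f} percentage points")
```

(The 3.6 percent figure is a separate claim about CO2’s share of greenhouse warming and cannot be derived from these inputs.)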

The authors “decry the injection of such poor quality science into the public discourse.” I work with 20 of the world’s top scientists, including atmospheric physicists, astrophysicists, geologists, and marine geophysicists who wouldn’t waste time working with me if my research was “of poor quality.”

The authors claim that my work requires “broad, decades-long conspiracy...to falsify climate data.”

In 1999, NASA data showed the 1930s were the hottest decade of the century and 1936 the hottest year. In 2012, NASA subtracted temperatures from the 1930s data and added to recent temperatures to claim that recent years were “unprecedented and the warmest ever recorded.” Check NASA data tampering here. This lies behind all of the false claims that recent global warming is “unprecedented.”

The authors claim a “vast consensus of the science community.” However, 31,487 U.S. scientists (including 9,000 with doctorates) with degrees in atmospheric science, Earth sciences, physics, chemistry, biology and computer science have signed a statement that reads: “There is no convincing scientific evidence that human release of carbon dioxide, methane, or other greenhouse gases is causing or will, in the foreseeable future, cause catastrophic heating of the Earth’s atmosphere and disruption of the Earth’s climate.” Check names and expertise here. Signatures of 1.5 million scientists would be required to achieve the claimed “vast consensus” of scientists!

The Intergovernmental Panel on Climate Change chairman admitted that 80 percent of the people involved in the panel were not even scientists!

The WWU faculty was challenged to debate the issues. The response from David Hirsh was: “I don’t want the media to present both sides of an issue.” “Well, the problem is it’s not ‘my’ science. I do not now, nor have I claimed to be an expert in climate science. The question was would I support a debate-type forum to be hosted at WWU? I would not.” He went on to say that he didn’t want to debate because he had not addressed any of the scientific issues, but supported the personal attack.

So what can we conclude about The Bellingham Herald opinion column? Perhaps more than anything it shows that amateurs with no expertise in climate issues are way out of their league and would be wiser to stick to their own areas of expertise, hard rock geology. In the end, nature will tell us who is right and that is happening right now as the climate continues to cool with no warming in 15 years.

ABOUT THE AUTHOR
Don Easterbrook is a professor emeritus of geology at Western Washington University. For more information about him, go online to myweb.wwu.edu/dbunny.

Apr 11, 2013
NOAA Scientist Rejects Global Warming Link to Tornadoes

By James Rosen

A top official at the National Oceanic and Atmospheric Administration (NOAA) rejected claims by environmental activists that the outbreak of tornadoes ravaging the American South is related to climate change brought on by global warming.

Greg Carbin, the warning coordination meteorologist at NOAA’s Storm Prediction Center in Norman, Oklahoma, said warming trends do create more of the fuel that tornadoes require, such as moisture, but that they also deprive tornadoes of another essential ingredient: wind shear.

“We know we have a warming going on,” Carbin told Fox News in an interview Thursday, but added: “There really is no scientific consensus or connection [between global warming and tornadic activity].... Jumping from a large-scale event like global warming to relatively small-scale events like tornadoes is a huge leap across a variety of scales.”

Asked if climate change should be “acquitted” in a jury trial where it stood charged with responsibility for tornadoes, Carbin replied: “I would say that is the right verdict, yes.” Was that because no direct connection has yet been established between the two? “That’s correct,” Carbin replied.

Formerly the lead forecaster for NOAA’s Storm Prediction Center, Carbin is a member of numerous relevant professional societies, including the National Weather Association, the American Meteorological Society, the Union of Concerned Scientists, and the International Association of Emergency Managers. He has also served on the peer review committee for the evaluation of scientific papers submitted to publications like National Weather Digest and Weather and Forecasting.

This evaluation by a top NOAA official contradicted pronouncements by some leading global warming activists, who were swift to link 2011’s carnage to man-made climate change.

“The earth is warming. Carbon emissions are increasing,” said Sarene Marshall, Managing Director for The Nature Conservancy’s Global Climate Change Advocacy Team. “And they both are connected to the increased intensity and severity of storms that we both are witnessing today, and are going to see more of in the coming decades.”

Bjorn Lomborg of the Copenhagen Consensus Center, an activist and author who believes industrialized societies expend too much money and energy combating global warming, instead of focusing on more immediate, and easily rectifiable, problems, doubted the tornadoes have any link to warming trends.

“We’ve seen a declining level of the severe tornadoes over the last half century in the U.S.,” Lomborg told Fox News. “So we need to be very careful not just to jump to the conclusion and say, ‘Oh, then it’s because of global warming.’”

In fact, NOAA statistics show that the last 60 years have seen a dramatic increase in the reporting of weak tornadoes, but no change in the number of severe to violent ones.


For many, the high casualties of 2011 recalled the so-called “Super Outbreak” of April 1974, which killed more than 300 people. “You have to go back to 1974 to even see a tornado outbreak that approaches what we saw yesterday,” W. Craig Fugate, administrator of the Federal Emergency Management Agency (FEMA), told Fox News.

Asked earlier, during a conference call with Alabama Gov. Robert Bentley, about the possibility that climate change is playing a role in the tornado outbreak, Fugate shot back: “Actually, what we’re seeing is springtime. Unfortunately, many people think of the Oklahoma tornado alley and forget that the Southeast U.S. actually has a history of longer and more powerful tornadoes that stay on the ground longer—and we are seeing that, obviously, in 2011.”

This week’s activity, with 21 tornadoes, was caused, as in 2011, by a very strong contrast between unseasonable cold and snow to the north and warmth to the south. This season is unlikely to be as severe as 2011, when we were coming off the second-strongest La Nina with a powerful La Nina jet stream, but it should be more active than 2012, a very warm spring with little contrast and very few tornadoes.


Apr 05, 2013
Matt Ridley’s diary: My undiscovered island, and the Met Office’s computer problem

Matt Ridley, the Spectator

We’ve discovered that we own an island. But dreams of independence and tax-havenry evaporate when we try to picnic there on Easter Sunday: we watch it submerge slowly beneath the incoming tide. It’s a barnacle-encrusted rock, about the size of a tennis court, just off the beach at Cambois, north of Blyth, which for some reason ended up belonging to my ancestor rather than the Crown. Now there’s a plan for a subsidy-fired biomass power station nearby that will burn wood (and money) while pretending to save the planet. The outlet pipes will go under our rock and we are due modest compensation. As usual, it’s us landowners who benefit from renewable energy while working people bear the cost: up the coast are the chimneys of the country’s largest aluminium smelter killed, along with hundreds of jobs, by the government’s unilateral carbon-floor price in force from this week.

[Image: Weatherbell.com Year-to-Date Anomaly]

[Image: FSU’s Dr. Bob Hart, Daily Snow Anomalies for the Northern Hemisphere]

There were dead puffins on the beach, as there have been all along the east coast. This cold spring has hit them hard. Some puffin colonies have been doing badly in recent years, after booming in the 1990s, but contrary to the predictions of global warming, it’s not the more southerly colonies that have suffered most. The same is true of guillemots, kittiwakes and sandwich terns: northern colonies are declining.

It’s not just here that the cold has been relentless. Germany’s average temperature for March was below zero. Norwegian farmers cannot plant vegetables because the ground’s frozen three feet down. In America snow fell as far south as Oklahoma last week. It’s horrible for farmers. But in past centuries, bad weather like that of the past 12 months would kill. In the 1690s, two million French people starved because of bad harvests. I’ve never understood why people argue that globalisation makes for a more fragile system: the opposite is the case. Harvest failures can be regional, but never global, so world trade ensures that we have the insurance policy of access to somebody else’s bumper harvest.

Gloriously, the poor old Met Office got it wrong yet again. In December it said: ‘For February and March… above-average UK-mean temperatures become more likely.’ This time last year it said the forecast ‘slightly favours drier-than-average conditions for April May June, and slightly favors April being the driest of the three months’ before the wettest of all Aprils. The Met Office does a great job of short-term forecasting, but the people who do that job must be fed up with the reputational damage from a computer that’s been taught to believe in rapid global warming. In September 2008 it foretold a ‘milder than average’ winter, before the coldest winter in a decade. The next year it said ‘the trend to milder and wetter winters is expected to continue’ before the coldest winter for 30 years. The next year it saw a ‘60 per cent to 80 per cent chance of warmer-than-average temperatures this winter’ before the coldest December since records began.  ICECAP NOTE: Pierre Gosselin has compiled 57 failed winter forecasts by warmists among hundreds.

At least somebody’s happy about the cold. Gary Lydiate runs one of Northumberland’s export success stories, Kilfrost, which manufactures 60 per cent of Europe’s and a big chunk of the world’s aircraft de-icing fluid, so he puts his money where his mouth is, deciding how much fluid to send to various airports each winter. Back in January, when I bumped into him in a restaurant, he was beaming: ‘Joe says this cold weather’s going to last three months,’ he said. Joe is Joe Bastardi, a private weather forecaster, who does not let global warming cloud his judgment. Based on jetstreams, El Ninos and ocean oscillations, Bastardi said the winter of 2011/12 would be cold only in eastern Europe, which it was, but the winter of 2012/13 would be cold in western Europe too, which it was. He’s now predicting ‘warming by mid month’ of April for the UK.

David Rose of the Mail on Sunday was vilified for saying that there’s been no global warming for about 16 years, but even the head of the Intergovernmental Panel on Climate Change now admits he’s right. Rose is also excoriated for drawing attention to papers which find that climate sensitivity to carbon dioxide is much lower than thought - as was I when I made the same point in the Wall Street Journal. Yet even the Economist has now conceded this. Tip your hat to Patrick Michaels, then of the University of Virginia, who together with three colleagues published a carefully argued estimate of climate sensitivity in 2002. For having the temerity to say they thought ‘21st-century warming will be modest’, Michaels was ostracised. A campaign began behind the scenes to fire the editor of the journal that published the paper, Chris de Freitas. Yet Michaels’s central estimate of climate sensitivity agrees well with recent studies. Scientists can behave remarkably like priests at times.

Joe Bastardi, Ryan Maue and I work together at Weatherbell Analytics. Weatherbell also predicted the turn to colder in the U.S. this late winter and has hit every major snowstorm, tornado outbreak, and hurricane landfall since its inception in 2011. We offer a blog service for enthusiasts as well as specialized services to international commercial markets, including Kilfrost, many of the more successful and forward-thinking energy and agriculture hedge funds, snow-removal clients, insurers, and other weather-sensitive industries. Ryan has developed a very impressive value-added model and data section available to all premium and commercial clients. Go to weatherbell.com.

Apr 03, 2013
We’re not screwed?

Ross McKitrick, Special to the Financial Post


11,000-year study’s 20th-century claim is groundless

On March 8, a paper appeared in the prestigious journal Science under the title “A reconstruction of regional and global temperature for the past 11,300 years.” Temperature reconstructions are nothing new, but papers claiming to be able to go back so far in time are rare, especially ones that promise global and regional coverage.

The new study, by Shaun Marcott, Jeremy Shakun, Peter Clark and Alan Mix, was based on an analysis of 73 long-term proxies, and offered a few interesting results: one familiar (and unremarkable), one odd but probably unimportant, and one new and stunning. The latter was an apparent discovery that 20th-century warming was a wild departure from anything seen in over 11,000 years. News of this finding flew around the world and the authors suddenly became the latest in a long line of celebrity climate scientists.

The trouble is, as they quietly admitted over the weekend, their new and stunning claim is groundless. The real story is only just emerging, and it isn’t pretty.

The unremarkable finding of the Marcott et al. paper was that the Earth’s climate history since the end of the last ice age looks roughly like an upside down-U shape, starting cold, warming up for a few thousand years, staying warm through the mid-Holocene (6,000 to 9,000 years ago), then cooling steadily over the past five millennia to the present. This pattern has previously been found in studies using ground boreholes, ice cores and other very long-term records, and was shown in the first IPCC report back in 1990. Some studies suggest it was, on average, half a degree warmer than the present, while others have put it at one or even two degrees warmer. A lot of assumptions have to be made to calibrate long-term proxy measures to degrees Celsius, so it is not surprising that the scale of the temperature axis is uncertain.

Another familiar feature of long-term reconstructions is that the downward-sloping portion has a few large deviations on it. Many show a long, intense warm interval during Roman times 2,000 years ago, and another warm interval during the medieval era, a thousand years ago. They also show a cold episode called the Little Ice Age ending in the early 1800s, followed by the modern warming. But the Marcott et al. graph didn’t have these wiggles, instead it showed only a modest mid-Holocene warming and a smooth decline to the late 1800s. This was odd, but probably unimportant, since they also acknowledged using so-called “low frequency” proxies that do not pick up fluctuations on time scales shorter than 300 years. The differences between the scale of their graph and that of others could probably be chalked up to different methods.

The new, and startling, feature of the Marcott graph was at the very end: Their data showed a remarkable uptick that implied that, during the 20th century, our climate swung from nearly the coldest conditions over the past 11,500 years to nearly the warmest. Specifically, their analysis showed that in under 100 years we’ve had more warming than previously took thousands of years to occur, in the process undoing 5,000 years’ worth of cooling.

This uptick became the focus of considerable excitement, as well as scrutiny. One of the first questions was how it was derived. Marcott had finished his PhD thesis at Oregon State University in 2011 and his dissertation is online. The Science paper is derived from the fourth chapter, which uses the same 73 proxy records and seemingly identical methods. But there is no uptick in that chart, nor does the abstract to his thesis mention such a finding.

Stephen McIntyre of climateaudit.org began examining the details of the Marcott et al. work, and by March 16 he had made a remarkable discovery. The 73 proxies were all collected by previous researchers, of which 31 are derived from alkenones, an organic compound produced by phytoplankton that settles in layers on ocean floors, and has chemical properties that correlate to temperature. When a core is drilled out, the layers need to be dated. If done accurately, the researcher could then interpret the alkenone layer at, say, 50 cm below the surface, to imply (for example) the ocean temperature averaged 0.1 degrees above normal over several centuries about 1,200 years ago. The tops of cores represent the data closest in time to the present, but this layer is often disturbed by the drilling process. So the original researchers take care to date the core-top to where the information begins to become useful.

According to the scientists who originally published the alkenone series, the core tops varied in age from nearly the present to over a thousand years ago. Fewer than 10 of the original proxies had values for the 20th century. Had Marcott et al. used the end dates as calculated by the specialists who compiled the original data, there would have been no 20th-century uptick in their graph, as indeed was the case in Marcott’s PhD thesis. But Marcott et al. redated a number of core tops, changing the mix of proxies that contribute to the closing value, and this created the uptick at the end of their graph. Far from being a feature of the proxy data, it was an artifact of arbitrarily redating the underlying cores.
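
The mechanism described above can be illustrated with a toy calculation (the numbers are invented for illustration and are not the Marcott data): when proxies with different constant offsets are averaged, the closing value of the stack depends on which core-tops survive to the final bin, so redating core-tops can create an endpoint jump even though no individual proxy contains one.

```python
import numpy as np

# Toy illustration (invented numbers): six flat proxies, each just a constant
# offset around zero -- none of them contains any uptick.
offsets = np.array([-0.8, -0.4, -0.1, 0.2, 0.5, 0.9])
n_bins = 50
proxies = np.tile(offsets[:, None], (1, n_bins))   # rows: proxies, cols: time bins

def stack(core_top_bins):
    """Average the proxies in each bin, using a proxy only up to its core-top bin."""
    out = np.empty(n_bins)
    for t in range(n_bins):
        live = [p[t] for p, end in zip(proxies, core_top_bins) if t <= end]
        out[t] = np.mean(live)
    return out

# Original dating: every core top reaches the final bin -> the stack is flat.
flat = stack([n_bins - 1] * len(offsets))

# "Redated": the cold-offset cores are cut off one bin early, so only the
# warm-offset proxies contribute to the closing value -> a spurious uptick.
redated = stack([n_bins - 1 if o > 0 else n_bins - 2 for o in offsets])

print(f"flat stack endpoint:    {flat[-1]:+.3f}")     # +0.050
print(f"redated stack endpoint: {redated[-1]:+.3f}")  # +0.533
```

The endpoint jump is purely an artifact of which proxies contribute to the last bin, which is the effect McIntyre documented in the redated cores.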

Worse, the article did not disclose this step. In their online supplementary information the authors said they had assumed the core tops were dated to the present “unless otherwise noted in the original publication.” In other words, they claimed to be relying on the original dating, even while they had redated the cores in a way that strongly influenced their results.

Meanwhile, in a private email to McIntyre, Marcott made a surprising statement. In the paper, they had reported doing an alternate analysis of their proxy data that yielded a much smaller 20th century uptick, but they said the difference was “probably not robust,” which implied that the uptick was insensitive to changes in methodology, and was therefore reliable. But in his email to McIntyre, Marcott said the reconstruction itself is not robust in the 20th century: a very different thing. When this became public, the Marcott team promised to clear matters up with an online FAQ.

It finally appeared over the weekend, and contains a remarkable admission: “[The] 20th-century portion of our paleotemperature stack is not statistically robust, cannot be considered representative of global temperature changes, and therefore is not the basis of any of our conclusions.”

Now you tell us! The 20th-century uptick was the focus of worldwide media attention, during which the authors made very strong claims about the implications of their findings regarding 20th-century warming. Yet at no point did they mention the fact that the 20th century portion of their proxy reconstruction is garbage.

The authors now defend their original claims by saying that if you graft a 20th-century thermometer record onto the end of their proxy chart, it exhibits an upward trend much larger in scale than that observed in any 100-year interval in their graph, supporting their original claims. But you can’t just graft two completely different temperature series together and draw a conclusion from the fact that they look different.

The modern record is sampled continuously and as a result is able to register short-term trends and variability. The proxy model, by the authors’ own admission, is heavily smoothed and does not pick up fluctuations below a time scale of several centuries. So the relative smoothness in earlier portions of their graph is not proof that variability never occurred before. If it had, their method would likely not have spotted it.
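
The smoothing argument can be made concrete with a toy series (invented numbers): a centuries-scale moving average of the kind implicit in a low-frequency proxy stack flattens a 100-year spike to a fraction of its true size.

```python
import numpy as np

# Toy series (invented numbers): 10,000 annual values containing a 1-degree
# spike lasting 100 years, then a 300-year moving average of the kind a
# low-frequency proxy stack implicitly applies.
temps = np.zeros(10_000)
temps[5000:5100] = 1.0                 # 100-year, 1-degree spike

window = 300
smoothed = np.convolve(temps, np.ones(window) / window, mode="same")

print(f"peak in the raw series: {temps.max():.2f} deg")
print(f"peak after smoothing:   {smoothed.max():.2f} deg")  # about 0.33 deg
```

A century-scale event survives at only a third of its amplitude, so its absence from the smoothed portion of a reconstruction is not evidence it never happened.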

What made their original conclusion about the exceptional nature of 20th-century warming plausible was precisely the fact that it appeared to be picked up both by modern thermometers and by their proxy data. But that was an illusion. It was introduced into their proxy reconstruction as an artifact of arbitrarily redating the end points of a few proxy records.

In recent years there have been a number of cases in which high-profile papers from climate scientists turned out, on close inspection, to rely on unseemly tricks, fudges and/or misleading analyses. After they get uncovered in the blogosphere, the academic community rushes to circle the wagons and denounce any criticism as “denialism.” There’s denialism going on all right - on the part of scientists who don’t see that their continuing defence of these kinds of practices exacts a toll on the public credibility of their field.

Financial Post

Ross McKitrick is professor of economics and CME fellow in sustainable commerce at the Department of Economics, University of Guelph.

Note: The original Marcott findings were in line with the ice core data from Greenland (Richard Alley, no skeptic).


Mar 30, 2013
New study finds increased CO2 ameliorates effect of drought on crops

Hockey Schtick

Elevated Carbon Dioxide in Atmosphere Trims Wheat, Sorghum Moisture Needs

Mar. 25, 2013 - Plenty has been written about concerns over elevated levels of carbon dioxide in Earth’s atmosphere, but a Kansas State University researcher has found an upside to the higher CO2 levels. And it’s been particularly relevant in light of drought that overspread the area in recent months.

“Our experiments have shown that the elevated carbon dioxide that we now have is mitigating the effect that drought has on winter wheat and sorghum and allowing more efficient use of water,” said K-State agronomy professor Mary Beth Kirkham.

Kirkham, who has written a book on the subject, “Elevated Carbon Dioxide: Impacts on Soil and Plant Water Relations,” used data going back to 1958. That’s when the first accurate measurements of atmospheric carbon dioxide were made, she said.

“Between 1958 and 2011 (the last year for which scientists have complete data), the carbon dioxide concentration has increased from 316 parts per million to 390 ppm,” she said. “Our experiments showed that higher carbon dioxide compensated for reductions in growth of winter wheat due to drought. Wheat that grew under elevated carbon dioxide (2.4 times ambient) and drought yielded as well as wheat that grew under the ambient level carbon dioxide and well-watered conditions.”

The research showed that sorghum and winter wheat used water more efficiently as a result of the increased levels of carbon dioxide in the atmosphere, Kirkham said. Because elevated carbon dioxide closes stomata (pores on the leaves through which water escapes), less water is used when carbon dioxide levels are elevated. Evapotranspiration is decreased.

Studies done subsequent to the early work confirmed the findings.

Over the past few months, the researcher said she’s heard people comparing the dry summer of 2012 with the Dust Bowl years of the 1930s and the drought of the mid-1950s in Kansas.

The first accurate measurements of CO2 levels were made in 1958, so scientists do not know what the concentration of CO2 was in the 1930s. Kirkham said she used the data that she and her students collected to calculate how much the water use efficiency of sorghum has increased since 1958, which was about the middle of the 1950s drought.

“Due to the increased carbon dioxide concentration in the atmosphere, it now takes 55 milliliters (mL) less water to produce a gram of sorghum grain than it did in 1958,” she said. “Fifty-five mL is equal to about one-fourth of a cup of water. This may not seem like a lot of water savings, but spread over the large acreage of sorghum grown in Kansas, the more efficient use of water now compared to 1958 should have a large impact.

“The elevated carbon dioxide in the atmosphere in 2012 ameliorated the drought compared to the drought that occurred in the mid-1950s.”
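
The water-saving figure quoted above is easy to check, and a hypothetical scale-up shows why a small per-gram saving can matter (the acreage and yield figures below are invented for illustration; the article gives none):

```python
# Check the unit conversion in the quote above (1 US cup is about 236.6 mL).
ML_PER_CUP = 236.6
saving_ml_per_gram = 55.0
cups = saving_ml_per_gram / ML_PER_CUP
print(f"55 mL is about {cups:.2f} cups")       # roughly a quarter cup

# HYPOTHETICAL scale-up (figures invented, not from the article): assume
# 2,500 kg of sorghum grain per acre over 2 million acres in Kansas.
grams_of_grain = 2_500 * 1_000 * 2_000_000     # kg/acre -> grams, times acres
litres_saved = grams_of_grain * saving_ml_per_gram / 1_000
print(f"water saved: about {litres_saved:.2e} litres")
```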

At the basis of Kirkham’s book are experiments that she and other researchers conducted in the Evapotranspiration Laboratory at K-State from 1984-1991.

“They were the first experiments done in the field in a semi-arid region with elevated carbon dioxide,” Kirkham said. The lab no longer exists, but the work continues.

----------

ICECAP NOTE: This is confirmed by looking at corn yields in 2012, which were down sharply from 2010 but still well above the levels of 1988 and the 1950s, when drought was similar. Part of the difference is due to hybrids and better farming practices, but CO2 experiments have confirmed the K-State study’s findings.


NOTE: CO2Science and the Idsos have shown this to be the case in papers and experiments on their excellent website CO2Science.org.  The family of PhD scientists has done excellent work on all aspects of the science, and Craig Idso is prime editor of the NIPCC Report, a compilation of reviews of many hundreds of peer-reviewed papers with findings that challenge the so-called ‘settled science’.
