Apr 11, 2013
NOAA Scientist Rejects Global Warming Link to Tornadoes
By James Rosen
A top official at the National Oceanic and Atmospheric Administration (NOAA) rejected claims by environmental activists that the outbreak of tornadoes ravaging the American South is related to climate change brought on by global warming.
Greg Carbin, the warning coordination meteorologist at NOAA’s Storm Prediction Center in Norman, Oklahoma, said warming trends do create more of the fuel that tornadoes require, such as moisture, but that they also deprive tornadoes of another essential ingredient: wind shear.
“We know we have a warming going on,” Carbin told Fox News in an interview Thursday, but added: “There really is no scientific consensus or connection [between global warming and tornadic activity]… Jumping from a large-scale event like global warming to relatively small-scale events like tornadoes is a huge leap across a variety of scales.”
Asked if climate change should be “acquitted” in a jury trial where it stood charged with responsibility for tornadoes, Carbin replied: “I would say that is the right verdict, yes.” Because there is no direct connection as yet established between the two? “That’s correct,” Carbin replied.
Formerly the lead forecaster for NOAA’s Storm Prediction Center, Carbin is a member of numerous relevant professional societies, including the National Weather Association, the American Meteorological Society, the Union of Concerned Scientists, and the International Association of Emergency Managers. He has also served on the peer review committee for the evaluation of scientific papers submitted to publications like National Weather Digest and Weather and Forecasting.
This evaluation by a top NOAA official contradicted pronouncements by some leading global warming activists, who were swift to link 2011’s carnage to man-made climate change.
“The earth is warming. Carbon emissions are increasing,” said Sarene Marshall, Managing Director for The Nature Conservancy’s Global Climate Change Advocacy Team. “And they both are connected to the increased intensity and severity of storms that we both are witnessing today, and are going to see more of in the coming decades.”
Bjorn Lomborg of the Copenhagen Consensus Center, an activist and author who believes industrialized societies expend too much money and energy combating global warming instead of focusing on more immediate, and more easily rectifiable, problems, also doubted that the tornadoes have any link to warming trends.
“We’ve seen a declining level of the severe tornadoes over the last half century in the U.S.,” Lomborg told Fox News. “So we need to be very careful not to just jump to the conclusion and say, ‘Oh, then it’s because of global warming.’”
In fact, NOAA statistics show that the last 60 years have seen a dramatic increase in the reporting of weak tornadoes, but no change in the number of severe to violent ones.
For many, the high casualties of 2011 recalled the so-called “Super Outbreak” of April 1974, which killed more than 300 people. “You have to go back to 1974 to even see a tornado outbreak that approaches what we saw yesterday,” W. Craig Fugate, administrator of the Federal Emergency Management Agency (FEMA), told Fox News.
Asked earlier, during a conference call with Alabama Gov. Robert Bentley, about the possibility that climate change is playing a role in the tornado outbreak, Fugate shot back: “Actually, what we’re seeing is springtime. Unfortunately, many people think of the Oklahoma tornado alley and forget that the Southeast U.S. actually has a history of longer and more powerful tornadoes that stay on the ground longer - and we are seeing that, obviously, in 2011.”
This week’s activity, with 21 tornadoes, was caused, as in 2011, by a very strong contrast between unseasonable cold and snow to the north and warmth to the south. This season is unlikely to be as severe as 2011, when we were coming off the second-strongest La Nina, with a powerful La Nina jet stream, but it should be more active than 2012, a very warm spring with little contrast and very few tornadoes.
Apr 05, 2013
Matt Ridley’s diary: My undiscovered island, and the Met Office’s computer problem
Matt Ridley, The Spectator
We’ve discovered that we own an island. But dreams of independence and tax-havenry evaporate when we try to picnic there on Easter Sunday: we watch it submerge slowly beneath the incoming tide. It’s a barnacle-encrusted rock, about the size of a tennis court, just off the beach at Cambois, north of Blyth, which for some reason ended up belonging to my ancestor rather than the Crown. Now there’s a plan for a subsidy-fired biomass power station nearby that will burn wood (and money) while pretending to save the planet. The outlet pipes will go under our rock and we are due modest compensation. As usual, it’s us landowners who benefit from renewable energy while working people bear the cost: up the coast are the chimneys of the country’s largest aluminium smelter, killed, along with hundreds of jobs, by the government’s unilateral carbon-floor price in force from this week.
Weatherbell.com Year-to-Date Anomaly
FSU’s Dr. Bob Hart, Daily Snow Anomalies for the Northern Hemisphere
There were dead puffins on the beach, as there have been all along the east coast. This cold spring has hit them hard. Some puffin colonies have been doing badly in recent years, after booming in the 1990s, but contrary to the predictions of global warming, it’s not the more southerly colonies that have suffered most. The same is true of guillemots, kittiwakes and sandwich terns: northern colonies are declining.
It’s not just here that the cold has been relentless. Germany’s average temperature for March was below zero. Norwegian farmers cannot plant vegetables because the ground’s frozen three feet down. In America snow fell as far south as Oklahoma last week. It’s horrible for farmers. But in past centuries, bad weather like that of the past 12 months would kill. In the 1690s, two million French people starved because of bad harvests. I’ve never understood why people argue that globalisation makes for a more fragile system: the opposite is the case. Harvest failures can be regional, but never global, so world trade ensures that we have the insurance policy of access to somebody else’s bumper harvest.
Gloriously, the poor old Met Office got it wrong yet again. In December it said: ‘For February and March… above-average UK-mean temperatures become more likely.’ This time last year it said the forecast ‘slightly favours drier-than-average conditions for April May June, and slightly favours April being the driest of the three months’ before the wettest of all Aprils. The Met Office does a great job of short-term forecasting, but the people who do that job must be fed up with the reputational damage from a computer that’s been taught to believe in rapid global warming. In September 2008 it foretold a ‘milder than average’ winter, before the coldest winter in a decade. The next year it said ‘the trend to milder and wetter winters is expected to continue’ before the coldest winter for 30 years. The next year it saw a ‘60 per cent to 80 per cent chance of warmer-than-average temperatures this winter’ before the coldest December since records began. ICECAP NOTE: Pierre Gosselin has compiled 57 failed winter forecasts by warmists among hundreds.
At least somebody’s happy about the cold. Gary Lydiate runs one of Northumberland’s export success stories, Kilfrost, which manufactures 60 per cent of Europe’s and a big chunk of the world’s aircraft de-icing fluid, so he puts his money where his mouth is, deciding how much fluid to send to various airports each winter. Back in January, when I bumped into him in a restaurant, he was beaming: ‘Joe says this cold weather’s going to last three months,’ he said. Joe is Joe Bastardi, a private weather forecaster, who does not let global warming cloud his judgment. Based on jetstreams, El Ninos and ocean oscillations, Bastardi said the winter of 2011/12 would be cold only in eastern Europe, which it was, but the winter of 2012/13 would be cold in western Europe too, which it was. He’s now predicting ‘warming by mid month’ of April for the UK.
David Rose of the Mail on Sunday was vilified for saying that there’s been no global warming for about 16 years, but even the head of the Intergovernmental Panel on Climate Change now admits he’s right. Rose is also excoriated for drawing attention to papers which find that climate sensitivity to carbon dioxide is much lower than thought - as was I when I made the same point in the Wall Street Journal. Yet even the Economist has now conceded this. Tip your hat to Patrick Michaels, then of the University of Virginia, who together with three colleagues published a carefully argued estimate of climate sensitivity in 2002. For having the temerity to say they thought ‘21st-century warming will be modest’, Michaels was ostracised. A campaign began behind the scenes to fire the editor of the journal that published the paper, Chris de Freitas. Yet Michaels’s central estimate of climate sensitivity agrees well with recent studies. Scientists can behave remarkably like priests at times.
Joe Bastardi, Ryan Maue and I work together at Weatherbell Analytics. Weatherbell also predicted the turn to colder in the US this late winter and has hit every major snowstorm, tornado outbreak, and hurricane landfall since its inception in 2011. We offer a blog service for enthusiasts as well as specialized services to commercial markets internationally, including Kilfrost and many of the more successful, forward-thinking hedge funds (energy and agriculture), snow-removal clients, insurers, and other weather-sensitive industries. Ryan has developed a very impressive value-added model and data section available to all premium and commercial clients. Go to weatherbell.com.
Apr 03, 2013
We’re not screwed?
Ross McKitrick, Special to the Financial Post
11,000-year study’s 20th-century claim is groundless
On March 8, a paper appeared in the prestigious journal Science under the title “A reconstruction of regional and global temperature for the past 11,300 years.” Temperature reconstructions are nothing new, but papers claiming to be able to go back so far in time are rare, especially ones that promise global and regional coverage.
The new study, by Shaun Marcott, Jeremy Shakun, Peter Clark and Alan Mix, was based on an analysis of 73 long-term proxies, and offered a few interesting results: one familiar (and unremarkable), one odd but probably unimportant, and one new and stunning. The last was an apparent discovery that 20th-century warming was a wild departure from anything seen in over 11,000 years. News of this finding flew around the world and the authors suddenly became the latest in a long line of celebrity climate scientists.
The trouble is, as they quietly admitted over the weekend, their new and stunning claim is groundless. The real story is only just emerging, and it isn’t pretty.
The unremarkable finding of the Marcott et al. paper was that the Earth’s climate history since the end of the last ice age looks roughly like an upside-down U shape: starting cold, warming up for a few thousand years, staying warm through the mid-Holocene (6,000 to 9,000 years ago), then cooling steadily over the past five millennia to the present. This pattern has previously been found in studies using ground boreholes, ice cores and other very long-term records, and was shown in the first IPCC report back in 1990. Some studies suggest the mid-Holocene was, on average, half a degree warmer than the present, while others have put it at one or even two degrees warmer. A lot of assumptions have to be made to calibrate long-term proxy measures to degrees Celsius, so it is not surprising that the scale of the temperature axis is uncertain.
Another familiar feature of long-term reconstructions is that the downward-sloping portion has a few large deviations on it. Many show a long, intense warm interval during Roman times 2,000 years ago, and another warm interval during the medieval era, a thousand years ago. They also show a cold episode called the Little Ice Age ending in the early 1800s, followed by the modern warming. But the Marcott et al. graph didn’t have these wiggles; instead it showed only a modest mid-Holocene warming and a smooth decline to the late 1800s. This was odd, but probably unimportant, since they also acknowledged using so-called “low frequency” proxies that do not pick up fluctuations on time scales shorter than 300 years. The differences between the scale of their graph and that of others could probably be chalked up to different methods.
The new, and startling, feature of the Marcott graph was at the very end: Their data showed a remarkable uptick that implied that, during the 20th century, our climate swung from nearly the coldest conditions over the past 11,500 years to nearly the warmest. Specifically, their analysis showed that in under 100 years we’ve had more warming than previously took thousands of years to occur, in the process undoing 5,000 years’ worth of cooling.
This uptick became the focus of considerable excitement, as well as scrutiny. One of the first questions was how it was derived. Marcott had finished his PhD thesis at Oregon State University in 2011 and his dissertation is online. The Science paper is derived from the fourth chapter, which uses the same 73 proxy records and seemingly identical methods. But there is no uptick in the thesis version of the chart, nor does the abstract to his thesis mention such a finding.
Stephen McIntyre of climateaudit.org began examining the details of the Marcott et al. work, and by March 16 he had made a remarkable discovery. The 73 proxies were all collected by previous researchers, 31 of which are derived from alkenones, organic compounds produced by phytoplankton that settle in layers on ocean floors and whose chemical properties correlate with temperature. When a core is drilled out, the layers need to be dated. If this is done accurately, the researcher can then interpret the alkenone layer at, say, 50 cm below the surface to imply (for example) that the ocean temperature averaged 0.1 degrees above normal over several centuries about 1,200 years ago. The tops of cores represent the data closest in time to the present, but this layer is often disturbed by the drilling process. So the original researchers take care to date the core top to where the information begins to become useful.
According to the scientists who originally published the alkenone series, the core tops varied in age from nearly the present to over a thousand years ago. Fewer than 10 of the original proxies had values for the 20th century. Had Marcott et al. used the end dates as calculated by the specialists who compiled the original data, there would have been no 20th-century uptick in their graph, as indeed was the case in Marcott’s PhD thesis. But Marcott et al. redated a number of core tops, changing the mix of proxies that contribute to the closing value, and this created the uptick at the end of their graph. Far from being a feature of the proxy data, it was an artifact of arbitrarily redating the underlying cores.
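A toy numerical sketch of the mechanism described above may help (the numbers are synthetic; this is not Marcott et al.’s code or proxy data): in a binned proxy stack, the closing value is just the average of whichever records are dated into the final bin, so redating a few core tops to the present changes the endpoint even though no measurement changes.

```python
# Toy illustration of core-top redating (synthetic numbers only;
# this is not Marcott et al.'s actual code or data).

# Five hypothetical proxies: core-top date and anomaly near the top.
proxies = [
    {"core_top": 1950, "anomaly": -0.3},
    {"core_top": 1000, "anomaly": +0.4},  # warm-reading, ends ~1000 AD
    {"core_top": 1890, "anomaly": -0.2},
    {"core_top":  800, "anomaly": +0.5},  # warm-reading, ends ~800 AD
    {"core_top": 1940, "anomaly": -0.1},
]

def final_bin_mean(proxies, bin_start=1900):
    """Average only the proxies whose core tops reach the final bin."""
    vals = [p["anomaly"] for p in proxies if p["core_top"] >= bin_start]
    return sum(vals) / len(vals)

print("original dating:", final_bin_mean(proxies))   # -> -0.20

# Redate the two old core tops to the present: the warm-reading
# records now enter the 20th-century bin and the endpoint jumps,
# though not a single proxy value was altered.
proxies[1]["core_top"] = 1950
proxies[3]["core_top"] = 1950
print("after redating: ", final_bin_mean(proxies))   # -> 0.125
```

The direction of the jump depends entirely on which records get redated; the point is only that the endpoint reflects the dating choices, not the underlying measurements.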
Worse, the article did not disclose this step. In their online supplementary information the authors said they had assumed the core tops were dated to the present “unless otherwise noted in the original publication.” In other words, they claimed to be relying on the original dating, even while they had redated the cores in a way that strongly influenced their results.
Meanwhile, in a private email to McIntyre, Marcott made a surprising statement. In the paper, they had reported doing an alternate analysis of their proxy data that yielded a much smaller 20th century uptick, but they said the difference was “probably not robust,” which implied that the uptick was insensitive to changes in methodology, and was therefore reliable. But in his email to McIntyre, Marcott said the reconstruction itself is not robust in the 20th century: a very different thing. When this became public, the Marcott team promised to clear matters up with an online FAQ.
It finally appeared over the weekend, and contains a remarkable admission: “[The] 20th-century portion of our paleotemperature stack is not statistically robust, cannot be considered representative of global temperature changes, and therefore is not the basis of any of our conclusions.”
Now you tell us! The 20th-century uptick was the focus of worldwide media attention, during which the authors made very strong claims about the implications of their findings regarding 20th-century warming. Yet at no point did they mention the fact that the 20th century portion of their proxy reconstruction is garbage.
The authors now defend their original claims by saying that if you graft a 20th-century thermometer record onto the end of their proxy chart, it exhibits an upward trend much larger in scale than that observed in any 100-year interval in their graph, supporting their original claims. But you can’t just graft two completely different temperature series together and draw a conclusion from the fact that they look different.
The modern record is sampled continuously and as a result is able to register short-term trends and variability. The proxy model, by the authors’ own admission, is heavily smoothed and does not pick up fluctuations below a time scale of several centuries. So the relative smoothness in earlier portions of their graph is not proof that variability never occurred before. If it had, their method would likely not have spotted it.
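A minimal sketch of that smoothing argument, with made-up numbers: run a 300-year moving average (the time scale the authors themselves cite) over a synthetic series containing a one-degree spike lasting 100 years, and the spike all but vanishes.

```python
import numpy as np

# Sketch: a 100-year, 1 degree C spike seen through a 300-year filter
# (illustrative numbers only; this shows the smoothing argument, not
# the authors' actual method).
years = np.arange(-9000, 1950)               # a toy annual time axis
temps = np.zeros(years.shape)                # flat background climate
spike = (years >= -5000) & (years < -4900)
temps[spike] = 1.0                           # 1 C spike lasting 100 years

window = 300                                 # centuries-scale smoothing
kernel = np.ones(window) / window
smoothed = np.convolve(temps, kernel, mode="same")

print(f"true spike amplitude:      {temps.max():.2f} C")   # 1.00 C
print(f"amplitude after smoothing: {smoothed.max():.2f} C") # ~0.33 C
# The spike survives only as a shallow bump spread over three
# centuries - easy to miss in a smoothed reconstruction.
```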
What made their original conclusion about the exceptional nature of 20th-century warming plausible was precisely the fact that it appeared to be picked up both by modern thermometers and by their proxy data. But that was an illusion. It was introduced into their proxy reconstruction as an artifact of arbitrarily redating the end points of a few proxy records.
In recent years there have been a number of cases in which high-profile papers from climate scientists turned out, on close inspection, to rely on unseemly tricks, fudges and/or misleading analyses. After they get uncovered in the blogosphere, the academic community rushes to circle the wagons and denounce any criticism as “denialism.” There’s denialism going on all right - on the part of scientists who don’t see that their continuing defence of these kinds of practices exacts a toll on the public credibility of their field.
Financial Post
Ross McKitrick is professor of economics and CME fellow in sustainable commerce at the Department of Economics, University of Guelph.
Note: The original Marcott findings were in line with the ice core data from Greenland (Richard Alley - no skeptic)
Mar 30, 2013
New study finds increased CO2 ameliorates effect of drought on crops
Hockey Schtick
Elevated Carbon Dioxide in Atmosphere Trims Wheat, Sorghum Moisture Needs
Mar. 25, 2013 - Plenty has been written about concerns over elevated levels of carbon dioxide in Earth’s atmosphere, but a Kansas State University researcher has found an upside to the higher CO2 levels. It has been particularly relevant in light of the drought that has overspread the region in recent months.
“Our experiments have shown that the elevated carbon dioxide that we now have is mitigating the effect that drought has on winter wheat and sorghum and allowing more efficient use of water,” said K-State agronomy professor Mary Beth Kirkham.
Kirkham, who has written a book on the subject, “Elevated Carbon Dioxide: Impacts on Soil and Plant Water Relations,” used data going back to 1958. That’s when the first accurate measurements of atmospheric carbon dioxide were made, she said.
“Between 1958 and 2011 (the last year for which scientists have complete data), the carbon dioxide concentration has increased from 316 parts per million to 390 ppm,” she said. “Our experiments showed that higher carbon dioxide compensated for reductions in growth of winter wheat due to drought. Wheat that grew under elevated carbon dioxide (2.4 times ambient) and drought yielded as well as wheat that grew under the ambient level carbon dioxide and well-watered conditions.”
The research showed that sorghum and winter wheat used water more efficiently as a result of the increased levels of carbon dioxide in the atmosphere, Kirkham said. Because elevated carbon dioxide closes stomata (pores on the leaves through which water escapes), less water is used when carbon dioxide levels are elevated. Evapotranspiration is decreased.
Studies done subsequent to the early work confirmed the findings.
Over the past few months, the researcher said she’s heard people comparing the dry summer of 2012 with the Dust Bowl years of the 1930s and the drought of the mid-1950s in Kansas.
The first accurate measurements of CO2 levels were made in 1958, so scientists do not know what the concentration was in the 1930s, Kirkham said. She therefore used the data that she and her students collected to calculate how much the water-use efficiency of sorghum has increased since 1958, which was about the time of the mid-1950s drought.
“Due to the increased carbon dioxide concentration in the atmosphere, it now takes 55 milliliters (mL) less water to produce a gram of sorghum grain than it did in 1958,” she said. “Fifty-five mL is equal to about one-fourth of a cup of water. This may not seem like a lot of water savings, but spread over the large acreage of sorghum grown in Kansas, the more efficient use of water now compared to 1958 should have a large impact.
“The elevated carbon dioxide in the atmosphere in 2012 ameliorated the drought compared to the drought that occurred in the mid-1950s.”
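As a rough scale check on that figure, here is a back-of-envelope sketch; the yield and acreage values below are illustrative assumptions of ours, not numbers from the article. Only the 55 mL per gram comes from Kirkham.

```python
# Back-of-envelope check of the water-savings figure. The yield and
# acreage are hypothetical placeholders; only the 55 mL/g figure is
# from the article.

ML_SAVED_PER_GRAM = 55.0      # mL water saved per gram of grain (article)
KG_PER_BUSHEL = 25.4          # 56 lb standard test weight for sorghum
yield_bu_per_acre = 70.0      # assumed Kansas sorghum yield
acres = 2.5e6                 # assumed Kansas sorghum acreage

grain_g_per_acre = yield_bu_per_acre * KG_PER_BUSHEL * 1000.0
liters_saved_per_acre = grain_g_per_acre * ML_SAVED_PER_GRAM / 1000.0
total_m3 = liters_saved_per_acre * acres / 1000.0

print(f"water saved: {liters_saved_per_acre:,.0f} L/acre, "
      f"{total_m3:,.0f} m^3 statewide")
# -> roughly 98,000 L per acre, on the order of 2.4e8 cubic meters
#    statewide under these assumptions.
```

A quarter cup per gram sounds trivial, but under these assumptions it compounds to tens of thousands of liters per acre over a harvest, which is the point Kirkham makes about large acreages.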
At the basis of Kirkham’s book are experiments that she and other researchers conducted in the Evapotranspiration Laboratory at K-State from 1984 to 1991.
“They were the first experiments done in the field in a semi-arid region with elevated carbon dioxide,” Kirkham said. The lab no longer exists, but the work continues.
----------
ICECAP NOTE: This is confirmed by corn yields in 2012, which were down sharply from 2010 but remained well above the levels of 1988 and the 1950s, when drought was similar. Part of the difference is due to hybrids and better farming practices, but CO2 experiments have confirmed the K-State study’s findings.
NOTE: CO2Science and the Idsos have shown this to be the case in papers and experiments on their excellent website, CO2Science.org. The family of PhD scientists has done excellent work on all aspects of the science, and Craig Idso is the prime editor of the NIPCC Report, a compilation of reviews of many hundreds of peer-reviewed papers with findings that challenge the so-called ‘settled science’.
Mar 27, 2013
Wall Street Journal’s Crony Capitalism Conference turns sour
by Myron Ebell on March 27, 2013
Times have changed since the Wall Street Journal held its first “ECO:nomics-Creating Environmental Capital” conference at the super-swanky Bacara Resort in Santa Barbara. I was there in 2008 (but, alas, stayed at the Best Western in downtown Santa Barbara) when several hundred investors and corporate CEOs listened to leading crony capitalists, including Jeff Immelt of GE, James Rogers of Duke Energy, Andrew Liveris of Dow Chemical, and John Doerr of Kleiner Perkins Caufield & Byers (where Al Gore was also a partner), smugly explain how they were going to strike it rich off the backs of consumers and taxpayers with green energy subsidies and mandates, federal loan guarantees, and the higher energy prices that would make renewable energy competitive with coal, oil, and natural gas once cap-and-trade was enacted.
This year’s sixth annual conference, which I didn’t attend, was also held at the Bacara Resort, but the mood was apparently different. Yesterday, the Journal ran a six-page supplement that summarized the conference’s highlights. The lead article by John Bussey was headlined: “Green Investing: So Much Promise, So Little Return: At The Wall Street Journal’s ECO:nomics conference, the talk was about all the innovations taking place in renewable energy-and about all the investors who are losing interest.”
Bussey writes: “Given all the interest in protecting the environment from mankind’s rapid advance, you’d think this might be the best time ever to invest in renewable energy and the Next Big Green Thing. Guess again. Large parts of green-tech investment look like the torched and salted fields left behind by Roman conquerors: barren, lifeless - and bereft of a return on capital. Put another way: In some areas, if you aren’t already investor road kill, you’re likely the hedgehog in the headlights about to join your maker.”
On page two, an article on a talk by Joe Dear, chief investment officer of the California Public Employees’ Retirement System (Calpers), reveals that their “fund devoted to clean energy and technology which started in 2007 with $460 million has an annualized return of minus 9.7% to date.” Dear is quoted as telling the conference: “We have almost $900 million in investment expressly aimed at clean tech. We’re all familiar with the J-curve in private equity. Well, for Calpers, clean-tech investing has got an L-curve for ‘lose.’ Our experience is that this has been a noble way to lose money.”
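For scale, a quick compounding sketch shows how deep that loss runs; the roughly six-year horizon (2007 through early 2013) is our assumption, and only the minus 9.7% annualized figure and $460 million starting value come from the article.

```python
# What an annualized return of -9.7% compounds to over ~6 years.
# The horizon is an assumption; the -9.7% and $460M are from the
# article quoted above.

initial = 460e6               # fund's 2007 starting value, USD
annualized = -0.097           # annualized return reported to date
years = 6                     # assumed holding period, 2007-2013

final = initial * (1 + annualized) ** years
print(f"${initial/1e6:.0f}M -> ${final/1e6:.0f}M "
      f"({(final/initial - 1)*100:.0f}% cumulative)")
# -> roughly $249M remaining, a cumulative loss of about 46%.
```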
Yes, con artists gaming the system to raise energy prices, impoverish consumers, destroy jobs, and fleece taxpayers can still take comfort that theirs is “a noble way to lose money.” May it long remain so. The entire 2013 ECO:nomics program may be found here. Read it and gloat now - it may be the last one.
----------
Meanwhile, John Coleman of KUSI News in San Diego talks about the amazing potential of graphene, basically a form of the demonized element carbon.