Frozen in Time
Aug 20, 2010
Challenges and innovation in agriculture

The Scientific Alliance Newsletter

The Royal Society has done us all a service by making freely available a special issue of the Philosophical Transactions on the theme ‘Food Security: feeding the world in 2050’. Twenty-one articles cover a wide range of issues, from population and food consumption trends, through crop, livestock and fishery yields, to other important factors such as globalisation, water availability and waste. Coincidentally, a few months ago, the neighbouring Royal Academy of Engineering published a report on ‘Global Water Security - an engineering perspective’. Since farming is reckoned to consume about 70% of available fresh water globally, this is one of the key issues underpinning food security.

The Royal Society publication is particularly valuable in providing a broad overview of a key issue which is vital to our continuing prosperity. The price effect of Russia’s recent effective ban on grain exports was dramatic. Although this might be seen as an overreaction by the market, given that the supply situation is nowhere near as bad as that which led to the steep rises in 2008, it is yet another illustration of the vital nature of food security. We in the prosperous North take it for granted, but minor crises like this provide a salutary lesson: take away an assured and affordable food supply and our day-to-day priorities change dramatically. If things were to get really bad, modern societies would simply not be able to function. Things may have changed a lot since the days of hunter-gatherer communities, but that simple fact remains the same.

Given the range of views on the future of the food supply - from the neo-Malthusians who still believe the high productivity necessary to sustain the current population is unsustainable, through advocates of low-input, extensive farming, promoted as the solution if only we would all become vegetarians, to agronomists who see no reason why plant breeding should not continue to deliver the necessary yield increases - it is good to see an objective scientific assessment.

Keith Jaggard and colleagues from Rothamsted Research consider a world of 9.1 billion people, with atmospheric carbon dioxide at 550 ppm, ozone at 60 ppb and average temperatures 2°C higher. Given the recent failure of the global climate to conform to the pattern modelled by the IPCC, this temperature may be a significant overestimate, but it nonetheless remains the received wisdom. Under this scenario, many crops (but not C4 ones) would see a yield increase of 13% due to carbon dioxide fertilisation, partly offset by a 5% decline because of higher ozone levels.
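As a rough illustration of how those two numbers might combine (a back-of-envelope calculation of ours, not taken from the Jaggard paper), the net effect depends on whether the percentages are treated as additive or multiplicative:

```python
# Illustrative only: how a +13% CO2 fertilisation gain and a -5% ozone penalty
# might net out. The percentages are quoted above; the combination is ours.
co2_fertilisation = 0.13   # +13% yield (C3 crops)
ozone_penalty = -0.05      # -5% yield from higher ozone

additive_net = co2_fertilisation + ozone_penalty                        # +8%
multiplicative_net = (1 + co2_fertilisation) * (1 + ozone_penalty) - 1  # ~+7.4%

print(f"Additive net effect:       {additive_net:+.1%}")
print(f"Multiplicative net effect: {multiplicative_net:+.1%}")
```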

They also see opportunities for increased yields from projected temperature increases in some areas, with the most difficult challenge coming from soil pathogens, for which they foresee transgenic solutions. On top of this, there is still a significant yield gap between what the average farmer harvests and the best local benchmark. Even in efficient intensive farming systems this may be 20%; in developing countries it is usually much bigger, but more easily bridged via better fertilisation and crop protection. Overall, their conclusion is cautiously optimistic: “If this gap is closed and accompanied by improvements in potential yields then there is a good prospect that crop production will increase by approximately 50 per cent or more by 2050 without extra land.”

image

Animal products are also, of course, an important component of the diet in the industrialised world, and pastoralism remains vital for many poor farming families in developing countries. Philip Thornton of the International Livestock Research Institute in Nairobi sees potential for increased productivity from breeding (including transgenics), and improved nutrition and animal health. The relative balance of arable and livestock farming is itself also an important issue, since considerably more land and water are needed to produce each kilo of animal protein.

Concerns remain about the sustainability of wild fish stocks, but John Bostock and colleagues from the University of Stirling review the growth of aquaculture (which in 2007 accounted for 43% of global fish and shellfish production) and foresee further expansion. This would be facilitated by greater use of non-carnivorous species, reducing the reliance on fishmeal for species such as salmon.

To set against this broadly positive background, Kenneth Strzepek (University of Colorado/MIT) and Brent Boehlert (Industrial Economics Inc) project an 18% reduction in the worldwide availability of water for agriculture by mid-century, with much more dramatic localised effects. This is based on climate change modelling together with increased demands for municipal and industrial water and the need not to cause environmental damage by extracting too much of the rainfall runoff.

This should come as no surprise, as water shortages have been highlighted as an issue for many years. To some extent, more efficient use of current resources via techniques such as drip irrigation may help, but these require a level of sophistication which is not available everywhere. On a more general level, the issue is not about water per se, but about water of sufficient purity. Adaptation of crops for more saline environments could be one part of the solution. More broadly, large-scale desalination could make up the balance, given a sufficiently economic energy supply. In many areas of the Mediterranean, tropics and subtropics this could be a major use for solar power in coastal areas.

The Royal Academy of Engineering’s report covers much broader issues of water supply and management, focussing on better capture and utilisation. Addressing the issue of food production, they recommend further work on “water efficiency in agriculture through water management and drainage and improved surface irrigation alongside drought-heat tolerant crop varieties (in parallel with improvements in plant breeding or genetic manipulation to reduce irrigation demand).” They recognise the challenge, but also see that it can be tackled.

We can see that there are real challenges ahead to provide food security for a larger population by 2050, but also that these challenges are not insurmountable. In the introductory paper of the Royal Society publication, a team of distinguished authors summarises many of the findings and comes to quite a positive conclusion. To quote:

“Another theme that emerges is the importance of taking a ‘competing risks’ approach to regulation in the food system - it is too easy to close off options by applying naive versions of the precautionary principle. The world is going to have to produce more food, and unless much of the Earth’s remaining biodiversity is to be destroyed, this will need to be done without expanding the area under cultivation. Achieving higher yields from the same acreage without severely impacting the environment requires a new way of approaching food production - sustainable intensification.”

Wise words.

Aug 17, 2010
Resource Sterilisation endangers National Security

By Viv Forbes, Carbon Sense Coalition

Extreme conservation policies are sterilising so much of Australia’s resources that this is becoming a threat to our national security. Most wars are about land and resources.

In the colonial era, aggressive Europeans swarmed into Africa, the Americas and Australia, attracted by underused land, minerals and timber. More recently, Hitler invaded Eastern Europe and Russia in search of “living space” and access to Black Sea oil, and Japan went to war attracted by the resources of South East Asia and Australia.

Australia is the odd man out in Asia - a huge land mass with a small population. Our populous and rapidly developing northern neighbours need the primary products that Australia has in abundance - food, fibres, minerals and energy. So they note with disbelief the way in which Australia is sterilising these valuable resources.

They see precious agricultural and forest land being swallowed by National Parks, World Heritage Reservations, Environmental Parks, Wild Rivers Declarations, Indigenous reservations and bans on land clearing. Unbelievably we have nine protected Wild Rivers, 11 World Heritage properties, 516 National Parks, 2,700 designated conservation areas and huge areas of government leasehold and aboriginal land. The latest proposal is a continuous conservation corridor running from Melbourne to Atherton. In all of these areas, agricultural and mining production are prohibited or increasingly restricted.

Our neighbours look on in amazement as foresters are locked out of State Forests, water courses become no-go zones for graziers and irrigation water is withdrawn from farmers and orchardists. Soon the whole Coral Sea will be locked up and beaches made off limits to fishermen. Future Australians are in danger of becoming a nation of peasants, poachers and smugglers in their own land.

Asia needs our abundant energy resources of coal, gas, oil shale and uranium. But our neighbours watch in disbelief as uranium mining is banned, gas is wasted in power generation, mining taxes are increased and there are threats to tax carbon and close our coal mines and power stations. History offers no example of a small number of self-indulgent people managing to squat on valuable land and idle resources forever. And our historic protectors are no longer invincible - the Royal Navy no longer controls the Indian Ocean or the South China Sea, and the US Navy is no longer unchallenged in the Pacific.

Today the refugee flotilla is unarmed. If we continue sterilising our resources of land, oceans, food, minerals and energy, future fleets may not submit peacefully to Australian boarding parties.

Farming Carbon Credits

The ALP has offered to buy votes from Australian farmers with carbon credits for growing trees. Farmers would be more impressed if the ALP offered to pay for the millions of carbon credits recently stolen from them using tree clearing bans and other land use restrictions. And once all farmland is covered by carbon credit forests, what shall we eat?

More Green Energy Dreams:

Zero Carbon Australia by 2020 Plan.

A Melbourne group calling itself “Beyond Zero Emissions” has produced a plan, “Zero Carbon Australia - Stationary Energy Plan”: no coal power, petrol cars, diesel trucks or air trips. (Presumably they are also going to stop exhaling by 2020.)

This will be the new bible for the anti-carbon energy crowd.

This plan has been evaluated by competent energy people who have revised the assumptions and cost estimates. They conclude: the ZCA2020 Stationary Energy Plan has significantly underestimated the cost and time scale required to implement such a plan. Our revised cost estimate is nearly five times higher than the estimate in the Plan: $1,709 billion compared to $370 billion.  The cost estimates are highly uncertain with a range of $855 billion to $4,191 billion for our estimate.

The wholesale electricity costs would increase nearly 10 times above current costs to $500/MWh, not the $120/MWh claimed in the Plan. The total electricity demand in 2020 is expected to be 44% higher than proposed: 449 TWh compared to the 325 TWh presented in the Plan.

The Plan has an inadequate reserve capacity margin to ensure network reliability remains at current levels. The total installed capacity needs to be increased by 65% above the Plan’s proposal, to 160 GW compared with the 97 GW used in the Plan. The Plan’s implementation timeline is unrealistic. We doubt any solar thermal plants of the size and availability proposed in the Plan will be online before 2020. We expect only demonstration plants will be built until there is confidence that they can be economically viable.
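For readers who want to check the headline arithmetic, here is a minimal sketch using only the figures quoted above from the critique (the figures are theirs; the ratio check is ours):

```python
# Back-of-envelope check of the ratios quoted from the ZCA2020 critique.
# All inputs are the figures quoted above; nothing is independently estimated here.
plan_cost_bn, revised_cost_bn = 370, 1709        # $ billion
plan_capacity_gw, revised_capacity_gw = 97, 160  # GW of installed capacity

cost_ratio = revised_cost_bn / plan_cost_bn                     # ~4.6x ("nearly five times")
capacity_increase = revised_capacity_gw / plan_capacity_gw - 1  # ~0.65 (+65%)

print(f"Revised cost is {cost_ratio:.1f} times the Plan's estimate")
print(f"Installed capacity must rise by {capacity_increase:.0%} above the Plan")
```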

The Plan relies on many unsupported assumptions, which we believe are invalid; two of the most important are:

1. A quote in the Executive Summary: “The Plan relies only on existing, proven, commercially available and costed technologies.”
2. Solar thermal power stations with the performance characteristics and availability of baseload power stations exist now or will in the near future.

See here and here.

The Last Word on the Election.

If the ALP/Green Coalition wins this election, carbon taxes and emissions trading will suddenly rise from the dead. Our advice remains the same:

Number every square.

Put Climate Sceptics first and the Greens last.

Make your own choices from there on, but the Nationals, some Liberals and most of the other minor parties are strongly opposed to carbon Ration-N-Tax Schemes.  (We forgot to mention One Nation as another group sceptical of the idea that man’s production of carbon dioxide has harmful effects on anything.)

Aug 16, 2010
July Spot Check

By Dr. Richard Keen

Last month I sent a “spot check” comparing June’s measured temperature anomaly at the Coal Creek Canyon, Colorado, co-op station with NCDC’s analyzed temperature departure map for the month.  Interpolating the NCDC map to my location gave an analyzed departure of +4F, compared with the actual station departure of only +1.0F.

Here’s July’s map from NCDC’s site.

image

On the map, Coal Creek Canyon is above the 2F contour, with an analyzed departure of about 2.5 or 3 degrees F.  And the actual observed departure?  +0.3 degrees F.  Once again, NCDC adds 2 or 3 degrees to the observed temperature anomaly.

Could it just be this one station?  No, since another mountain site with a long record about 15 miles to my north (Jamestown) had a monthly departure of -0.5F (yes, Minus).  And the Denver airport site had +1.0F.

The NOAA headlines announce this as the 2nd warmest July on record nationally, but at Coal Creek July came in 12th place, out of 28 years of record.  July 2003 was 6 degrees warmer than this past July, and July 1992 (the year after Pinatubo) was 4 degrees cooler.
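For anyone who wants to run the same kind of spot check on their own co-op station, a minimal sketch of the departure-and-rank arithmetic might look like this (the station values below are placeholders, not the actual Coal Creek Canyon record):

```python
# Sketch of a station "spot check": the month's departure from the
# period-of-record mean, plus its rank among all years on record.
# The numbers below are placeholders, not Coal Creek Canyon data.
july_means_f = {1983: 66.2, 1992: 61.5, 2003: 71.4, 2010: 65.8}  # year -> mean July temp (F)

period_mean = sum(july_means_f.values()) / len(july_means_f)
departure_2010 = july_means_f[2010] - period_mean

ranked = sorted(july_means_f, key=july_means_f.get, reverse=True)  # warmest first
rank_2010 = ranked.index(2010) + 1

print(f"July 2010 departure: {departure_2010:+.1f} F")
print(f"July 2010 ranks #{rank_2010} of {len(july_means_f)} Julys on record")
```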

I’ll keep checking this in future months, but I suspect the story will remain the same, with mystery adjustments inflating the temperature departures in one direction only.

----------------

Mid-Season Update on the 2010 Hurricane Season
By Joseph D’Aleo

Drs. Gray and Klotzbach, NOAA and many in private industry have been (and still are) predicting a very active Atlantic tropical season. Dry, dusty Saharan air and a confused pressure and upper-air pattern have limited activity to date. But usually in summers after El Nino winters with a warm Atlantic, activity does increase, especially after mid-August. We may not reach the lofty levels of some early forecasts, as the pressure patterns are not yet ideal, but don’t write off the season just yet, as Dr. Klotzbach opined here.

Update: See Dr. Lupo’s take on the early season here.

Since 1995, the Atlantic has been twice as active on average as in the prior 25 years, similar to the period from the 1930s to the 1960s. This is due to a shift to the “warm” mode of the multi-decadal oscillation in the Atlantic Ocean. Most of the storms making landfall since 1995 have impacted the Mid-Atlantic region, Florida and the Gulf of Mexico. However, though not yet realized, history tells us that the risk has also increased for more populated areas to the north (Long Island and New England) (below, enlarged here and here).

image

image

Note the increase in the number of stronger storms after the flip back to the warm mode in 1995 (below, enlarged here and here).

image

image

Bill Gray has shown that the number of landfalling storms along the southeast coast increases in the warm mode (above). The same can be said for the Atlantic coast in general, as shown below (enlarged here).

image

The landfall depends on the strength and position of the Bermuda High in the Atlantic. A tropical system will tend to turn north at the first opportunity, such as a weakness in the Bermuda High. The stronger and farther west the high extends, the farther west the storm moves before turning or making landfall (enlarged here).

image

El Ninos tend to produce fewer storms by increasing the shear in the Atlantic, which is fed by increased Pacific tropical activity (enlarged here).

image

Those that survive often recurve out to sea or escape the shear by taking a more southern route through the Caribbean to the Gulf, where most make landfall (enlarged here).

image

La Ninas, on the other hand, bring less shear, more storms and more landfalls. Weak pressure along the east coast favors more landfalls (enlarged here).

image

See the widespread spray of storms in the late summer and fall of La Nina years (enlarged here).

image

The La Nina landfalls are in the Gulf, Southeast Coast and the Northeast/New England.

THE EFFECT OF THE PACIFIC DECADAL OSCILLATION

The PDO affects the relative frequency of El Nino and La Nina. The PDO turned to its colder mode in the late 1990s, which tends to mean more La Ninas, as in the last cold mode from 1947 to 1977 (enlarged here).

image

See clearly the ENSO frequency shift with the PDO phases (enlarged here).

image

The risk to the east coast north of Cape Hatteras is greatest during La Nina years when the PDO is negative and the Atlantic is warm (this summer!).
See how New England landfalls vary with the PDO (enlarged here).

image

La Nina Years Occurring with Warm Atlantic Summers

* 1938 Hurricane of ‘38 (CAT 5) New York and New England
* 1950 Hurricane Easy (CAT 3) Florida, Hurricane King (CAT 3) Florida
* 1954 Hurricane Carol (CAT 3) New York and New England, Hurricane Edna (CAT 3) New England, Hurricane Hazel (CAT 4) Mid-Atlantic and northeast
* 1955 Hurricane Connie (CAT 3) NC, VA, NY, New England Flooding, Hurricane Diane (Cat 1) NC, New England Flooding
* 1960 Hurricane Donna (CAT 4) FL, (CAT 3) NY, New England
* 1989 Hurricane Hugo (CAT 4) SC
* 1996 Hurricane Bertha (CAT 2) NC, Hurricane Fran (CAT 3) NC
* 1998 Hurricane Bonnie (CAT 2) NC
* 1999 Hurricane Floyd (CAT 2/3) NC
15 landfalling storms in these 9 years! 11 were major hurricanes, and 9 affected the northeast.

HOW THE TROPICAL ATLANTIC WARMED

Some of the warmth was due to a lack of activity last hurricane season. Hurricanes act to extract excess tropical ocean warmth (in the form of sensible and latent heat) and transfer it poleward (one of the compensation mechanisms that bring about equilibrium). Ironically, the warmth was also a result of the record negative AO and the very cold Northern Hemisphere winter in the southern US, Europe, Russia and China. The record negative AO suppressed the jet streams and subtropical highs in both oceans (enlarged here).

image

Lower wind speeds and reduced cloudiness due to subsidence in the deep subtropics led to warm water near the surface (bottom right) (enlarged here).

image

The very latest Atlantic sea surface temperature analysis from the Hurricane Center shows the warm water that will help feed the activity (enlarged here).

image

So all along the Gulf and East Coast, residents should remain wary and watchful over the next few months and not be lulled into a false sense of security by the deafening quiet this season thus far. Though the current conditions, with an unfavorable Madden-Julian Oscillation, a confused upper pattern and dry, dusty air patches, are still impeding development, that is likely to change. Even if we end up at the lower end of the range of the forecast number of storms, one major landfall would make for a memorable season (as the Hurricane of ‘38 did).  See the preliminary late May post here.

See Accuweather’s Joe Bastardi’s forecast here.

Aug 16, 2010
Tropical sea surface temperatures respond to natural changes in surface pressure across the globe

By Erl Happ

Tropical sea surface temperatures respond to the change in surface pressure across the globe, and in particular to the differential between the mid-latitudes and the near-equatorial zone. The southern hemisphere, and its high latitudes in particular, experiences marked flux in surface pressure. This leads directly to a change in the trade winds and tropical sea surface temperature.

There is an asymmetry between the hemispheres with loss of pressure in the southern hemisphere compensated to some extent by a gain in pressure in the northern hemisphere.

If we wish to understand the ENSO phenomenon we must look beyond the tropics for causal factors. ENSO in the Pacific is just one facet of change in the tropics. Change is driven by air pressure variations at mid and especially high latitudes. This determines the strength of the trade winds and the temperature of the tropical ocean (where solar insolation is greatest and cloud cover is least). There are knock-on effects for heat transfer from the tropics to mid and high latitudes, and for rainfall, flood, drought and tropical cyclone activity worldwide. The tropical oceans represent the Earth’s solar array.

The flux in surface pressure appears to be cyclical. However, the cycle is longer than the sixty years of available data. We cannot say for sure what the cycle length may be or how it varies over time. Nevertheless, there is good evidence that the warming cycle that began in 1978 and peaked in 1998 is now at an end.

We must acknowledge that the ENSO cycle is not temperature neutral. There are short ENSO cycles of just a few years and long ENSO cycles that are longer than 60 years (enlarged here).

image

Is there evidence that the activity of man (adding CO2 to the atmosphere) is tending to produce more severe El Nino events? The answer is no. The flux in surface pressure is responsible for ENSO and for the swing from El Nino to La Nina dominance. In spite of the activities of man, the globe is currently entering a La Nina cooling cycle, testifying to the strength of natural cycles and the relative unimportance of atmospheric composition in determining the issue (if the much-touted greenhouse effect exists at all).

Is there evidence that the ENSO phenomenon is in fact ‘climate change in action’, driven by factors other than the increase of atmospheric CO2? Yes, it appears that whatever drives the flux in surface atmospheric pressure drives ENSO and with it, climate change.

Is recent ‘Climate Change’ driven by greenhouse gas activity? No, it appears that the cause of recent warming and cooling relates to long-term swings in atmospheric pressure that change the pressure relations between mid and low latitudes, thereby affecting the trade winds that in turn determine the temperature of the Earth’s solar array, its tropical ocean, and ultimately the globe as a whole.

Aug 15, 2010
BREAKING: New paper makes a hockey sticky wicket of Mann

By Anthony Watts

“Sticky wicket” - phrase, meaning: “A difficult situation”.

Oh, my. There is a new and important study on temperature proxy reconstructions (McShane and Wyner 2010), submitted to the Annals of Applied Statistics and listed to be published in the next issue. According to Steve McIntyre, this is one of the “top statistical journals”. This paper is a direct and serious rebuttal to the proxy reconstructions of Mann. It seems watertight on the surface because, instead of trying to attack the proxy data quality issues, the authors assumed the proxy data were accurate for their purpose, then created a Bayesian backcast method. Then, using the proxy data, they demonstrate that it fails to reproduce the sharp 20th century uptick.
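To make the idea of a backcast concrete, here is a heavily simplified sketch of the general workflow: calibrate a statistical model of temperature on the proxies over the instrumental period, then run it backwards over the pre-instrumental period. It uses ordinary ridge regression as a stand-in for the paper’s Bayesian model, and random numbers in place of the real proxy and CRU series, so it illustrates the mechanics only, not the paper’s actual method or results:

```python
# Simplified proxy "backcast" sketch (NOT McShane & Wyner's actual model):
# fit temperature-on-proxies over the instrumental era, then predict the
# pre-instrumental era from the proxies alone. All data here are synthetic.
import numpy as np
from sklearn.linear_model import Ridge

rng = np.random.default_rng(0)
years = np.arange(1000, 1999)                           # 1000-1998 AD
proxies = rng.standard_normal((len(years), 30))         # stand-in proxy matrix
temperature = 0.3 * proxies[:, 0] + 0.1 * rng.standard_normal(len(years))

calibration = years >= 1850                             # instrumental period
model = Ridge(alpha=1.0).fit(proxies[calibration], temperature[calibration])

backcast = model.predict(proxies[~calibration])         # 1000-1849 AD "reconstruction"
print(f"Backcast mean, 1000-1849 AD: {backcast.mean():.3f}")
```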

Now, there’s a new look to the familiar “hockey stick”.

Before:

image
Enlarged here.. Multiproxy reconstruction of Northern Hemisphere surface temperature variations over the past millennium (blue), along with 50-year average (black), a measure of the statistical uncertainty associated with the reconstruction (gray), and instrumental surface temperature data for the last 150 years (red), based on the work by Mann et al. (1999). This figure has sometimes been referred to as the hockey stick. Source: IPCC (2001).

After:

image
Enlarged here.. Backcast from Bayesian Model of Section 5. CRU Northern Hemisphere annual mean land temperature is given by the thin black line and a smoothed version is given by the thick black line. The forecast is given by the thin red line and a smoothed version is given by the thick red line. The model is fit on 1850-1998 AD and backcasts 998-1849 AD. The cyan region indicates uncertainty due to t, the green region indicates uncertainty due to β, and the gray region indicates total uncertainty.

Not only are the results stunning, but the paper is highly readable, written in a sensible style that most laymen can absorb, even if they don’t understand some of the finer points of Bayesian methods, loess filters or principal components. Moreover, this paper is a confirmation of McIntyre and McKitrick’s work, with a strong nod to Wegman. I highly recommend reading this and distributing this story widely.

Here’s the submitted paper:

A Statistical Analysis of Multiple Temperature Proxies: Are Reconstructions of Surface Temperatures Over the Last 1000 Years Reliable? (PDF)

image
Enlarged here..
FIG 15. In-sample Backcast from Bayesian Model of Section 5. CRU Northern Hemisphere annual mean land temperature is given by the thin black line and a smoothed version is given by the thick black line. The forecast is given by the thin red line and a smoothed version is given by the thick red line. The model is fit on 1850-1998 AD.

We plot the in-sample portion of this backcast (1850-1998 AD) in Figure 15. Not surprisingly, the model tracks CRU reasonably well because it is in-sample. However, despite the fact that the backcast is both in-sample and initialized with the high true temperatures from 1999 AD and 2000 AD, it still cannot capture either the high level of or the sharp run-up in temperatures of the 1990s. It is substantially biased low. That the model cannot capture run-up even in-sample does not portend well for its ability to capture similar levels and run-ups if they exist out-of-sample.

Conclusion.

Research on multi-proxy temperature reconstructions of the earth’s temperature is now entering its second decade. While the literature is large, there has been very little collaboration with university-level, professional statisticians (Wegman et al., 2006; Wegman, 2006). Our paper is an effort to apply some modern statistical methods to these problems. While our results agree with the climate scientists’ findings in some respects, our methods of estimating model uncertainty and accuracy are in sharp disagreement.

On the one hand, we conclude unequivocally that the evidence for a “long-handled” hockey stick (where the shaft of the hockey stick extends to the year 1000 AD) is lacking in the data. The fundamental problem is that there is a limited amount of proxy data which dates back to 1000 AD; what is available is weakly predictive of global annual temperature. Our backcasting methods, which track quite closely the methods applied most recently in Mann (2008) to the same data, are unable to catch the sharp run up in temperatures recorded in the 1990s, even in-sample.

As can be seen in Figure 15, our estimate of the run up in temperature in the 1990s has a much smaller slope than the actual temperature series. Furthermore, the lower frame of Figure 18 clearly reveals that the proxy model is not at all able to track the high gradient segment. Consequently, the long flat handle of the hockey stick is best understood to be a feature of regression and less a reflection of our knowledge of the truth. Nevertheless, the temperatures of the last few decades have been relatively warm compared to many of the thousand year temperature curves sampled from the posterior distribution of our model.

Our main contribution is our efforts to seriously grapple with the uncertainty involved in paleoclimatological reconstructions. Regression of high dimensional time series is always a complex problem with many traps. In our case, the particular challenges include (i) a short sequence of training data, (ii) more predictors than observations, (iii) a very weak signal, and (iv) response and predictor variables which are both strongly autocorrelated.

The final point is particularly troublesome: since the data is not easily modeled by a simple autoregressive process it follows that the number of truly independent observations (i.e., the effective sample size) may be just too small for accurate reconstruction.

Climate scientists have greatly underestimated the uncertainty of proxy based reconstructions and hence have been overconfident in their models. We have shown that time dependence in the temperature series is sufficiently strong to permit complex sequences of random numbers to forecast out-of-sample reasonably well fairly frequently (see, for example, Figure 9). Furthermore, even proxy based models with approximately the same amount of reconstructive skill (Figures 11,12, and 13), produce strikingly dissimilar historical backcasts: some of these look like hockey sticks but most do not (Figure 14).

Natural climate variability is not well understood and is probably quite large. It is not clear that the proxies currently used to predict temperature are even predictive of it at the scale of several decades, let alone over many centuries. Nonetheless, paleoclimatological reconstructions constitute only one source of evidence in the AGW debate. Our work stands entirely on the shoulders of those environmental scientists who labored untold years to assemble the vast network of natural proxies. Although we assume the reliability of their data for our purposes here, there still remains a considerable number of outstanding questions that can only be answered with a free and open inquiry and a great deal of replication.
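The excerpt’s point about autocorrelation shrinking the effective sample size can be made concrete with a textbook rule of thumb for an AR(1) series (our illustration, not a formula from the paper): n_eff ≈ n(1 - ρ)/(1 + ρ), where ρ is the lag-1 autocorrelation.

```python
# Rule-of-thumb effective sample size for an AR(1) series with lag-1
# autocorrelation rho: n_eff ~= n * (1 - rho) / (1 + rho).
# Illustrative only; this is not a calculation from McShane & Wyner (2010).
def effective_sample_size(n: int, rho: float) -> float:
    return n * (1 - rho) / (1 + rho)

n_years = 149  # e.g. an 1850-1998 calibration window
for rho in (0.0, 0.5, 0.9):
    print(f"rho = {rho:.1f}: effective sample size ~= {effective_sample_size(n_years, rho):.0f}")
```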

Read much more and the comments on WUWT here.
----------

Commenters on WUWT report that Tamino and Romm are deleting comments that even mention this paper on their blog comment forums. Their refusal to even acknowledge it tells you it has squarely hit the target, and the fat lady has sung - loudly.
