Political Climate
Mar 11, 2010
Barcelona hit with heaviest snowfall in 25 years

UK Telegraph

Snowfalls of up to 50 centimetres (20 inches) were forecast for the worst affected areas of the region of Catalonia, prompting the regional government to cancel classes for more than 142,000 students at 476 public schools.

Power was lost in homes throughout the region, with energy company Fecsa-Endesa reporting 200,000 clients without electricity, mostly in the province of Girona.

image

Emergency services workers helped evacuate some 500 passengers who became trapped on trains traveling between Barcelona and Portbou, on the French border, which became stuck due to the lack of power, said regional interior minister Joan Boada.

Thousands of commuters were left scrambling for an alternative way to get home after the blizzard forced the suspension of bus services in Barcelona and the closure of five suburban train lines in the Mediterranean port city.

Barcelona city hall ordered the metro system to stay open all night to help people move around the city. Traffic on over 60 roads in Catalonia was either prohibited or restricted. Spain’s border with France at La Jonquera was closed because of the snow, leaving some 4,000 trucks stranded, public television TVE reported.

While Barcelona’s El Prat airport was operating normally, 21 flights out of the airport in nearby Girona were cancelled and nine others were diverted to other cities due to the snow and strong winds, airport officials said. See story here.

See here how the worst snow since 1962/63, falling on March 8, brought Barcelona to a complete collapse. Spring is around the corner, yet it was minus 6ºC last night in Madrid. The region of Catalonia had its worst snowstorm in 25 years. Barcelona received enough snow to paralyze traffic, and its beaches were covered in the white stuff. It is worse in the rest of the region, which is struggling under up to 60 cm of snow (more in the mountains). Snow tires are unknown in the country, so stranded drivers struggle to put on chains (not an easy job, and your fingers freeze quickly while doing it). Thirty-three high-tension pylons collapsed under high winds, which also sent 7-meter-high waves crashing onto the coast. Two hundred thousand homes are still without electricity. Thousands of trucks were stuck on the roads, and traffic jams on the motorway to France stretched 50 km. Snow in the streets turned to ice, causing multiple accidents and making sidewalks dangerous to walk on.

This is the third serious white winter in a row for northern Spain. Further south, in Andalusia, it has been raining for three months, and they hadn't seen so much water in decades. The countryside is flooded in many places, and some old houses have collapsed. Locals hadn't experienced such inundations since the great flood of 1963. See more here and here.

See in this story how the snow closed the border between France and Spain. Here is a photo from Elne in southern France.
image

----------------------

European Storms
By World Climate Report

The winter of 2009-2010 has produced its fair share of winter storms in the Northern Hemisphere - recall that President Obama arrived back in Washington from his appearance at the Copenhagen climate conference only to find the White House grounds buried under near-record amounts of snow. Europe and Asia have seen their share of large winter storms as well during the 2009-2010 winter. Hardly a large storm goes by without someone, somewhere suggesting that whatever we are seeing, it is related to "climate change". If one looked no further than the Technical Summary of the IPCC, one would discover that the IPCC is rather quiet on this subject, making no claims whatsoever that winter storms will increase in frequency, magnitude, duration, or intensity due to the ongoing changes in atmospheric composition.

Two new articles are out that further confirm that global warming has not caused, and will not cause, mid-latitude winter storms to become some new destructive consequence of the greenhouse effect.

The first is by two scientists from Sweden and Poland; the article appears in the International Journal of Climatology, and the work was funded by the European Community. The title lets us know the authors are dealing with Scandinavian storminess back to 1780, and one line in the abstract caught our attention: "We find pronounced interdecadal variability in cyclonic activity but no significant overall consistent long-term trend."

Barring and Fortuniak begin their piece noting that “Long-term variability in extra-tropical cyclone frequency and strength is at the centre of international attention, both because of scientific theoretical analysis of climate change and because of socio-economic consequences of potential changes in storminess activity.” They state that these storms cause severe damage to property and forests and that storm surges associated with intense storms are damaging as well. They quite correctly remind us that “The question is whether changes to such storminess characteristics are a result of changes in frequency and intensity of deep cyclones in exposed regions. The essential problem is thus if any changes to cyclone activity are within natural variability or not, that is, the classical problem of climate change detection. As intense cyclones and severe windstorms are comparatively rare events, long-term records are required to capture the natural variability.”

We completely agree with their assessment - long-term records provide a necessary dimension of historical perspective and can often make any recent trends look very different. Barring and Fortuniak note that “Several studies using reanalysis data covering the second half of the 20th century suggest increasing storm intensity in the northeastern Atlantic and European sector.” So to extend the record back in time, they examined the thrice-daily sea level atmospheric pressure measurements taken in Stockholm and Lund. Their statistical methodology was complicated, but they showed that their method could clearly identify large storms in the past.

The authors present many graphs of storm activity, but the one below is probably the best for capturing temporal variations in storm activity. The graph shows principal component scores taken from the Lund and Stockholm datasets; the values are standardized to a mean of zero and a standard deviation of one, so large positive values mark periods of high storm activity and large negative values mark periods of low storm activity. Imagine if the dataset began in 1950: the upward trend would lack the necessary context and perspective and might seem unusual, which, as the longer perspective shows, it is not.

image
Figure 1. Time evolution of principal component scores from the Lund and Stockholm datasets (from Barring and Fortuniak, 2009) enlarged here.
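The standardization behind the figure can be sketched in a few lines. The index values below are hypothetical stand-ins; the actual study derives its indices from the Lund and Stockholm pressure records:

```python
import numpy as np

# Hypothetical annual storm-index values (arbitrary units), standing in
# for the pressure-based indices of Barring and Fortuniak (2009).
index = np.array([3.1, 2.4, 5.0, 4.2, 1.8, 2.9, 6.3, 3.7])

# Standardize to mean 0 and standard deviation 1, as in the figure;
# positive scores then flag stormier-than-average periods.
scores = (index - index.mean()) / index.std()

print(scores)
```

On a standardized series like this, any year's score reads directly as "how many standard deviations above or below the long-term mean", which is what makes the 1780-2005 record comparable across centuries.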

Barring and Fortuniak do not waste any words with their conclusions (the Dalton minimum occurred between 1790 and 1830, and was a time of low sunspot activity). They state “(1) There is no significant overall long-term trend common to all indices in cyclone activity in the North Atlantic and European region since the Dalton minimum. (2) The marked positive trend beginning around 1960 ended in the mid-1990s and has since then reversed. This positive trend was more an effect of a 20th century minimum in cyclone activity around 1960, rather than extraordinary high values in 1990s. Both the 1960s minimum and the 1990s maximum were within the long-term variability. (3) Because the period between the 1960s minima and the 1990s maxima spans a substantial part of the period covered by most reanalysis datasets, any analysis relying solely on such data is likely to find trends in cyclone activity and related measures.”

Our next article on the subject was published recently in Quaternary International by scientists with the UK's University of Nottingham and Loughborough University. Their interest lies with western European coastlines and storms over a variety of timescales. Clarke and Rendell begin by stating "There is growing evidence that periods of sand drift and dune development provide proxy records of the impacts of storms in coastal areas". Furthermore, "An understanding of the patterns of past storminess is particularly important in the context of future anthropogenically driven climate change, with predictions of increased storm frequency and sea level rise by the end of the current century". Once again, someone out there is suggesting an increase in storm frequency thanks to global warming.

They examine many other studies in their review article and conclude (LIA is the Little Ice Age): "The analysis of documentary records, discontinuous instrumental data and proxy records indicate that the period of the LIA (AD 1570–1900) included periods of enhanced storminess relative to present. This increased storminess coincided with numerous episodes of sand drift and dune building along the western European coast, as demonstrated by both documentary records and independent dating of sand movement." Furthermore, Clarke and Rendell found that "The Holocene record of sand drift in western Europe includes episodes of movement corresponding to periods of Northern Hemisphere cooling, particularly 8.2 ka, and provides the additional evidence that these periods, like the LIA, were also stormy".

In other words, these articles suggest that it is the cooler periods in Europe that were stormier, not the warmer ones.

References:

Barring, L. and K. Fortuniak. 2009. Multi-indices analysis of southern Scandinavian storminess 1780–2005 and links to interdecadal variations in the NW Europe - North Sea region. International Journal of Climatology, 29, 373-384.

Clarke, M.L. and H.M. Rendell. 2009. The impact of North Atlantic storminess on western European coasts: A review. Quaternary International, 195, 31-41. 



Mar 11, 2010
Independent body to review UN climate panel

Australian Broadcasting Corporation

United Nations chief Ban Ki-Moon has announced a respected international body will conduct an independent review of UN climate science after a global warming report was found to have errors.

The Intergovernmental Panel on Climate Change (IPCC) has admitted its 2007 report exaggerated the pace of Himalayan glaciers melting. But Mr Ban says the errors should not affect the conclusion that human activities are changing the climate and that greenhouse gas emissions should be cut urgently.

“The threat posed by climate change is real, and nothing that has been alleged or revealed in the media recently alters the fundamental scientific consensus on climate change,” he said. Mr Ban told reporters that the Amsterdam-based InterAcademy Council (IAC), which groups presidents of 15 leading science academies, will carry out the task “completely independently of the United Nations”.

The IPCC’s chairman, Rajendra Pachauri, says he hopes the inquiry will restore public trust. “In recent months we have seen some criticism,” he said.
“We are receptive and sensitive to that and we are doing something about it. I am very grateful that the secretary-general has very kindly supported this initiative.” Dr Pachauri pledged that an upcoming fifth assessment report by the IPCC would be “stronger and better than anything we have produced in the past”.

image

The IAC’s co-chairman, Robbert Dijkgraaf, told reporters the panel aims to present its report by the end of next August so that governments can consider it ahead of key climate change meetings late this year. He says the review will focus on what procedures were used. “It will definitely not go over all the data, the vast amount of data in climate science,” he said. “What it will do [is] it will see what the procedures are and how they can be improved. So looking forward, how can we avoid that certain types of errors are made.” See post here.



Mar 11, 2010
Klotzbach et al. Paper - Further Explored

By Joseph D’Aleo

Our recent SPPI paper covered the many issues with the data, including station dropout, missing data, bad siting (largely the result of the modernization), instrument biases, and the adjustments, which dozens of peer-reviewed papers show are important and which many show could account for up to 50% of the claimed warming since 1900. See the response to NOAA and the EPA that includes this here.

The station dropout issue is not new. I wrote about it in the 1990s in the first-generation Intellicast blog, and in a 2001 story Dr. Doug Hoyt, a former NOAA and NASA scientist and later chief scientist at Raytheon, put it this way: "support for this idea comes from the fact that 135 stations in the USSR ceased observing at the end of 1989. Subsequently there appeared to be a warming in the USSR but this warming is not supported by pressure observations. Thus, it appears half or more of the reported global warming from ground observations is arising from this change in station coverage. It is possible that as much as 0.2 C of the 0.25 C warming for 1979-1999 can be explained by this change in stations, although more study is required to refine this number."
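The mechanism Hoyt describes can be illustrated with a deliberately simplified sketch. The numbers below are toy values, not real station data, and real analyses work with per-station anomalies rather than raw temperatures; the point is only that removing cold stations from a naive network average shifts the mean with no climate change at all:

```python
import numpy as np

# Toy network: three cold-region and three warm-region stations, each
# with zero true warming trend (mean annual temperatures, deg C).
cold = np.array([-12.0, -10.0, -11.0])
warm = np.array([14.0, 15.0, 16.0])

# Naive network mean before and after the cold stations stop reporting.
before = np.concatenate([cold, warm]).mean()
after = warm.mean()

print(before, after)  # the network mean jumps, yet no station warmed
```

Proper anomaly-based averaging is designed to avoid exactly this artifact, which is why the debate centers on whether the homogenization and infilling procedures fully succeed when coverage changes.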

PEER REVIEW SUPPORT FOR SURFACE DATA ISSUES

When the satellites were first launched, their temperature readings were in better agreement with the surface station data; the divergence has increased over time, as can be seen below (derived from Klotzbach et al., 2009). In the first plot, we see the temperature anomalies as computed from the satellites and assessed by UAH and RSS, and the station-based land surface anomalies from NOAA/NCDC. The increasing divergence is clear from the figure below (enlarged here).

image

The Klotzbach paper finds that the divergence between surface and lower-tropospheric trends is consistent with evidence of a warm bias in the surface temperature record but not in the satellite data (below, enlarged here).

image
NOAA annual land temperatures minus annual UAH lower troposphere (blue line) and NOAA annual land temperatures minus annual RSS lower troposphere (green line) differences over the period from 1979 to 2008
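The difference series of this kind is a simple diagnostic to compute. The sketch below uses made-up anomaly values, not the actual NOAA or UAH data, to show the calculation and its linear trend:

```python
import numpy as np

# Hypothetical annual temperature anomalies (deg C), standing in for a
# surface land series and a lower-troposphere satellite series.
years = np.arange(1979, 1989)
surface = np.array([0.05, 0.10, 0.08, 0.15, 0.12, 0.20, 0.18, 0.25, 0.22, 0.30])
satellite = np.array([0.04, 0.09, 0.06, 0.12, 0.10, 0.15, 0.13, 0.18, 0.16, 0.21])

# Divergence diagnostic: surface minus lower troposphere. A rising
# difference is what Klotzbach et al. read as consistent with a warm
# bias in the surface record.
diff = surface - satellite

# Least-squares linear trend of the difference (deg C per year).
slope = np.polyfit(years, diff, 1)[0]
print(f"divergence trend: {slope:.4f} C/yr")
```

A positive slope in the difference series, sustained over decades, is the signature the paper's figures are plotting.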

Klotzbach et al. described an 'amplification' factor for the lower troposphere, as suggested by Santer et al. (2005) and Santer et al. (2008), due to greenhouse gas trapping relative to the warming at the surface. Santer refers to the effect as "tropospheric amplification of surface warming". This effect is a characteristic of all of the models in the UN IPCC and USGCRP "ensemble" of models of Karl et al. (2006), which was the source for Karl et al. (2009), which in turn was relied upon by the EPA in its recent Endangerment Finding (Federal Register / Vol. 74, No. 239 / Tuesday, December 15, 2009 / Rules and Regulations at 66510).

As John Christy describes it: "The amplification factor is a direct calculation from model simulations that show over 30 year periods that the upper air warms at a faster rate than the surface - generally 1.2 times faster for global averages. This is the so-called "lapse rate feedback" in which the lapse rate seeks to move toward the moist adiabat as the surface temperature rises. In models, the convective adjustment is quite rigid, so this vertical response in models is forced to happen. The real world is much less rigid and has ways to allow heat to escape rather than be retained as models show." This latter effect has been documented by Chou and Lindzen (2005) and Lindzen and Choi (2009).

The amplification factor was calculated from the mean and median of the 19 GCMs that were in the CCSP SAP 1.1 report (Karl et al, 2006).  A fuller discussion of how the amplification factor was calculated is available in the Klotzbach paper here.

The ensemble model forecast curve (upper curve) in the figure below (enlarged here) was calculated by multiplying the NOAA NCDC surface temperature for each year by the amplification factor, since this yields the model-projected tropospheric temperature. The lower curves are the actual UAH and RSS lower-tropospheric satellite temperatures.
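That construction is simple arithmetic. Here is a sketch using hypothetical surface anomalies and the roughly 1.2 global-average amplification factor quoted above (the paper derives its factor from the 19 CCSP SAP 1.1 models):

```python
import numpy as np

# Model-mean amplification factor for the lower troposphere relative to
# the surface (~1.2 for global averages, per the text).
AMPLIFICATION = 1.2

# Hypothetical annual NOAA NCDC surface anomalies (deg C), illustrative
# values only.
surface = np.array([0.10, 0.15, 0.12, 0.20, 0.25])

# Model-expected lower-tropospheric anomaly: surface warming scaled up
# by the amplification factor. Comparing this curve against the actual
# UAH/RSS series is the test shown in the figure.
expected_lt = AMPLIFICATION * surface

print(expected_lt)
```

If the troposphere warms by less than this scaled curve while the surface series keeps rising, either the surface record carries a warm bias or the modeled amplification is too strong, which is the fork the paper's argument turns on.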

image

The total divergence of the observed NOAA-minus-satellite temperature differences from the model-forecast trends is depicted in the figure below (enlarged here).

image

These figures strongly suggest that, instead of atmospheric warming from greenhouse effects dominating, surface-based warming due to factors such as urbanization and land-use changes is driving the observed changes. Since these surface changes are not adjusted for, neither trends from the surface networks nor forecasts from the models can be considered reliable.

This is why the NOAA and NASA press releases should be ignored. The surface based data sets have become seriously flawed and they and the climate models can no longer be trusted for climate trend assessment.


