Frozen in Time
Apr 08, 2011
Misleading language

Scientific Alliance

Use of language is one of the main factors that define humanity. At its best, it can not only express our deepest feelings and be a source of great beauty, but also put across complex concepts with clarity and lack of ambiguity. However, language can also be misused and be deliberately misleading. Most obviously, this is in the form of propaganda, but more subtle misuse can be just as bad. This is as true in the case of science as for politics, finance or other areas.

It is often assumed that misuse of a concept can change its meaning quite easily, by simple repetition. There are two ways of looking at this. Lenin is quoted as saying “a lie told often enough becomes the truth”, whereas Franklin Roosevelt took a different view when he said “repetition does not transform a lie into a truth”. Although apparently incompatible, each is equally valid in its own way. The Bolshevik view, unfortunately, tends to reflect real human behaviour: if people only hear a single view they tend - at least superficially - to accept it as the truth.

But Roosevelt’s more idealistic interpretation is equally well-founded because, although there may be general acceptance of an officially-sanctioned version of the truth, the fundamental reality does not change. Anyone who wants to look at the evidence rather than accept seemingly authoritative statements can discover the underlying truth for themselves.

Take, for example, the term ‘carbon dioxide pollution’, which has become commonplace. The Oxford dictionary defines pollution as ‘the presence in or introduction into the environment of a substance which has harmful or poisonous effects’. This seems fairly unambiguous, and the only argument about, for example, sub-micron carbon particulates in the air, copper and other heavy metals in the soil or harmful bacteria in water would be about the maximum acceptable level. There can be little doubt that each is a form of pollution and may be harmful.

Carbon dioxide, on the other hand, is vital to life on Earth. Without it, plants could not photosynthesise. Without photosynthesis, there would be no oxygen. Without oxygen, there would be no life apart from anaerobic bacteria. To consider it to be a pollutant therefore seems somewhat perverse.

The reason, of course, is that computer modelling based on the enhanced greenhouse hypothesis projects potentially significant increases in global temperatures, with major impacts on weather patterns and sea level which could compromise the lives of whole swathes of the population. And, although its contribution to warming is lower than that of water vapour, carbon dioxide is more persistent in the atmosphere, and all the evidence is that burning fossil fuels is causing a fairly consistent year-on-year increase in its atmospheric concentration.

For those who consider the enhanced greenhouse effect to be the most plausible explanation of the way the temperature record has evolved over the last decades (or even for those who are not wholly convinced but believe that the consequences of taking no action could be disastrous), it is a natural step to emphasise their view in language which the public understands and will not simply ignore. Hence, a small but steady increase in the atmospheric level of a trace gas essential for life has become ‘pollution’. Repeated often enough, this has become a term which is used unquestioningly, but the underlying facts are unchanged for those who care to look.

There are other examples, including ‘addiction’ to oil. Turning back to the Oxford dictionary, addicted is defined as ‘physically and mentally dependent on a particular substance’. In a narrow sense, modern societies could be seen as addicted to oil (or, more broadly, energy) since they are indeed physically dependent. But if we say this, we would have to agree that we are also addicted to food, warmth and oxygen. Nevertheless, politicians have brought the phrase into common use in an effort to promote a transition from fossil fuels to renewable sources of energy.

Use of renewable energy is a key part of the modern drive for sustainability. The appropriate dictionary definition of sustainable is ‘conserving an ecological balance by avoiding depletion of natural resources’, while according to the Brundtland Commission in 1987, ‘sustainable development is development that meets the needs of the present without compromising the ability of future generations to meet their own needs.’

This is a tricky concept, and one about which there is significant disagreement. In most circles, it is accepted that there are three primary components: environmental, social and economic. However, there are many people on the more radical wing of the environmental movement who believe that economic growth is in itself the problem and is intrinsically unsustainable. They envisage some post-industrial utopia and would like to see emerging economies such as China avoid the energy-dependent growth which the industrialised world has experienced (to the great benefit of their populations).

Even those who take a more balanced view of sustainability see progress occurring on a steady and pre-ordained path, with the future essentially being more of the same. Experience shows that life is not like that. Progress is catalysed by a series of disruptive innovations or events which change the nature of society. The evolution of farming was one, and arguably still the most significant. Harnessing the energy from coal, oil and gas was certainly another game-changer, and the rapid development of solid state electronics, computers and communications networks has been the most recent major trend to change our way of life fundamentally.

The concept of long-term sustainability is deeply flawed. Nevertheless, it embodies plenty of self-evident common sense in the short term. Farmers must maintain the health and productivity of their soil if they are to grow crops consistently year after year. Societies must ensure an adequate supply of clean water to cope with demands for the foreseeable future. They must also provide secure energy supplies to their populations, but this security is already being compromised by present moves towards so-called sustainable renewable energy sources.

The list of misleading language could go on. Its use is only likely to increase, as language is one of the most powerful weapons people can employ. The big question is whether the effect is as Lenin suggested, or whether FDR was closer to the truth. Are people genuinely misled, or do they make up their own minds if they see the evidence differently? Everyday conversations and consumer surveys would suggest that in many cases Roosevelt was - thankfully - more accurate.

But this should not make us blind to the dangers of simply taking news stories or political speeches at face value. In democratic countries, there seems little danger of governments deliberately taking us down a path towards some kind of Orwellian Newspeak, but there is an insidious focus on ‘correct’ terminology from a range of interest groups. The lesson for all of us must be to look behind the words.

Apr 06, 2011
Another Active Hurricane Season - with impact potential

Forecast Update - April 6, 2011 by Dr. Phil Klotzbach and Dr. William Gray, CSU

We continue to foresee well above-average activity for the 2011 Atlantic hurricane season. Our seasonal forecast has been reduced slightly from early December, since there is a little uncertainty about ENSO and the maintenance of anomalously warm tropical Atlantic SST conditions. We continue to anticipate an above-average probability of United States and Caribbean major hurricane landfall.


Information obtained through March 2011 indicates that the 2011 Atlantic hurricane season will have significantly more activity than the average 1950-2000 season.

We estimate that 2011 will have about 9 hurricanes (average is 5.9), 16 named storms (average is 9.6), 80 named storm days (average is 49.1), 35 hurricane days (average is 24.5), 5 major (Category 3-4-5) hurricanes (average is 2.3) and 10 major hurricane days (average is 5.0).

The probability of U.S. major hurricane landfall is estimated to be about 140 percent of the long-period average. We expect Atlantic basin Net Tropical Cyclone (NTC) activity in 2011 to be approximately 175 percent of the long-term average.
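To put these numbers on a common footing, here is a minimal arithmetic sketch (illustrative only; it simply divides the forecast values quoted above by the 1950-2000 averages, and is in no way the CSU prediction scheme):

```python
# Ratios of the 2011 CSU forecast to the 1950-2000 climatological
# averages quoted above. Pure arithmetic for illustration; this is
# not the forecast methodology itself.

forecast = {  # metric: (2011 forecast, 1950-2000 average)
    "hurricanes":           (9, 5.9),
    "named storms":         (16, 9.6),
    "named storm days":     (80, 49.1),
    "hurricane days":       (35, 24.5),
    "major hurricanes":     (5, 2.3),
    "major hurricane days": (10, 5.0),
}

for metric, (predicted, average) in forecast.items():
    print(f"{metric:20s}: {predicted:5.1f} vs avg {average:5.1f} "
          f"-> {100 * predicted / average:.0f}% of climatology")
```

Every metric comes out at roughly 140 percent of climatology or more, which is what underpins the "well above-average" characterisation.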

We have decreased our seasonal forecast slightly from early December, due to anomalous warming in the eastern and central tropical Pacific and cooling in the tropical Atlantic.

This forecast is based on a new extended-range early April statistical prediction scheme that draws on 29 years of past data. Analog predictors are also used. We expect current La Niña conditions to transition to near-neutral conditions during the heart of the hurricane season. Overall, conditions remain conducive for a very active hurricane season.

PROBABILITIES FOR AT LEAST ONE MAJOR (CATEGORY 3-4-5) HURRICANE LANDFALL ON EACH OF THE FOLLOWING COASTAL AREAS:

1) Entire U.S. coastline - 72% (average for last century is 52%)

2) U.S. East Coast Including Peninsula Florida - 48% (average for last century is 31%)

3) Gulf Coast from the Florida Panhandle westward to Brownsville - 47% (average for last century is 30%)

PROBABILITY FOR AT LEAST ONE MAJOR (CATEGORY 3-4-5) HURRICANE TRACKING INTO THE CARIBBEAN (10-20N, 60-88W)

1) 61% (average for last century is 42%)
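One common way to relate overall activity to "at least one landfall" probabilities is a simple Poisson occurrence model. The sketch below is a toy calculation under that assumption; it is not the CSU methodology and will not reproduce the published figures exactly:

```python
import math

# Toy Poisson model: if major-hurricane landfalls arrive at rate lam,
# then P(at least one) = 1 - exp(-lam). Back out the climatological
# rate from the century-average probability, scale it by the forecast
# activity level, and convert back to a probability.
# Illustrative assumption only -- NOT the CSU landfall methodology.

def prob_at_least_one(p_climatology, activity_ratio):
    lam_climo = -math.log(1.0 - p_climatology)  # implied climatological rate
    return 1.0 - math.exp(-activity_ratio * lam_climo)

# Entire U.S. coastline: century average 52%, forecast activity ~140%
print(f"{prob_at_least_one(0.52, 1.40):.0%}")  # ~64% under this toy model
```

That this toy model yields a lower figure than the published 72% simply reflects that the actual forecast scheme incorporates more than a flat scaling of the climatological rate.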

All the detail behind the forecast can be found in the PDF.

See my discussion of the upcoming season here. Here is an account of the Hurricane of ’38, a La Niña summer.

Apr 04, 2011
Is There A Sampling Bias In The BEST Analysis Reported By Richard Muller?

By Roger Pielke Sr., Climate Science blog

In his testimony (which I posted on Friday, April 2, 2011), Richard Muller indicated that he used 2% of the available surface stations that measure temperatures in the BEST assessment of long-term trends. It is important to realize that the sampling is still biased if a preponderance of his data sources comes from a subset of actual landscape types; the sampling will necessarily be skewed towards those sites.

If the BEST data came from a different distribution of locations than GHCNv.2, then his results would add important new insight into the temperature trend analyses. If they have the same spatial distribution, however, they would not add anything beyond confirming that NCDC, GISS and CRU were properly using the collected raw data.

We discuss this bias in station locations in our paper

Montandon, L.M., S. Fall, R.A. Pielke Sr., and D. Niyogi, 2011: Distribution of landscape types in the Global Historical Climatology Network. Earth Interactions, 15:6, doi: 10.1175/2010EI371

The abstract reads [highlight added]

“The Global Historical Climate Network version 2 (GHCNv.2) surface temperature dataset is widely used for reconstructions such as the global average surface temperature (GAST) anomaly. Because land use and land cover (LULC) affect temperatures, it is important to examine the spatial distribution and the LULC representation of GHCNv.2 stations. Here, nightlight imagery, two LULC datasets, and a population and cropland historical reconstruction are used to estimate the present and historical worldwide occurrence of LULC types and the number of GHCNv.2 stations within each. Results show that the GHCNv.2 station locations are biased toward urban and cropland (>50% stations versus 18.4% of the world’s land) and past century reclaimed cropland areas (35% stations versus 3.4% land). However, widely occurring LULC such as open shrubland, bare, snow/ice, and evergreen broadleaf forests are underrepresented (14% stations versus 48.1% land), as well as nonurban areas that have remained uncultivated in the past century (14.2% stations versus 43.2% land). Results from the temperature trends over the different landscapes confirm that the temperature trends are different for different LULC and that the GHCNv.2 stations network might be missing on long-term larger positive trends. This opens the possibility that the temperature increases of Earth’s land surface in the last century would be higher than what the GHCNv.2-based GAST analyses report.”

This derived surface temperature trend is higher than what BEST found. However, it also means that the divergence between the surface temperature trends and the lower tropospheric temperature trends that we reported in

Klotzbach, P.J., R.A. Pielke Sr., R.A. Pielke Jr., J.R. Christy, and R.T. McNider, 2009: An alternative explanation for differential temperature trends at the surface and in the lower troposphere. J. Geophys. Res., 114, D21102, doi:10.1029/2009JD011841.

is even higher. This difference suggests that unresolved issues, including a likely systematic warm bias, remain in the analysis of long-term surface temperature trends.
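As a back-of-the-envelope illustration of the station-distribution bias quoted in the Montandon et al. abstract, the sketch below computes simple representation ratios from the percentages given there (plain arithmetic on the quoted figures, not the paper's actual analysis):

```python
# Representation ratio: share of GHCNv.2 stations in a landscape class
# divided by that class's share of world land area. Ratios > 1 mean the
# class is over-sampled, < 1 under-sampled. Percentages are taken from
# the Montandon et al. (2011) abstract; 50.0 is a lower bound (">50%").

classes = {  # class: (% of stations, % of land area)
    "urban + cropland":                  (50.0, 18.4),
    "reclaimed cropland (past century)": (35.0, 3.4),
    "shrubland/bare/snow-ice/forest":    (14.0, 48.1),
    "uncultivated nonurban":             (14.2, 43.2),
}

for name, (stations_pct, land_pct) in classes.items():
    print(f"{name:35s}: {stations_pct / land_pct:5.1f}x representation")
```

Urban and cropland sites come out over-sampled by nearly a factor of three (and past-century reclaimed cropland by roughly ten), while the under-represented classes are covered at only about a third of their proportional share - the asymmetry the abstract describes.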

Apr 01, 2011
Forecasting Expert Calls for End to Government-Funded Research on Global Warming

Heartland Press Release

In testimony yesterday before the Subcommittee on Energy and Environment of the House Committee on Science, Space, and Technology, forecasting expert J. Scott Armstrong of the Wharton School at the University of Pennsylvania called on Congress to cease funding global warming research, programs, and advocacy organizations.

Referring to an analysis he conducted with Kesten C. Green of the University of South Australia and Willie Soon of the Harvard-Smithsonian Center for Astrophysics, Armstrong told the subcommittee, “We approach the issue of alarm over dangerous manmade global warming as a problem of forecasting temperatures over the long term. The global warming alarm is not based on what has happened, but on what will happen. In other words, it is a forecasting problem. And it is a very complex problem.”

The three researchers audited the forecasting procedures used by the Intergovernmental Panel on Climate Change (IPCC), whose “procedures violated 81% of the 89 relevant forecasting principles,” Armstrong noted.

Armstrong and his colleagues recommend Congress end government funding for climate change research as well as other research, government programs, and regulations that assume the planet is warming. They also recommend Congress cease funding organizations that lobby or campaign for global warming.

“Based on our analyses, especially with respect to the violations of the principles regarding objectivity and full disclosure,” Armstrong told members of Congress, “we conclude that the manmade global warming alarm is an anti-scientific political movement.”

Armstrong can be reached for further comment at 610-622-6480 or armstrong@wharton.upenn.edu. A copy of the report he submitted to the committee is available online.

Mar 30, 2011
EPA Air Chief Is CO2 Clueless

By Art Horn, ICECAP meteorologist on Pajamas Media

The situation in Japan is awful on multiple fronts, and the Japanese face a recovery that will challenge the limits of their capabilities. Yet back here at home, the EPA is aiming to melt down our feeble economic situation by taxing everything that produces energy - not because of anything like radiation, but due to harmless carbon dioxide.

I thought I had heard just about everything in the great global warming debate until, on March 1, the House of Representatives held a hearing dealing with the EPA’s proposal to commence sweeping nationwide regulations on greenhouse gas emissions. The idea behind the EPA’s plan is to regulate (read: “tax”) the largest “polluters” to reduce the amount of carbon dioxide emitted into the air. The EPA believes that by reducing carbon dioxide by some tiny fraction it can control the Earth’s climate system - a typical government mentality, no? Doing this would result in higher costs of doing business for the nation’s power generating facilities and manufacturing plants. It would increase the cost of doing business across the board for many other smaller businesses, and would likely inhibit companies from hiring new employees.

In February, Republican Fred Upton said:

Needless to say the Chinese government and other competitors have no intention of burdening and raising the cost of doing business for their manufacturers and energy producers the way EPA plans to do here in America. Our goal should be to export goods, not jobs.

During the EPA hearings, Rep. Joe Barton (R-TX) questioned Gina McCarthy, EPA chief of air programs and greenhouse gas regulations. He asked:

Do you know what the level of CO2 right now is generally speaking in the atmosphere?

He threw her a softball. Her answer?

Well actually I don’t have that figure.

(Take long pregnant pause here.)

You don’t have what? Please tell me this is a joke! The chief of the EPA’s air programs and greenhouse gas regulations doesn’t know how much carbon dioxide is in the air? This is beyond anything I thought was possible. This is a person leading the United States of America? Don’t tell our enemies.

What if Joe Barton were to ask a scientist who did not believe in manmade global warming that kind of question?

Barton: Mr. Horn, do you know what the instrument we use to measure temperature is, generally speaking?

Me: I don’t have that information at this time.

That’s how bad Gina McCarthy’s answer was. To be the chief regulator of such a potentially devastating policy and to not know that CO2 is 390 parts per million? What else doesn’t the EPA know about carbon dioxide? How about that it’s not a dangerous pollutant?

We have been told Ms. McCarthy is highly trained and unquestionably qualified for the job. Says Scientific American:

Although Ms. McCarthy has a tough road ahead, her experience and achievements prove she will rise to the challenge. She has shown true leadership in Connecticut and Massachusetts implementing a multi-pollutant approach to clean up the air in those states.

Let’s talk about that leadership: as head of Connecticut’s Department of Environmental Protection, Gina McCarthy helped develop the Regional Greenhouse Gas Initiative, the nation’s first mandatory cap-and-trade program. That program, called RGGI, is actually a tax on each resident and business owner in Connecticut and the other 9 RGGI member states. The charge for operating RGGI is hidden within the complicated verbiage of electric bills each month.

Maybe that’s part of the reason New Hampshire just voted to drop out of the system, and more states are looking to follow New Hampshire’s lead.

There are two constants in the universe. The first is change, and the second is the bureaucratic mentality. Ms. McCarthy actually said:

I never really thought of myself as a regulator. I actually am a strong believer in markets. I really think our job is to make sure that the work we do is valued and priced in the markets appropriately. And so I am a true believer in democracy - in having government intervene when it needs to and not when it doesn’t.

Really. It would seem that she believes in the markets only when it is convenient to her bureaucratic, unscientifically biased agenda.

Is the United States going to keep up with China and maintain our leadership role in the world? Not if the EPA keeps looking for ways to handicap us. We need to get our heads out of the clouds and start developing our own resources - the most abundant in the world. Nations such as China, India, and others are unburdened by the carbon dioxide ball-and-chain.

They are forging ahead with a clear vision of the future while we are regulating ourselves into mediocrity.

Read more here.

Art Horn spent 25 years working in television as a meteorologist. He now is an independent meteorologist and speaker who lives in Connecticut. He can be contacted at skychaserman@cox.net.
