Tuesday, April 12, 2011
Science without method

By John Nicol, Doomed Planet

Global warming research: whatever happened to the scientific method?

Global warming, and its euphemistic sibling “climate change”, remain much in the news. Specialist research groups around the world continue to produce an unending sequence of papers aimed at demonstrating a litany of problems which might arise should global warming resume. The authors’ prime expertise often turns out not to be in atmospheric physics or aeronomy, as one might have anticipated. However, the topic of climate change itself attracts abundant research funding, on which they feed more easily than they could in other areas of research of greater interest and practical use. Most of these papers are, of course, based upon the output of speculative and largely experimental atmospheric models representing exercises in virtual reality, rather than upon observed, real-world measurements and phenomena. Which leads to the question: “What scientific methodology is in operation here?”

Though much has been written concerning the scientific method, and the ill-defined question of what constitutes a correct scientific approach to a complex problem, comparatively little comment has been made about the strange mix of empirical and virtual-reality reasoning that characterises contemporary climate change research. Obviously, the many different disciplines described as scientific, rather than social, economic or artistic, may apply somewhat different criteria in deciding what fundamental processes define the “scientific method” for each discipline. Dismayingly, for many years now there has been a growing tendency for formerly “pure” scientific disciplines to embody characteristics of others, and in some cases that includes the adoption of research attitudes and methods more appropriately applied in the arts and social sciences. “Post-modernism”, if you like, has proved to be a contagious disease in academia generally.

Classical scientific method generally follows the simple protocol of first defining an hypothesis concerning the behavior or cause of some phenomenon in nature, whether physical, biological or chemical. In most well defined areas of research, previous theory and experiment may provide such a wide and complex corpus of knowledge that a new hypothesis is neither easily nor singly defined, and may even be left unstated.

This is most commonly the case when a number of diverse disciplines, all important for attaining an understanding of a particular problem, are providing results which lead to contradictory conclusions. A contemporary example is the discussion of the greenhouse effect, one of the most controversial topics ever to be considered within the scientific community. Conventional thinking on the greenhouse effect is encapsulated in the IPCC’s statement that “We believe that most of the increase in global temperatures during the second half of the twentieth century was very likely due to the increases in the concentration of atmospheric carbon dioxide”.

Clearly this statement would be better worded had it been framed as a hypothesis rather than a belief, and treating the statement that way allows it to be rigorously tested (“beliefs”, which cannot be tested, fall outside the domain of science). In the real scientific world, for such an hypothesis to survive rigorous scrutiny, and thereby perhaps grow in strength from hypothesis to theory, it must be examined and re-examined from every possible angle over periods of decades and longer.

In conventional research, the next step - following the formulation of the hypothesis in whatever form it may take - is to select what measurements or analyses need to be done in order to test the hypothesis and thus to advance understanding of the topic. Most often, theoretical reasoning as to why an hypothesis might be correct or incorrect is followed by the development of experiments in laboratories, or the making of careful observations in nature, which can be organized and classified, and from which measurements can be made and conclusions drawn.

The theoretical analysis may be qualitative, highly structured, or written in terms of a precise mathematical formalism which provides a basis for describing a model or picture of the phenomenon and the behavior of observables associated with it. We may, for instance, choose to include conjecture on quantities which are hidden from observation but whose presence and effects may be simply understood through the measured behavior of larger-scale observables. As work progresses, theoretical reasoning and experiment work in harmony, one or other progressing foremost at a given time, but they are inevitably locked together by the need to represent experimental observations and results in theoretical terms. The mandatory requirement in all of this is that every aspect of an hypothesis (and nascent theory) must be justifiable, meaning justified in terms of observation and measurement.

Examples of this process abound in physics, where theoretical reasoning and experiment have, since the days of Newton, Faraday and the other great masters, marched together in step. In some cases theory trailed the evidence, in others the reverse. Here are some historical examples, based upon familiar cases which are frequently referred to even in the popular press.

The development by Max Planck of the mathematical law that represents the natural distribution of radiation from a hot cavity followed a very long process of developing theoretical explanations to best fit the evidence from measurements that had been made as long as thirty years before. Intermediate steps along the way included the approximate formulae of Wien and Rayleigh, both of which provided reasonably accurate representations at the extreme limits of the radiation spectrum, but which proved to be less powerful and general in application than Planck’s final formulation. Later gestation of Planck’s embryo discovery led to the development of the powerful and technologically empowering theories of quantum mechanics and quantum electrodynamics. Along the way, again, there was a trail of experiments which sprang from the stimulation of theoretical reasoning yet simultaneously raised questions which required understanding through mathematical analysis. During all of this mutually supportive yet competitive research, observation, experiment and theoretical reasoning played off one another in a constructive, “scientific” fashion, leading eventually to the development of the laser, the transistor, the scanning electron microscope, the MRI and CAT scanners, the cochlear implant, the supercomputer and desktop computer, and countless other modern technical and medical machines. Good, fundamentally sound scientific discoveries are gifts beyond price that simply keep on giving.
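To see how the earlier formulae relate to Planck’s, it is enough to write them down (these are standard textbook expressions, quoted here purely as illustration). Planck’s law for the spectral radiance of cavity radiation is

$$B_\nu(T) = \frac{2h\nu^3}{c^2}\,\frac{1}{e^{h\nu/kT} - 1},$$

which reduces to Wien’s approximation, $B_\nu \approx \frac{2h\nu^3}{c^2}\,e^{-h\nu/kT}$, in the high-frequency limit $h\nu \gg kT$, and to the Rayleigh-Jeans form, $B_\nu \approx \frac{2\nu^2 kT}{c^2}$, in the low-frequency limit $h\nu \ll kT$. Each of the earlier formulae fits one end of the measured spectrum; only Planck’s expression fits it everywhere.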

The same type of history applied in the development of the acknowledged triumph that was Einstein’s theory of special relativity. Einstein’s breakthrough was built on the strong, earlier theoretical work of Lorentz and others, who provided the formalism that allowed the classical electromagnetic equations of James Clerk Maxwell to be seen as a universal description of electromagnetism. Thus was provided a basic description of the actions of light, X-rays and radar, while at the same time sweeping away a long-held consensual belief in the luminiferous ether, an erstwhile all-pervading but undetectable substance that was supposed to be the carrier of light from the stars, the sun and sources on earth. Looking back with the benefit of hindsight, “belief” in the ether shows many similarities with the strong belief that some modern scientists espouse that dangerous global warming is being caused by human-related carbon dioxide emissions.

Maxwell’s equations grew directly from Faraday’s classical experiments. Einstein’s theory placed these equations in a universal framework. Quantum electrodynamics harnessed Maxwell and Planck into modern quantum theory, while relativity itself remains largely a curiosity in spite of its importance in placing other theories and experimental results into a robust description of important aspects of nature. Nevertheless, all of these theories, and the results of early experiments which sought either to justify or destroy them, continue to be placed under the most careful contemporary research scrutiny, on scales that range from examination of the smallest particles in nature to the cosmological description of the universe.

As long as 70 years after the acclaimed publication of Einstein’s relativity theory, experiments were still being conducted using refined laser techniques in attempts to show that some vital predictions flowing from this theory were invalid. Over the same period, the implications of the so-called “Twin Paradox”, which also follows from this theory, gained conceptual acceptance only after a Herculean battle between two unforgiving protagonists, W.H. McCrea and Herbert Dingle, fought in a series of articles published in Nature. Subsequently, and appropriately, the answer was defined by an experiment, to wit, the measurement of the decay of a radioactive sample traveling by satellite. Einstein’s less commonly described general theory of relativity is still undergoing constant review in most of its parts, as, to some minor extent, is the special theory also. The important point to absorb here is that ALL scientific results are provisional, as well encapsulated by Nobel Prize winner Richard Feynman’s immortal observation that “A scientist is someone who believes in the ignorance of experts”.
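The prediction on which the Dingle-McCrea exchange turned is simply stated (the formula below is the standard textbook result, given here only as an illustration): a clock, or a decaying radioactive sample, moving at speed $v$ relative to an observer accumulates proper time $\Delta\tau$ more slowly than the observer’s coordinate time $\Delta t$,

$$\Delta t = \gamma\,\Delta\tau, \qquad \gamma = \frac{1}{\sqrt{1 - v^2/c^2}},$$

so the traveling sample is measured to decay more slowly, and the traveling twin to age less, than the one who stays at home. It was precisely this asymmetry that Dingle disputed and that the decay measurements described above bore out.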

In all of this turbulent and adversarial history of fundamental truths of nature, there has been robust argument amongst many varied scientists and research groups, but against a background of respect between all participants. There has always been a liberal sharing of ideas, at the same time as individuals offered a strong defense of their own. The incontrovertible historic evidence of the rapid exchange of ideas, by means of the publication of short letters and articles during these phases of scientific creativity, is proof positive of the importance of freedom in the sharing of data and information, at a level of which many modern scientists can only dream.

That some meteorological agencies and “leading” climate scientists believe that they have the right to deny provision of their data to other credentialed scientists is but one of the signs of a research pathology that characterizes many powerful, contemporary climate science teams. 

Electromagnetism, relativity and quantum mechanics are now very generally accepted, not least because they have been thoroughly tested by a large range of differing experimental approaches. Nonetheless, controversies still exist in discussions of the fine details of the theory, while at the same time the results of more exotic measurements are seriously questioned if they do not fit the theory. This process of continuous testing of both theory and experiment has the tremendous benefit of stimulating further experiments and differing hypotheses, which often lead to new discoveries. In all of this, the Plancks, Bohrs, Diracs, Einsteins, Heisenbergs and Schrödingers stood way out from the crowd, as do their modern counterparts to this very day.

Just as there were debates on relativity, there were also some serious sceptics in the quantum world, some of whom used the debate between Einstein and Bohr on the interpretation of the equations of Quantum Mechanics to discredit it. In the end, Bohr’s so-called Copenhagen interpretation prevailed, which describes a quantum world of probabilities that lie within a prescribed range of an unrelenting uncertainty. Einstein’s claim that “Der liebe Gott würfelt nicht mit der Welt” (“The good Lord does not play dice with the world”) now has few supporters, but it followed him to the grave. Among other notable antagonists of quantum theory, Professor Frederick Lindemann (Viscount Cherwell) at the Clarendon Laboratory in Oxford would not allow the teaching of quantum mechanics in his classes right up until the 1940s.

Out of this cut-and-paste “history” of physics comes the strongest criticism of mainstream climate science research as it is carried on today. Understanding the climate may appear simple compared to quantum theory, since the computer models that lie at the heart of the IPCC’s warming alarmism don’t need to go beyond Newtonian mechanics. However, the uncertainty in Quantum Mechanics with which Einstein was uncomfortable was about 40 orders of magnitude (i.e. a factor of 10^40) smaller than the known errors inherent in modern climate “theory”. Yet in contemporary research on matters to do with climate change, and despite enormous expenditure, not one serious attempt has been made to check the veracity of the numerous assumptions involved in greenhouse theory by actual experimentation.

The one modern, definitive experiment, the search for the signature of the greenhouse effect, has failed totally. Projected confidently by the models, this “signature” was expected to take the form of an exceptional warming in the upper troposphere above the tropics. The experiments, carried out during twenty years of research supported by the Australian Greenhouse Office as well as by many other well funded atmospheric science groups around the world, show that this signature does not exist. Where is the Enhanced Greenhouse Effect? No one knows.

In addition, the data representing the earth’s effective temperature over the past 150 years show that a global human contribution to this temperature cannot be distinguished or isolated at a measurable level above that induced by clearly observed and understood natural effects, such as the partially cyclical redistribution of surface energy in the El Niño. Variations in solar energy, exotic charged particles in the solar wind, cosmic ray fluxes, and the orbital and rotational characteristics of the planet’s motion together provide a rich combination of electrical and mechanical forces which disturb the atmosphere individually and in combination. Of course, that doesn’t mean that carbon dioxide is not a “greenhouse gas”, defined as one which absorbs radiation in the infrared region of the spectrum. However, the “human signal”, the effect of the relatively small amount of the gas that human activity adds annually to the atmosphere, is completely lost, being far below the level of noise produced by natural climate variation.

So how do our IPCC scientists deal with this? Do they revise the theory to suit the experimental result, for example by reducing the climate sensitivity assumed in their GCMs? Do they carry out different experiments (i.e., collect new and different datasets) which might give more or better information? Do they go back to basics and prepare a new model altogether, or consider statistical models more carefully? Do they look at possible solar influences instead of carbon dioxide? Do they allow for the possibility that papers by scientists such as Svensmark, Spencer, Lindzen, Soon, Shaviv, Scafetta and McLean (to name just a few of the well-credentialed scientists who are currently searching for alternatives to the moribund IPCC global warming hypothesis) might be providing new insights into the causes of contemporary climate change?

Of course not. That would be silly. For there is a scientific consensus about the matter, and that should be that.
