The Blogosphere
Tuesday, January 02, 2018
TV Science Guy: Economic Sanctions Coming for Climate Deniers

Mad Patriots

According to television’s Bill Nye, the Science Guy, conservatives in Republican-controlled states should “watch out” because Democrats in other parts of the country weren’t going to stand by idly and watch as the federal government ignored climate change. In an interview with MSNBC on Friday, Nye - who is not a scientist of any kind, much less a climate expert - said that blue states were ready to take it upon themselves to address global warming. He said that may even extend to punishing states that did not take measures to clamp down on the fossil fuel industry.

“Only 40 percent of people in the U.S. think that Congress should be addressing this, and that’s because certain conservative groups, especially from the fossil fuel industry, have been very successful in introducing the idea that scientific uncertainty, plus or minus two percent, is the same as plus or minus 100 percent,” Nye said.

Nye, whose Netflix show is a wasteland of liberal insanity, went on to say that climate change issues were a matter of constitutional independence from government.

“There’s a lot of emphasis from conservatives on what are writ-large states rights,” he said. “Just watch out, conservatives, if states rights include California, Illinois, New York - these places that, where people voted in a progressive fashion - watch out if all those places start to address climate change and then impose economic sanctions, either overtly or by default, on places that have not embraced the work that needs to be done. Then you’ll end up with this states rights working the other way.”

Uh-huh. Well, yes, each state certainly has the right to develop its own economic practices, and no one is denying that. But if states like California and New York are going to penalize the use and manufacture of oil and gasoline, it won’t be conservative states paying the price by any means. Indeed, this has all the makings of state-imposed economic suicide.

Nothing new for Democrat-run fiefdoms, though…

It is fascinating, though, to watch these Democrats - still SO ANGRY that they lost the 2016 election - increasingly support independence from the federal government. But you’ll note that they only support that independence insofar as they want to be free from Washington’s rules. They DON’T want to actually go the full mile and pay any economic consequence for their independence. When it comes to illegal immigration, climate change, and other areas of disagreement with the Trump administration, they want to have their cake and eat it too: flout the federal government’s laws while accepting millions and millions of dollars in federal funds.

With rights come responsibility. If liberal America wants to declare its independence, let them take on the financial responsibility of doing so.


Bill Nye has a degree in engineering. He is not a scientist. In fact, his first job was as a stand-up comic. He then thought he could use his public-exposure experience to become the modern-day Mr. Wizard.

The original Mr. Wizard, Don Herbert, was the creator and host of Watch Mr. Wizard (1951-65, 1971-72) and of Mr. Wizard’s World (1983-90), which were educational television programs for children devoted to science and technology. As Patrick Kelley wrote in 2013 in the Emporia Gazette, “Herbert’s approach to science was a welcome relief from the worshipful awe of scientists that followed the creation of the atomic bomb at the end of World War II. The public saw scientists as white-coated super-intellects who kept their cool gazes focused above the heads of the rest of humanity on a remote horizon that no one else could see. The atomic era seemed to widen the gap between scientists and the rest of humanity. Then Herbert showed up, friendly, accessible and smart, and showed children how the principles of science worked for everyone, not just for people with Ph.D.’s and lab coats.”

Nye is an ideologue with disdain for anyone who does not share his political persuasion, and he lets his ideology drive his position on the science, unlike true scientists such as Feynman, who was a champion of the scientific method. Bill is an embarrassment to real scientists. Envirofascists are taking matters into their own hands, attempting and actually committing eco-terrorism.

Oh, by the way, Bill, Dr. Roy Spencer reports:

At 7 a.m. EST Monday morning, the area-average temperature across the contiguous 48 states was a frigid 11 deg. F. It was 13 deg. F on Tuesday.

Here’s the high-resolution surface temperature analysis from NCEP, graphic courtesy of

Surface temperature analysis at 7 a.m. EST January 1, 2018.

Over 85% of the nation is below freezing, and nearly 1/3 is below 0 deg. F. The forecast is for cold air to continue to flow down out of Canada into the central and eastern U.S. for most of the coming week.

Posted on 01/02 at 09:19 AM

Monday, December 18, 2017
Scientific peer review: an ineffective and unworthy institution

By Paul Homewood

h/t stewgreen

The Times Higher Education Supplement’s blog carries a damning indictment of peer review, which has very real relevance to the climate change debate:

Given the entirely appropriate degree of respect that science has for data, the ongoing discussion of peer review is often surprisingly data-free and underlain by the implicit assumption that peer review - although in need of improvement - is indispensable.

The thing is, the peer review of scientific reports is not only without documented value in advancing the scientific enterprise but, in a manner that few care to acknowledge openly, primarily serves ends that are less than noble. Peer review is widely assumed to provide an imprimatur of scientific quality (and significance) for a publication, but this is clearly not the case.

While the many flaws of peer review are clearly laid out in the literature, its failure to protect the integrity of the scientific enterprise is notable. An estimated cost of irreproducible biomedical research is $28 billion (20 billion pounds) a year and ‘currently, many published research findings are false or exaggerated, and an estimated 85 per cent of research resources are wasted’, one paper found.

A prime example of the failure of peer review is the tainting of a significant segment of the biomedical literature by the use of misidentified and contaminated cell lines pointing, at best, to a culture of carelessness in cell biology research and the clear failure of peer review to discover and correct erroneous research.

There are many reasons why scientific peer review is ineffective. An important factor is the inadequacy of almost all scientific reporting; publications should contain sufficient information that all aspects of the work can be understood, permitting a published result to be reproduced from the original data, as well as independent replication of the study by others wishing to do this.

If these minimal standards are not met then critical information is missing and the reader has no way of assessing if the published research is correct or false in its claims and conclusions - even exact replication of a study is precluded. Reproducibility and repeatability require that all theory, methods, equipment, reagents, source code, computational environment, raw data and analytical and statistical methods be fully documented and openly available.

This standard is not enforced by peer review as currently practiced, with the result that most publications in most journals should be viewed by the skeptical reader as little better than advertisements that present the authors’ claim to priority but preclude straightforward and independent verification.

We are not the first to identify these problems, so we might ask why peer review retains its essentially unassailable status. We suggest a two-fold answer rooted more in socio-economic factors than the dispassionate review of scientific research.

First, peer review is self-evidently useful in protecting established paradigms and disadvantaging challenges to entrenched scientific authority. Second, peer review, by controlling access to publication in the most prestigious journals helps to maintain the clearly recognized hierarchies of journals, of researchers, and of universities and research institutes. Peer reviewers should be experts in their field and will therefore have allegiances to leaders in their field and to their shared scientific consensus; conversely, there will be a natural hostility to challenges to the consensus, and peer reviewers have substantial power of influence (extending virtually to censorship) over publication in elite (and even not-so-elite) journals.

Publication in the highest-profile journals reinforces the hierarchies of status in the scientific community and promotes very effectively the prestige-, career- and profit-driven motives of authors, journal editors, publishers and (less directly) universities. This state of affairs exerts a particularly baleful influence on interdisciplinary research.

Innovations in peer review (including dispensing with its traditional forms) are to be encouraged. It may be that open publication through servers such as arXiv and bioRxiv, along with public and signed post-publication comment, are the solution to the problems noted above.

However, for any innovations in scientific publication to succeed two conditions would need to be met. The first, as noted above, is the provision with a publication of all the information necessary for independent reproduction and repeatability of the research, and the second is the improvement in the culture of science such that less than rigorous work and deceptive publication practices are no longer tolerated.

With the scientific method itself at risk, the stakes could not be higher.

Les Hatton is a mathematician and emeritus professor at Kingston University. Gregory Warr is a biochemist and emeritus professor at the Medical University of South Carolina. Both have extensive experience as peer reviewers and journal editors.

See more here and here.

Posted on 12/18 at 05:36 PM

Sunday, December 17, 2017
Global Warming: Fake News from the Start

By Dr. Tim Ball and Tom Harris

President Donald Trump announced the U.S. withdrawal from the Paris Agreement on climate change because it is a bad deal for America. He could have made the decision simply because the science is false, but most of the public have been brainwashed into believing it is correct and wouldn’t understand the reason.

Canadian Prime Minister Justin Trudeau, and indeed the leaders of many western democracies (though thankfully not the U.S.), support the Agreement and are completely unaware of the gross deficiencies in the science. If they understood them, they wouldn’t be forcing a carbon dioxide (CO2) tax on their citizens.

Trudeau and other leaders show how little they know, or how little they assume the public know, by calling it a ‘carbon tax.’ But CO2 is a gas, while carbon is a solid. By calling the gas carbon, Trudeau and others encourage people to think of it as something ‘dirty’, like graphite or soot, which really are carbon. Calling CO2 by its proper name would help the public remember that it is actually an invisible, odorless gas essential to plant photosynthesis.

Canadian Environment Minister Catherine McKenna is arguably the most misinformed of the lot, saying in a recent interview, for example, that “Polluters should pay.” She apparently does not know that CO2 is not a pollutant.

And, like many of her political peers, McKenna dismisses credentialed PhD scientists who disagree with her government’s approach, labelling them “deniers.” She does not seem to understand that questioning scientific hypotheses, even scientific theories, is what all scientists should do. That is why the official motto of the Royal Society is “Nullius in verba,” Latin for “Take nobody’s word for it.” Ironically, the Society rarely practices this approach when it comes to climate change.

Mistakes such as those made by McKenna are not surprising considering that the entire claim of anthropogenic global warming (AGW) was built on falsehoods and spread with fake news.

The plot to deceive the world about human-caused global warming gathered momentum following creation of the United Nations Intergovernmental Panel on Climate Change (IPCC) in 1988 by the World Meteorological Organization and the United Nations Environment Program (UNEP). After spending five days at the U.N. with Maurice Strong, the first executive director of UNEP, Hamilton Spectator investigative reporter Elaine Dewar concluded the overarching objective of the IPCC was political. “Strong was using the U.N. as a platform to sell a global environment crisis and the global governance agenda,” wrote Dewar.

The political agenda required ‘credibility’ to achieve the deception. It also required some fake news for momentum. Ideally, this would involve testimony from a scientist before a legislative committee.

U.S. Senator Timothy Wirth (D-CO) was fully committed to the political agenda and the deception as he explained in a 1993 comment, “We’ve got to ride the global warming issue. Even if the theory of global warming is wrong, we will be doing the right thing...”

In 1988 Wirth was in a position to jump start the climate alarm. He worked with colleagues on the Senate Energy and Natural Resources Committee to organize a June 23, 1988 hearing where Dr. James Hansen, then the head of the Goddard Institute for Space Studies (GISS), was to testify. Wirth explained in a 2007 interview with PBS Frontline:

“We knew there was this scientist at NASA, you know, who had really identified the human impact before anybody else had done so and was very certain about it. So, we called him up and asked him if he would testify.”

Hansen did not disappoint. The New York Times reported on June 23, 1988:

“Today Dr. James E. Hansen of the National Aeronautics and Space Administration told a Congressional committee that it was 99 percent certain that the warming trend was not a natural variation but was caused by a buildup of carbon dioxide and other artificial gases in the atmosphere.”

Specifically, Hansen told the committee,

“Global warming has reached a level such that we can ascribe with a high degree of confidence a cause and effect relationship between the greenhouse effect and observed warming...It is already happening now”.

Hansen also testified:

“The greenhouse effect has been detected and it is changing our climate now...We already reached the point where the greenhouse effect is important.”

Dr. John S. Theon, Hansen’s former supervisor at NASA, wrote to the Senate Minority Office at the Environment and Public Works Committee on January 15, 2009. “Hansen was never muzzled even though he violated NASA’s official agency position on climate forecasting (i.e., we did not know enough to forecast climate change or mankind’s effect on it). Hansen thus embarrassed NASA by coming out with his claims of global warming in 1988 in his testimony before Congress.”

Hansen never abandoned his single-minded, unsubstantiated claim that CO2 from human activities caused dangerous global warming. He defied the Hatch Act that limits bureaucratic political actions, and, in 2011, was even arrested in a protest at the White House against the Keystone XL pipeline, at least his third such arrest to that point.

Wirth, who presided at the hearing, was pre-disposed to believe Hansen and told the committee:

“As I read it, the scientific evidence is compelling: the global climate is changing as the earth’s atmosphere gets warmer. Now, the Congress must begin to consider how we are going to slow or halt that warming trend and how we are going to cope with the changes that may already be inevitable.”

So, like Trudeau and other leaders duped by the climate scare, Wirth has either not read or not understood the science. In fact, an increasing number of climate scientists (including Dr. Ball) now conclude that there is no empirical evidence of human-caused global warming; there are only computer model speculations that humans are causing it and every forecast made using these models since 1990 has been wrong.

More than any other event, that single hearing before the Energy and Natural Resources Committee publicly initiated the climate scare, the biggest deception in history. It created an unholy alliance between a bureaucrat and a politician that was bolstered by the U.N. and the popular press, leading to the hoax being accepted in governments, industry boardrooms, schools, and churches across the world.

Trump must now end America’s participation in the fake science and the fake news of man-made global warming. To do this, he must withdraw the U.S. from further involvement with all U.N. global warming programs, especially the IPCC as well as the agency that now directs it - the United Nations Framework Convention on Climate Change. Only then will the U.S. have a chance to fully develop its hydrocarbon resources to achieve the president’s goal of global energy dominance.

Tom Harris, B. Eng., M. Eng. (Mech.)
Executive Director
International Climate Science Coalition (ICSC)
28 Tiverton Drive
Ottawa, Ontario K2E 6L5

Posted on 12/17 at 03:53 PM

Wednesday, November 22, 2017
Rebutting the claim that snowfall and snowcover is diminishing as the earth warms

By Joseph D’Aleo, CCM, AMS Fellow

See cable hour on hurricanes and snow here.

This is a brief rebuttal to claims made in the CSSR NCA released this year. We are preparing 21 such rebuttals to some bad science, as are many other climate-realist organizations. This is done pro bono, and we would appreciate your help as we at Icecap work with some of the nation’s top scientists. Our DONATE button is on the left; small donations are welcome. We have to cover our maintenance fees for the site before we begin to cover our expenses. BTW, we have done another cable show for HCTV reviewing this hurricane season and talking about winters, including the one now beginning. We will post the link and video when it is ready to go next week. By the way, the BHO posted our winter outlook on their website.

We thank you on this Thanksgiving weekend, when we all give thanks for all we have and for all the special people in our lives. Each year we recommit ourselves to being worthy of your appreciation. We have surpassed 96 million page hits.



Snowfall and snowcover is diminishing as the earth warms

This is one claim that has been repeated for decades even as nature shows very much the opposite trend with unprecedented snows even to the big coastal cities. Every time they repeat the claim, it seems nature ups the ante more.

They have eventually evolved to crediting warming with producing greater snowfall because of increased moisture, but the events of recent years have usually occurred in colder winters, with high snow-to-water-equivalent ratios in dry arctic air.

Snowcover in the Northern Hemisphere, North America and Eurasia has been increasing since the 1960s in the fall and winter but declining in the spring and summer. Methodology changes at the turn of this century may be responsible for part of the warm season differences.

Claims and a reality check

On March 20, 2000, the UK Independent reported that “Snowfalls are just a thing of the past.” They quoted David Viner of the Climatic Research Unit (CRU) at the University of East Anglia: global warming was simply making the UK too warm for heavy snowfalls, and “within a few years winter snowfall will become a very rare and exciting event. Children just aren’t going to know what snow is,” he said.

Similarly, David Parker, at the UK’s Hadley Centre for Climate Prediction and Research, said that eventually British children could have only “virtual” experience of snow via movies and the Internet.

The Union of Concerned Scientists claimed in 2006 that winters were becoming warmer and less snowy. They published the results of the study on their climatechoices website: “Across the globe, and here in the Northeast, the climate is changing. Records show that spring is arriving earlier, summers are growing hotter, and winters are becoming warmer and less snowy. These changes are consistent with global warming, an urgent phenomenon driven by heat-trapping emissions from human activities.”

The IPCC and US government reports through 2007 had projected snows would become much less common as the climate warms, especially in the cities. The UCS held a media workshop in the late summer of 2007 on Mt. Washington promising a dire future for the winter sports and maple sugar industries. That next winter, all-time seasonal snow records were set for snowfall in the northeast from Concord to Caribou (and all through the western US up to Alaska).

The Technical Support Document for the EPA in 2009 (page 29) stated: “Rising temperatures have generally resulted in rain rather than snow in locations and seasons where climatological average (1961-1990) temperatures were close to 0°C.”

The latest CSSR NCA had as a key finding: “There has been a trend toward earlier snowmelt and a decrease in snowstorm frequency on the southern margins of climatologically snowy areas (medium confidence). Winter storm tracks have shifted northward since 1950 over the Northern Hemisphere (medium confidence).”

In 2008, Princeton environmentalist Michael Oppenheimer and RFK Jr. both bemoaned that their children in the DC area would never get to enjoy sledding like they did as youngsters in the 1960s.

That very next winter, the DC area and the entire Mid-Atlantic had record snowfall in what was called Snowmageddon.


Starting in 2008, the UK and much of Europe and the Northern Hemisphere began receiving snow and cold at levels not seen since the days of Charles Dickens in the early 1800s. December 2010 was the second coldest December in the Central England temperature data back to 1659.


In the United States, the winter of 2013/14 was in places in the Great Lakes the coldest and snowiest since the 1800s. Along the east coast we have seen record-setting snow years and 24 major-impact snowstorms in the 10-year period ending 2016/17. No other decade had more than 10.


But the media continue to be oblivious to the real data as the hype machine continues to sell its story despite the failures. The New York Times ran an article in February 2014 titled “The End of Snow,” which described how snow would soon be a distant memory and our kids would never see it except in newsreels.

In January 2015, after the winter got off to a slow snow start, the UCS, from the UNH, repeated their annual warning about the climate-change-induced death of the ski and maple sugar industries. But then 2014/15 set records for snowfall in Boston (in records back to 1872) and at many other locations in the northeast into southeastern Canada. In six weeks, over 100 inches of snow fell in the Boston area, with 110.6 inches for the winter. It was also the coldest January-to-March period, and in much of the northeast, February 2015 was the coldest month on record.

The recent snowy winters have the 10-year running mean of Boston snowfall at its highest in the entire record back to the 1880s.
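A trailing running mean of this kind is easy to sketch; the seasonal snowfall figures below are placeholders for illustration, not Boston's actual record:

```python
# Hypothetical seasonal snowfall totals in inches, one value per winter
# (the real Boston record runs back to the 1880s).
seasonal_snow = [42.5, 58.9, 27.1, 63.4, 80.1, 110.6, 36.1, 47.6, 59.9, 61.2, 53.8]

def running_mean(values, window=10):
    """Trailing running mean: one value per season once `window` seasons exist."""
    return [sum(values[i - window + 1 : i + 1]) / window
            for i in range(window - 1, len(values))]

# With 11 seasons and a 10-year window, this yields two running-mean values,
# one ending at each of the last two winters.
print(running_mean(seasonal_snow))
```

The window length and the input numbers are assumptions; the point is only the mechanics of the smoothing used for such charts.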


It appears that for the eastern areas, the 2014/15 winter’s snowblitz may have delivered the most snow since perhaps 1717. That year, the snow was so deep that people could only leave their houses from the second floor, implying actual snow depths of as much as 8 feet or more.


The driver for the 2014/15 winter was the same as in the frigid winters of 1916/17, 1917/18, 1976/77 and 1977/78, 1993/94 and 2002/03 - a pool of warm water in the northeast Pacific extending south along the entire west coast.


Alarmists claimed increased snow is consistent with global warming because warmer air holds more moisture. In actual fact, only 1 of the 14 years with more than 60 inches of snow in Boston was warmer than normal.


Snow is favored in COLD winters and increases with cooling, not warming. In the 39 days in the heart of the 2014/15 winter when Boston had 100.2 inches of snow, the melted precipitation was 5.69 inches, a ratio of 17.6 to 1. A typical snow-to-melted-precipitation ratio is 10-12 to 1. The big snows in recent years have come with unusually cold temperatures. Seasonal snows are high in cold winters, low in warmer winters.
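The quoted ratio is straightforward arithmetic; a minimal check using the figures from the paragraph above:

```python
snowfall_in = 100.2   # inches of snow over the 39-day stretch (from the text)
melted_in = 5.69      # inches of melted, liquid-equivalent precipitation (from the text)

ratio = snowfall_in / melted_in
print(f"snow-to-liquid ratio: {ratio:.1f} to 1")  # prints "snow-to-liquid ratio: 17.6 to 1"
```

A ratio well above the typical 10-12 to 1 indicates dry, fluffy snow of the kind that falls in cold arctic air.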

They also claim that snowcover is shrinking, especially in the spring and summer, but they neglect to mention that snowcover is increasing in the fall and winter.



The spring snow extent is diminishing. But some of this may be due to a change in the methodology for snow measurement, from a manual evaluation to an automated albedo (reflectivity) method; in 2000, NOAA issued a user-beware warning not to compare warm-season snow cover to values before 2000.


Posted on 11/22 at 11:00 PM

Monday, November 06, 2017
Early rebuttals of the scientifically flawed NCA

As one might expect from a UN-inspired report, especially one with such a weak team of Lead Authors, this one will set a new record among these reports for bad science. The key findings often do not reflect the material contained within, much as was the case with the UN reports and prior government-sponsored reports.

There are a few early responses.

2016 National Climate Assessment, a Self-Falsifying Prophecy

Guest post by David Middleton on WUWT

There has been some recent “buzz” about the upcoming Fourth National Climate Assessment (NC4), including some moonbat conspiracy theories that the Trump administration will try to suppress or otherwise interfere with the scientific integrity of the report.  The New York Times has already been forced to essentially retract such a claim in a recent article.

If NC4 actually builds upon 2014’s NC3, EPA Administrator Pruitt’s Red Team will have even more material to work with.

Fourth National Climate Assessment

Development of the Fourth National Climate Assessment (NCA4) is currently underway, with anticipated delivery in late 2018. Below you will find information related to NCA4, including a list of chapters, explanation of author roles, and opportunities to participate in the process.

What’s New in NCA4

NCA4 will build upon the successes of the Third National Climate Assessment. Find out more:


“NCA4 will build upon the successes of the Third National Climate Assessment"… What success?

Here’s a link to the NCA3 overview.

The first “sciencey” graphic is titled:  Projected Global Temperature Change.


And then I just enlarged the Epic Failure bits to get the Red Team’s QED:



Then there is Tony Heller’s attack on the temperature claims.

Very High Confidence Of Fraud In The National Climate Assessment

Katharine Hayhoe and her partners in crime have officially released their National Climate Assessment, which includes this graph, which claims “Record Warm Daily Temperatures Are Occurring More Often”.


The first thing I noticed is that the text in the report does not match the graph. They say:

The Dust Bowl era of the 1930s remains the peak period for extreme heat in the United States

Yet the graph right below it does not show the 1930s as being hot.


Unfortunately for Katharine and her band of climate fraudsters, I have software which does this calculation. The graph below is the correct version. Their graph is more or less correct after 1970 - but the pre-1970 data is completely fraudulent. They removed all of the hot weather from 1930 to 1954.  NOAA does not make adjustments to daily temperatures, so they can’t use that excuse.


The report claims:

Record Warm Daily Temperatures Are Occurring More Often

That is an outright lie.  Record warm daily maximum temperatures have decreased sharply since 1930 - the start date of their graph.


The number of record daily minimums has also decreased.


The US climate is getting milder, with fewer very hot or very cold days.


So why the big spikes in 2012 and 2016?


This is a classic divide-by-zero error: ratios become unstable when the denominator becomes small, and the numbers are meaningless. No serious scientist would release a wildly flawed and dishonest graph, as Katharine Hayhoe does on a consistent basis.
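The instability being described is easy to demonstrate with made-up numbers; none of the record counts below come from the report, they only illustrate the arithmetic:

```python
# Hypothetical counts of record daily highs vs. record daily lows.
# When the denominator (record lows) is large, the ratio is stable;
# when it shrinks toward zero, tiny absolute changes swing the ratio wildly.
counts = {"year A": (420, 400), "year B": (60, 4), "year C": (55, 2)}

ratios = {}
for year, (record_highs, record_lows) in counts.items():
    ratios[year] = record_highs / record_lows
    print(f"{year}: {record_highs} highs / {record_lows} lows "
          f"= {ratios[year]:.1f} to 1")
```

From year B to year C the record-low count drops by only 2, yet the ratio nearly doubles (15 to 27.5), which is the "big spikes" behavior the denominator argument predicts.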

See Tony’s video:

See one more detailed assessment by Judith Curry.

Posted on 11/06 at 08:21 PM

Monday, October 09, 2017
Misuse of the scientific method has led to peer review failures with significant implications

By Joseph D’Aleo, CCM, AMS Fellow

See this excellent video from Tony Heller challenging the climate mafia and their continuing adjustment of data. No scientific method is being applied to the work of Government agencies and their cohorts like RSS.


The scientific method in science is a well established iterative process. The scientific method starts with a theory or hypothesis. The data needed to test it and all possible factors involved are identified and gathered. The data is processed and the results rigorously tested. The data and methods are made available for independent replication. Reviewers for the proposed theory must have the requisite skills in the topic and in the proper statistical analysis of the data to judge its validity. If it passes the tests and replication efforts, a conclusion is made and the theory may be turned into a paper for publication. If it fails the tests, the hypothesis or theory must be rethought or modified.

Astronomer Carl Sagan, professor and director of Cornell University’s Laboratory for Planetary Studies and host of the series Cosmos: A Personal Voyage, explained the scientific method and encouraged critical and skeptical thinking in his 1995 book The Demon-Haunted World: Science as a Candle in the Dark. He emphasized the importance of recognizing the difference between what is considered valid science and what is in reality pseudoscience.

Sagan, like fellow Cornell physicist and lecturer Richard Feynman, argued that when new ideas are offered for consideration, they should be tested by means of skeptical thinking and should stand up to rigorous questioning. Feynman lectured:

“If a theory or proposed law disagrees with experiment (or observation), it’s wrong. In that simple statement is the key to science. It doesn’t make any difference how beautiful your guess is, it doesn’t matter how smart you are who made the guess, or what your name is.... If it disagrees with experiment, it’s wrong. That’s all there is to it.”

Sir Karl Popper, an Austrian-British philosopher and professor, is generally regarded as one of the greatest philosophers of science of the 20th century. Popper is known for his rejection of the classical inductivist views on the scientific method in favor of empirical falsification: a theory in the empirical sciences can never be proven, but it can be falsified, meaning that it can and should be scrutinized by decisive experiments.


It should be noted that a refutation of a previously accepted theory, even one that has been published and widely accepted, can follow the same route to review and publication, as Albert Einstein observed:


The peer review process is failing due to political and economic pressures that have altered the scientific method to virtually ensure a politically correct or economically fruitful theory can never fail.

When the tests fail, instead of rethinking the theory or including other factors, there is an alarming tendency to modify input data to more closely fit the theory or models.


Often, too, the authors and reviewers do not have a proper understanding of all the factors involved, or the mathematical skills needed to properly evaluate the results. And even if they do, the input data and methods are generally not made available to the reviewers for replication. In many cases, forecasts are made for decades or even centuries into the future, so true validation is not possible, a luxury those of us who must forecast in shorter time frames (days to seasons) do not enjoy.

Too often, the reviewers who serve as final gatekeepers are not only incapable of this kind of rigorous review, they are also biased, speeding politically correct or economically beneficial work to publication while blocking, or at least ‘slow walking,’ work that challenges the so-called consensus science or their own often ideologically driven beliefs.

As Dr. Michael Crichton wrote: “Let’s be clear: the work of science has nothing whatever to do with consensus. Consensus is the business of politics. In science, consensus is irrelevant. What is relevant is reproducible results. The greatest scientists in history are great precisely because they broke with the consensus (Galileo, Newton, Einstein, etc.).”


So when greenhouse climate models fail, modelers don’t revisit the theory but instead try to find the right data to fit the model. All data today are adjusted with models, with the goal of addressing data errors, changes in location or instrumentation, changing station distributions, or missing data and station closures. Once this adjustment process starts, it becomes increasingly possible to find ways to mine the desired results from the data.

With the climate models, there has been an increasingly large divergence from balloon, satellite and surface reanalysis data sets over the last 20 years. The one model that follows the observed temperatures is a Russian model with roughly half the greenhouse forcing and improved ocean modeling.


John Christy (2017) has shown that models without greenhouse warming agreed perfectly with atmospheric (tropical) observations.


This kind of refutation should, if scientists abided by the scientific method, spark an effort to revisit the theory, but that is too politically incorrect. This kind of ideologically, politically or economically driven thinking is pervasive across the sciences, atmospheric and medical alike.


There is increasing proof that the traditional journal peer review process is broken, in both the medical and scientific fields.

See this example of one such falsified report that the author worries is a part of an epidemic of agenda-driven science by press release and falsification that has reached crisis proportions.

Other reports show an alarming number of papers having to be retracted.  Springer is retracting 107 papers from one journal after discovering they had been accepted with fake peer reviews (here).

Result-oriented corruption of peer review in climate science was proven by the Climategate emails.

In the journals, a small set of gatekeepers blocks anything that goes against the editorial biases of those journals. Conversely, these journals and their reviewers do not provide a thorough due-diligence review of work they tend to agree with ideologically. They are engaged in orthodoxy enforcement.

In an essay, “Has Science Lost its Way?”, Michael Guillen, Ph.D., wrote about science’s reproducibility crisis:

For any study to have legitimacy, it must be replicated, yet only half of medical studies celebrated in newspapers hold water under serious follow-up scrutiny - and about two-thirds of the “sexiest” cutting-edge reports, including the discovery of new genes linked to obesity or mental illness, are later “disconfirmed.” Though erring is a key part of the scientific process, this level of failure slows scientific progress, wastes time and resources and costs taxpayers in excess of $28 billion a year, writes NPR science correspondent Richard Harris.

The single greatest threat to science right now comes from within its own ranks. Last year Nature, the prestigious international science journal, published a study revealing that “More than 70% of researchers have tried and failed to reproduce another scientist’s experiments, and more than half have failed to reproduce their own experiments.”

The inability to confirm research that was published in highly respected, peer-reviewed journals suggests something is very wrong with how science is being done.

The crisis afflicts even science’s most revered ‘facts,’ as cancer researchers C. G. Begley and Lee Ellis discovered. Over an entire decade they put fifty-three published “landmark” studies to the test; they succeeded in replicating only six - an 11% success rate.
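The 11% figure follows directly from the counts quoted above (a trivial arithmetic check, nothing more):

```python
# Sanity check of the replication figures cited by Begley and Ellis:
# 53 "landmark" studies put to the test, only 6 successfully replicated.
attempted = 53
replicated = 6

success_rate = replicated / attempted
print(f"Replication success rate: {success_rate:.1%}")  # 11.3%, i.e. the ~11% cited
```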

A major culprit, they discovered, is that many researchers cherry-picked the results of their experiments - subconsciously or intentionally - to give the appearance of success, thereby increasing their chances of being published.

“They presented specific experiments that supported their underlying hypothesis, but that were not reflective of the entire data set,” report Begley and Ellis, adding this shocking truth: “There are no guidelines that require all data sets to be reported in a paper; often, original data are removed during the peer review and publication process.”

Another apparent culprit is that - and it’s going to surprise most of you - too many scientists are actually never taught the scientific method. As graduate students, they take oodles of courses in their chosen specialty; but their thesis advisors never sit them down and indoctrinate them on best practices. Consequently, remarks University of Wisconsin-Madison biologist Judith Kimble: “They will go off and make it worse.”

This observation seems borne out by the Nature study, whose respondents said the three top weaknesses behind science’s reproducibility crisis are: 1) selective reporting, 2) pressure to publish, and 3) low statistical power or poor analysis. In other words, scientists need to improve on practicing what they preach, which is: 1) a respect for facts - all of them, not just the ones they like, 2) integrity, and 3) a sound scientific method.

The attendees of the so-called ‘Earth Day’ March for Science made a lot of noise about wanting more money and respect from the public and government - what group wouldn’t want that? But nary a whisper was heard from them or the media about science’s urgent reproducibility crisis. Leaving unspoken this elephant-sized question: If we aren’t able to trust the published results of science, then what right does it have to demand more money and respect, before making noticeable strides toward better reproducibility?

Michael Guillen, Ph.D., former Science Editor for ABC News, taught physics at Harvard and is the author of “The Null Prophecy.”


Although well received and widely distributed, our recent press release and research paper hit a raw nerve with alarmists. The research sought to validate the current estimates of Global Average Surface Temperatures (GAST) using the best available relevant data. The conclusive findings were that the three GAST data sets are not a valid representation of reality. In fact, the magnitude of their historical data adjustments, which removed their cyclical temperature patterns, is totally inconsistent with published and credible U.S. and other temperature data.

Thus, despite current claims of record setting warming, it is impossible to conclude from the NOAA, NASA and UK Hadley CRU GAST data sets that recent years have been the warmest ever.

Finally, since GAST data set validity is a necessary condition for EPA’s CO2 Endangerment Finding, it too is invalidated by these research findings. This means that EPA’s 2009 claim that CO2 is a pollutant has been decisively invalidated by this research.

We had shown in prior research reports here and here how, even if you ignore the adjustments, the changes observed can be explained entirely by natural factors (ocean cycles, solar cycles and volcanism). If one considers the urban heat island contamination of the surface data, temperatures may actually have been declining since the 1930s in cyclical fashion, very much in line with the record highs set in that era.

The media fact checkers, who often serve as enforcers of orthodoxy, could not meaningfully question the data or science presented, but they challenged the claim that the work was ‘peer reviewed’ (in the sense the peer review process has been defined today by the ‘advocacy’ journals - really ‘pal review’).

Our research reports were rigorously peer reviewed by top scientists. The reports follow the approach long used in industry, often for internal use. They were prepared by author teams with the requisite skills in proper data collection, a deep understanding of the scientific factors involved, and the statistical skills to evaluate what best explains the observed changes.

To abide by the scientific method, the work must be capable of being replicated. Our highly qualified reviewers, who endorsed the work, are capable of evaluating it scientifically and/or statistically. Their approval includes a willingness, even an eagerness, to endorse it. The data and the methodology are available for others to replicate.

Our approach follows the long accepted application of the scientific method in a world where science is too politicized.

Posted on 10/09 at 07:30 AM

Monday, October 02, 2017
Chief science adviser attacks academic ‘arrogance’ on policy

Times Higher Education

The chief science adviser to the prime minister of New Zealand has accused scientists of displaying “hubris” and “arrogance” when they comment on government policy.


Sir Peter Gluckman, who also chairs the International Network for Science Advice to Governments, leveled a series of sharp criticisms at researchers and science organizations during an event in Brussels that debated the role of policy and evidence in a “post-fact” world.


He argued that scientists needed to appreciate that politicians made their decisions based on values as well as scientific evidence.

“Individual scientists, professional and scientific organizations too often exhibit hubris in reflecting on policy implications of science,” Sir Peter told delegates at “EU for facts: evidence for policy in a post-fact world”, held on 26 September.

“This arrogance can become the biggest enemy of science effectively engaging with policy - the policy decisions inevitably involve dimensions beyond science.”

Scientists needed to appreciate that political ideology, financial and diplomatic constraints, and “electoral contracts” also had to be taken into account by politicians, Sir Peter said. “It is important that [scientific] knowledge is provided [to policymakers] in a way that does not usurp the ability of [the] policy process to consider these broader dimensions: otherwise trust in advice can be lost as it becomes perceived as advocacy,” he argued.

He also said that he avoided using the “somewhat arrogant” term “evidence-based policy”, preferring “evidence-informed” instead. Meanwhile, “too often academy reports are focused on academic demonstration rather than meeting policy needs or answering an unasked question”, he added.

Similar warnings have come from other figures in science. Last year, Jeremy Berg, the editor-in-chief of Science, said that academics have too often ventured into giving policy prescriptions rather than just explaining the evidence, for example in the area of climate change.

Although he named no names, Sir Peter also warned that “individual scientists” were now using their “scientific standing” to make claims “well beyond the evidence and their expertise”. Universities may also “over-hype” their science, he added.

In addition, the pressures of “performance measurement, bibliometrics, and the quest for societal and industrial impact” also have the potential to undermine public trust in science, he said, “due to perceived or actual conflicts of interest and the potential to affect the behaviour of individual scientists”.

At the same conference, Carlos Moedas, European commissioner for research, science and innovation, argued that to combat a “crisis of confidence” in science, there needed to be online “places of trust for scientific advice”, just as sites like Mayo Clinic or WebMD were trusted sources of medical advice.

Such sites would be “where citizens know that science is genuine. Where the process is explained. Where they can check the sources. Where they can access the data themselves,” he said.

“So I believe in the future there will be two types of internet. The one you trust and the one you don’t,” he added.


Prager University has excellent short videos on the topic.

Posted on 10/02 at 03:33 PM

Sunday, September 24, 2017
New York Times and Arctic Ice

The New York Times did a story on arctic ice this past week.

See in this video how Tony Heller responds to this story

See in this Icecap story how the changes are cyclical as Tony shows with data and news accounts including the New York Times.

We showed how the water from the Atlantic and Pacific enter the arctic underneath the floating ice. When the Atlantic and Pacific are in their warm modes, this leads to thinning ice and reduced summer coverage.


The UAF IARC showed how warm water in the Atlantic warms the arctic and reduces ice.


We showed how the Atlantic and Pacific combined warmth and coolness corresponds with arctic temperatures.


In the record-setting (since satellite monitoring began in 1979) summer melt season of 2007, NSIDC scientists noted the importance of both oceans to arctic ice:

“One prominent researcher, Igor Polyakov at the University of Alaska Fairbanks, points out that pulses of unusually warm water have been entering the Arctic Ocean from the Atlantic, which several years later are seen in the ocean north of Siberia. These pulses of water are helping to heat the upper Arctic Ocean, contributing to summer ice melt and helping to reduce winter ice growth.

Another scientist, Koji Shimada of the Japan Agency for Marine-Earth Science and Technology, reports evidence of changes in ocean circulation in the Pacific side of the Arctic Ocean. Through a complex interaction with declining sea ice, warm water entering the Arctic Ocean through Bering Strait in summer is being shunted from the Alaskan coast into the Arctic Ocean, where it fosters further ice loss.

Many questions still remain to be answered, but these changes in ocean circulation may be important keys for understanding the observed loss of Arctic sea ice.”

Dr. Willie Soon shows a much better correlation of solar TSI and arctic temperatures than CO2.


Posted on 09/24 at 01:49 PM

Monday, September 18, 2017
Don’t believe the hurricane hype machine

By Michael Sununu, Union Leader

LET US SET hyperbole aside. Climate change is NOT making hurricanes stronger or more frequent.

Statements to the contrary by climate scientists like Michael Mann and Tom Peterson are not based on objective analysis of historical data. Most of the claims by the alarmists are based on hypothetical model projections, an ill-informed understanding of weather, and a desire to ignore the historical precedents which indicate the current environment is no different than the past.

Alarmists claim flooding caused by Harvey was the worst ever. In fact, those areas of coastal Texas have seen similar flooding in the past. In 1978, tropical storm Amelia dumped 48 inches of rain on Texas, and the following year, Claudette inundated the state with 54 inches, with one location in Alvin, Texas, receiving 43 inches in a 24-hour period (a record).

These two storms occurred during the late 1970s’ cooler climate. In 1954, Hurricane Alice dropped huge amounts of rain in the Rio Grande Valley. In 1935, Harris County saw its worst floods, with waters reaching the second and third stories in Houston. Harvey was not unprecedented.

The forces behind these huge rain totals are local weather patterns that stall storm systems over the state. WeatherBell’s chief forecaster, Joe Bastardi, pointed out that a major trough extended south and trapped Harvey just onshore. This allowed the storm to suck up warm moisture from the Gulf and continue to drop it in the same area over several days. Thirty- to forty-inch rain totals are not a common occurrence, but they do occur every decade or so in Texas.

The second claim is that the U.S. is seeing stronger hurricanes than in the past, because climate change is raising hurricane intensity. Nothing could be further from the truth. Looking at the historical records, Roger Pielke Jr. has noted that in the 44-year period between 1926 and 1969, fourteen Category 4 or 5 hurricanes made landfall in the United States. Over the next 47 years, between 1970 and 2017, just four hit the United States, and that includes Harvey this year. We are seeing fewer major hurricanes hit the U.S., not more.
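Normalizing the two counts cited above for the length of each period makes the comparison concrete. This is a back-of-the-envelope sketch using only the figures quoted in the article (14 landfalls over the 44 years 1926-1969, four over the 47 years 1970-2017), not an analysis of the underlying hurricane record:

```python
# Back-of-the-envelope rate comparison using only the counts cited above.
# (landfalls, years) per the article's figures attributed to Roger Pielke Jr.
periods = {
    "1926-1969": (14, 44),
    "1970-2017": (4, 47),
}

for label, (landfalls, years) in periods.items():
    per_decade = landfalls / years * 10
    print(f"{label}: {per_decade:.2f} Cat 4-5 U.S. landfalls per decade")
# 1926-1969: 3.18 per decade
# 1970-2017: 0.85 per decade
```

On these numbers, the earlier period saw major landfalls at nearly four times the rate of the recent one, which is the article's point.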

Others have noted that Irma was a very intense hurricane reaching very low pressures, and this is true. Irma reached as low as 914mb and wind speeds of 185 mph. Hurricane Wilma in 2005 holds the record with the lowest pressure in the Atlantic basin at 882mb. When you look at historical storms, Wilma was comparable to the Great 1935 Labor Day hurricane that made landfall with pressure at 892mb, the only storm to make landfall below 900mb. Since there were no Hurricane Hunter aircraft in that era, it is quite possible that the 1935 storm had pressures even lower than Wilma.

There is evidence that the Great Hurricane of 1780, which destroyed much of the eastern Caribbean, may have had windspeeds in excess of 200 mph. It destroyed every house and fort on Barbados, stripped the bark off trees, and killed over 22,000 people in the region.

Clearly, Mother Nature has always been very capable of generating storms far more intense than what we have seen recently.

Climate alarmists don’t want to talk about the uncomfortable fact we just went through an historic 12-year period without a major hurricane (Category 3 or higher) making landfall in the United States. We were blessed for more than a decade without a devastating hurricane wreaking havoc on our shores, so having two hurricanes making landfall this year, while unfortunate, should not be unexpected.



What is disappointing, unnecessary, and counterproductive is the attempt to somehow link carbon dioxide emissions to hurricane activity. Both the IPCC and NOAA have made clear statements that there is no scientific evidence that links CO2 levels with extreme weather, including tropical systems. Any linkage is purely hypothetical, and is more likely a political rather than a scientific determination.

In the real world, fossil fuels are what make our communities resilient. They provide us the concrete to reinforce our homes, the fuel to help move us away from danger, the materials to preserve and rebuild our infrastructure, and the electricity to bring back our communities when the storm passes. They power the rescue boats and aircraft, help deliver the food and water to stricken communities, and power the chainsaws that allow us to clear the debris.

Carbon dioxide is not powering these storms, but it does make our lives better both before they hit, and after they leave disaster in their wake.

Michael Sununu is a consultant with Sununu Enterprises LLC and lives in Newfields.

Posted on 09/18 at 06:41 AM

Sunday, September 17, 2017
Finally, some commonsense western fire policies

New DOI and USDA policy to cut overgrown, diseased, dead and burned trees is long overdue

Paul Driessen

President Trump promised to bring fresh ideas and policies to Washington. Now Interior Secretary Ryan Zinke and Agriculture Secretary Sonny Perdue are doing exactly that in a critically important area: forest management and conflagration prevention. Their actions are informed, courageous and long overdue.

Westerners are delighted, and I’ve advocated such reforms since my days on Capitol Hill in the 1980s.

As of September 12, amid this typically long, hot, dry summer out West, 62 major forest fires are burning in nine states, the National Interagency Fire Center reports. The Interior Department and Ag Department’s Forest Service have already spent over $2 billion fighting them. That’s about what they spent in all of 2015, previously the most costly wildfire season ever, and this season has another month or more to go. The states themselves have spent hundreds of millions more battling these conflagrations.

Millions of acres of forest have disappeared in smoke and flames - 1.1 million in Montana alone. All told, acreage larger than New Jersey has burned already. However, even this hides the real tragedies.

The infernos exterminate wildlife habitats, roast eagle and spotted owl fledglings alive in their nests, immolate wildlife that can’t run fast enough, leave surviving animals to starve for lack of food, and incinerate organic matter and nearly every living creature in the thin soils. They turn trout streams into fish boils, minus the veggies and seasonings. Future downpours and rapid snowmelts bring widespread soil erosion into streambeds. Many areas will not grow trees or recover their biodiversity for decades.

Most horrifically, the conflagrations threaten homes and entire communities. They kill fire fighters and families that cannot get away quickly enough, or get trapped by sudden walls of flames.

In 2012, two huge fires near Fort Collins and Colorado Springs, Colorado burned 610 homes, leaving little more than ashes, chimneys and memories. Tens of thousands of people had to be evacuated through smoke and ash that turned daytime into choking night skies. Four people died. A 1994 fire near Glenwood Springs, CO burned 14 young firefighters to death.

These are not “natural” fires of environmentalist lore, or “ordinary” fires like those that occur in state and privately owned and managed forests. Endless layers of laws, regulations, judicial decrees and guidelines for Interior and Forest Service lands have meant that most western forests have been managed like our 109 million acres of designated wilderness: they are hardly managed at all.

Environmentalists abhor timber cutting on federal lands, especially if trees might feed profit-making sawmills. They would rather see trees burn, than let someone cut them. They constantly file lawsuits to block any cutting, and too many judges are all too happy to support their radical ideas and policies.

Thus, even selective cutting to thin dense stands of timber, or remove trees killed by beetles or fires, is rarely permitted. Even fire fighting and suppression are often allowed only if a fire was clearly caused by arson, careless campers or other human action - but not if lightning ignited it. Then it’s allowed to burn, until a raging inferno is roaring over a ridge toward a rural or suburban community.

The result is easy to predict. Thousands of thin trees grow on acreage that should support just a few hundred full-sized mature trees. Tens of billions of these scrawny trees mix with 6.3 billion dead trees that the Forest Service says still stand in eleven western states. Vast forests are little more than big trees amid closely bunched matchsticks and underbrush, drying out in hot, dry western summers and droughts - waiting for lightning bolts, sparks, untended campfires or arsonists to start super-heated conflagrations.

Flames in average fires along managed forest floors might reach several feet in height and temperatures of 1,472 F (800 C), says Wildfire Today. But under extreme conditions of high winds and western tinderboxes, temperatures can exceed 2,192 F (1,200 C), flame heights can reach 165 feet (50 meters) or more, and fires can generate a critter-roasting 100,000 kilowatts per meter of fire front. Wood will burst into flame at 572 F. Aluminum melts at 1,220 degrees, silver at 1,762 and gold at 1,948 F!
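The paired Fahrenheit/Celsius figures above can be verified with the standard conversion (a minimal sketch; the input values are the ones quoted in the paragraph):

```python
def c_to_f(celsius: float) -> float:
    """Convert degrees Celsius to degrees Fahrenheit."""
    return celsius * 9 / 5 + 32

# The paired figures quoted above:
print(c_to_f(800))   # 1472.0 F - average managed-forest fire
print(c_to_f(1200))  # 2192.0 F - extreme high-wind tinderbox conditions
```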

Most of this heat goes upward, but super-high temperatures incinerate soil organisms and organic matter in thin western soils that afterward can support only stunted, spindly trees for decades.

These fires also emit prodigious quantities of carbon dioxide, fine particulates and other pollutants - including mercury, which is absorbed by tree roots from rocks and soils that contain this metal, and then lofted into the sky when the trees burn.

Rabid greens ignore these hard realities - and divert discussions back to their favorite ideological talking points. The problem isn’t too many trees, they insist. It’s global warming and climate change. That’s why western states are having droughts, long fire seasons, and high winds that send flames past fire breaks.

Global warming, global cooling and climate change have been part of the Earth and human experience from time immemorial. Natural climate fluctuations brought the multi-decade Anasazi drought, the Dust Bowl and other dry spells to our western states. To suggest that this summer’s heat and drought are somehow due to mankind’s fossil fuel use and related emissions is deliberately delusional nonsense.

Neither these activists nor anyone in Al Gore’s climate chaos consortium can demonstrate or calibrate a human connection to droughts or fires. Rants, rhetoric and CO2-driven computer models do not suffice. And even if manmade (plant-fertilizing) carbon dioxide does play a role amid the powerful natural forces that have always controlled climate and weather, reducing US fossil fuel use would have zero effect.

China, India, Indonesia and Vietnam alone are building 590 new coal-fired power plants right now, on top of the hundreds they have constructed over the past decade. Overall, more than 1,600 new coal generators are planned or under construction in 62 countries. People in developing countries are also driving far more vehicles and making great strides in improving their health and living standards. They will not stop.

Western conflagrations jump fire breaks because these ferocious fires are fueled by the unprecedented increase in combustibles that radical green policies have created. These monstrous fires generate their own high winds and even mini tornados that carry burning branches high into the air, to be deposited hundreds of feet away, where they ignite new fires. It has nothing to do with climate change.

Remove some of that fuel - and fires won’t get so big, hot, powerful and destructive. We should also do what a few environmentalist groups have called for: manage more areas around buildings and homes, clearing away brush that federal agencies and these same groups have long demanded be left in place.

Finally, we should be using more of the readily available modern technologies like FireIce from GelTech Solutions. They can suppress and extinguish fires, and protect homes, much better than water alone.

The last bogus eco-activist claim is that “fire isn’t destruction; it’s renewal. It creates stronger, more diverse ecosystems.” That may be true in managed forests, timber stands in less tinder-dry states, and forests that have undergone repeated, non-devastating fires. For all the reasons presented above, it is not true for government-owned and mismanaged forests in our western states.

Over 50 million acres (an area equal to Minnesota) are at risk of catastrophic wildfires. Right now, we are spending billions of dollars that we don’t have, that we should not have to spend fighting these monstrous killer blazes, and that should instead be available to improve forests and parks and fund other vital programs.

These forests could and should create jobs and generate revenues in states where far too many lands, timber, oil and minerals have been placed off limits - primarily by urban politicians, judges and radical activists who seem determined to drive people off these western lands, turn them into playgrounds for the wealthy, and roll back other Americans’ living standards and well-being. Cleaning out dead, diseased, burned, overgrown trees would bring countless benefits. It would make our forests healthy again.

Above all, the new Interior-Agriculture approach would demonstrate that Rural Lives Matter.

Paul Driessen is senior policy analyst for the Committee For A Constructive Tomorrow, and author of Eco-Imperialism: Green power - Black death and other books on the environment.


Chris Mooney, WAPO’s alarmist science editor, asked for a comment about my being considered for the Science Advisory Board of Pruitt’s EPA.

When I was working on a doctoral traineeship grant in Air Resources at NYU in the 1970s, we had real air and water pollution issues. As a nation, with the EPA in its early days, we did a commendable job cleaning up our air and water with reasonable cleanup measures on energy plants and automobiles. We have more than met our goal of reducing levels of criteria pollutants below the reasonable standards set years ago. Carbon pollution (particulates) is a problem in China and India, but we in the U.S. have reduced particulate levels 50% in the last few decades and are now well below the aggressive standards we set. We rarely see air pollution advisories today, something very common decades ago.

This hyper focus on controlling CO2 in recent decades is immoral - harmful to our nation, its people and their future. CO2 is a beneficial trace gas (0.04% by volume). It is a plant fertilizer that has helped us increase crop yields 3 to 5 fold. The Endangerment Finding being used by the EPA (and courts) to regulate CO2 emissions has been invalidated by real-world data and needs to be scrapped and redone using good data and science, not failed models. Scientists and econometricians I worked with have done solid research reports that show natural factors are responsible for all the cyclical changes, and claims about unprecedented changes and extremes are not supportable.

Europe, Australia and the green-agenda states here, including California and the northeast RGGI states, which pushed the green agenda, are paying the price with electricity costs 2 to 6 times higher than we pay in most other states. This affects the poor and middle class the most (a hidden tax) and drives out industry, which costs jobs (Spain reached 27.5% unemployment before it stopped subsidies, and lost 4 real jobs for every temporary green job created). More than 25% of Britons, many of them pensioners, are in what is called energy poverty, having to choose between heating and eating. In Europe and Australia, bad policies have led to power blackouts or brownouts and a rush to build coal plants to make up for the intermittent and unreliable wind and solar output.

There is still a need for environmental protection. We need to refocus the EPA on ensuring there are no more Animas River or Flint water events. We need to help deal with the issues of mold and water pollution in areas where heavy flooding occurs, as we saw with Harvey and Irma. There is work to be done in waste management and in ensuring ground water is safe. We need to work with other government departments to develop saner forest management that reduces wildfire risks, with their smoke pollution, and reasonable spring and summer streamflow runoff policies to benefit agriculture and the cities that need clean water.

Note I had alluded to the need to deal with the forestry and water supply issues. Reporters don’t like it when you send a response in writing, as that does not allow them to paraphrase your response in a way that aligns with their message. He quickly called to throw other questions at me. I will report how it goes. A similar thing happened last week with an environmental newsletter trying to bash candidate scientists who could affect their green agenda, using out-of-context and false quotes. Even the Washington Examiner, a usually fair and balanced source of information, had an article with several inaccuracies. The author on LinkedIn pointed to Richard Branson and Arianna Huffington as influential interests, which explains a lot. Friends, the fake science reporters are everywhere.

Posted on 09/17 at 10:28 AM

Thursday, September 14, 2017
A Global Warming Red Team Warning: Do Not Strive For Consensus With The Blue Team

Dr. Roy Spencer

Now that the idea of a global warming Red Team approach to help determine what our energy policy should be is gaining traction, it is important that we understand what that means to some of us who have been advocating it for over 10 years - and also what it doesn’t mean.

The Red Team approach has been used for many years in private industry, the DoD, and the intelligence community to examine very costly decisions and programs in a purposely adversarial way: what if we are wrong about a certain program or policy change? What might the unintended consequences be?

In such a discussion we must make sure that we do not conflate the consensus on a scientific theory with the need to change energy policy, as is often done. (Just because we know that car wrecks in the U.S. cause 40,000 deaths a year doesn’t mean we should outlaw cars; and I doubt human-caused climate change has ever killed anyone).

While science can help guide policy, it certainly does not dictate it.

In the case of global warming and the role of our carbon dioxide emissions, the debate has too long been dominated by a myopic view that asserts the following 5 general points as indisputable. I have ordered them generally from scientific to economic.

1) global warming is occurring, will continue to occur, and will have dangerous consequences

2) the warming is mostly, if not totally, caused by our CO2 emissions

3) there are no benefits to our CO2 emissions, either direct (biological) or indirect (economic)

4) we can reduce our CO2 emissions to a level that we avoid a substantial amount of the expected damage

5) the cost of reducing CO2 emissions is low enough to make it worthwhile (e.g. mandating much more wind, solar, etc.)

ALL of these 5 points must be essentially true for things like the Paris Agreement (which President Trump has now withdrawn us from...for the time being) to make much sense.

But I would argue that each of the five points can be challenged, and not just with “fake science”. There is peer-reviewed and published analysis in science and economics that would allow one to contest each one of the five claims.

The Red Team Approach: It’s NOT a Redo of the Blue Team

John Christy and I are concerned that the Red Team approach, if applied to global warming, will simply be a review of the U.N. IPCC science on global warming. We are worried that it will only address the first two points (warming will continue, and it is mostly caused by CO2). Heck, even *I* believe we will continue to see modest warming, and that it might well be at least 50% due to CO2.

But a Red Team reaffirming those points does NOT mean we should “do something” about global warming.

To fully address whether we should, say, have regulations to reduce CO2 emissions, the Red Team must address all 5 of the “consensus” claims listed above, because that is the only way to determine if we should change energy policy in a direction different from that which the free market would carry it naturally.

The Red Team MUST address the benefits of more CO2 to global agriculture, “global greening” etc.

The Red Team MUST address whether forced reductions in CO2 emissions will cause even a measurable effect on global temperatures.

The Red Team MUST address whether the reduction in prosperity and increase in energy poverty are permissible consequences of forced emissions reductions to achieve (potentially unmeasurable) results.

The membership of the Red Team will basically determine the Team’s conclusions. It must be made up of adversaries to the Blue Team “consensus”, which has basically been the U.N. IPCC. If it is not adversarial in membership and in mission, it will not be a real Red Team.

As a result, the Red Team must not be allowed to be controlled by the usual IPCC-affiliated participants.

Only then can its report be considered an independent, adversarial analysis, to be weighed along with the IPCC report (and other non-IPCC reports) in guiding U.S. energy policy.

Posted on 09/14 at 08:01 AM
(1) TrackbacksPermalink

Tuesday, September 05, 2017
Powerful CAT5 Irma a major threat to Florida

By Joseph D’Aleo, CCM, AMS Fellow

A look back at Irma and Harvey:

Update: Irma will make landfall in south Florida Sunday as a major hurricane. Models which had teased a track along the eastern part of the state are now taking aim at the Keys and then the entire peninsula.


In its devastating journey through the northern Leeward Islands and the Virgin Islands as a CAT 5 storm, with winds estimated at 185 mph, it ranked second for wind and 10th for lowest pressure.



It has weakened to a CAT 4 with winds of 155 mph and a central pressure of 925 mb. After moving along the Cuban coast, it will turn north and likely intensify again.
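As a quick sanity check on the category figures quoted above, the wind-to-category mapping can be sketched in a few lines of Python. This is a minimal illustration, not an operational tool; the helper name `saffir_simpson_category` is mine, but the thresholds are the NHC's published post-2012 Saffir-Simpson wind cutoffs.

```python
def saffir_simpson_category(wind_mph: float) -> int:
    """Return the Saffir-Simpson category for a sustained wind speed in mph
    (0 means below hurricane strength), using the NHC's post-2012 thresholds."""
    thresholds = [(157, 5), (130, 4), (111, 3), (96, 2), (74, 1)]
    for cutoff, category in thresholds:
        if wind_mph >= cutoff:
            return category
    return 0

# Irma's reported intensities from this post:
print(saffir_simpson_category(185))  # 5 (peak, in the Leewards)
print(saffir_simpson_category(155))  # 4 (after weakening along Cuba)
```

Note that 155 mph sits just under the 157 mph CAT5 cutoff, which is why the weakened storm is reported as a strong CAT4.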

Join us to see all the details in posts and videos.


The name Irma is derived from the German ‘Ermen’, which means ‘whole’. Drop the ‘w’ and, in terms of pressure, it has become a literal ‘hole’ in the atmosphere. Irma has intensified to a powerful, very dangerous CAT5 hurricane with a central pressure of 929 mb and winds of 175 mph.

Here is the latest water vapor image.


Here is the projected 5-day track, skirting the islands on its way west.


It is being forced west by a strong high pressure in the Atlantic, a ‘king ridge’ as my meteorological buddy Al Lipson calls it. It won’t make its turn north (something all tropical storms look to do at the first opportunity) until the ridge weakens or the storm reaches the western end of the ridge. This is the first, and often the biggest, challenge in hurricane forecasting.


See also the cold trough to the north in the Great Lakes into the northeast. See how that cold trough north moves out into the Atlantic and the ridge collapses this weekend.


See how in the lower levels the trough to the north is a cold one, while the low above Irma is warm core. At 18,000 feet it is colder than -20C in the northern trough but more than +3C over Irma. Above-freezing temperatures at 18,000 feet are confined to the strongest heat waves and the stronger hurricanes.


See the US GFS model take Irma into Florida with pressures in the 880s mb. The strongest Atlantic Basin storm was Wilma in 2005 with 882mb central pressure in the Gulf. It weakened a bit to CAT3 when it slammed Florida.



See the ensemble members of the GFS mainly target Florida. It should be noted some models and ensemble members of the other models turn north before Florida and others take the storm west of Florida before turning north.

Did you ever miss an exit on the highway, or the street you wanted to turn on? You have to go on to the next exit, or the next street that is one-way in the right direction. Remember Katrina, which was rolling southwestward faster than the models, which had it turning just west of Florida toward Apalachicola? It went out into the central Gulf before turning, and the rest was history.


Just as Harvey ended the record nearly 12-year major hurricane drought for the U.S., if Irma makes landfall in Florida it will also end Florida’s record hurricane drought, which dates to Wilma in 2005.


Please take this storm very seriously, consider your options, and follow local emergency management directives. Join us where our team is doing frequent updates and videos.

Posted on 09/05 at 08:13 AM
(1) TrackbacksPermalink

Monday, September 04, 2017
Roger Pielke Jr.: “The Hurricane Lull Couldn’t Last”

The Hurricane Lull Couldn’t Last


The U.S. hadn’t been hit by a Category 3 or stronger storm since Katrina in 2005. We were overdue.
By Roger Pielke Jr.  Aug. 31, 2017 7:09 p.m. ET

Activists, journalists and scientists have pounced on the still-unfolding disaster in Houston and along the Gulf Coast in an attempt to focus the policy discussion narrowly on climate change. Such single-issue myopia takes precious attention away from policies that could improve our ability to prepare for and respond to disasters. More thoughtful and effective disaster policies are needed because the future will bring many more weather disasters like Hurricane Harvey, with larger impacts than those of the recent past.

For many years, those seeking to justify carbon restrictions argued that hurricanes had become more common and intense. That hasn’t happened. Scientific assessments, including those of the Intergovernmental Panel on Climate Change and the U.S. government’s latest National Climate Assessment, indicate no long-term increases in the frequency or strength of hurricanes in the U.S. Neither has there been an increase in floods, droughts and tornadoes, though heat waves (Icecap Note: not really) and heavy precipitation have become more common.

Prior to Harvey, which made landfall as a Category 4 storm, the U.S. had gone a remarkable 12 years without being hit by a hurricane of Category 3 strength or stronger. Since 1970 the U.S. has only seen four hurricanes of Category 4 or 5 strength. In the previous 47 years, the country was struck by 14 such storms. President Obama presided over the lowest rate of hurricane landfalls -0.5 a year - of any president since at least 1900. Eight presidents dealt with more than two a year, but George W. Bush (18 storms) is the only one to have done so since Lyndon B. Johnson. The rest occurred before 1960.
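The landfall rates cited above reduce to simple division. As a hedged sketch (the helper name is mine; the four-storm count for the Obama years is implied by the quoted 0.5-per-year rate over two terms, not stated directly in the op-ed):

```python
# Hurricane landfalls per year for a presidency, from the op-ed's figures.
def landfalls_per_year(storm_count: int, years_in_office: int) -> float:
    """Average U.S. hurricane landfalls per year over a presidency."""
    return storm_count / years_in_office

print(landfalls_per_year(18, 8))  # 2.25, George W. Bush: "more than two a year"
print(landfalls_per_year(4, 8))   # 0.5, the rate cited for the Obama years
```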

Without data to support their wilder claims, climate partisans have now resorted to shouting that every extreme weather event was somehow “made worse” by the emission of greenhouse gases. Earlier this week, New York Times columnist David Leonhardt directed researchers “to shed some of the fussy over-precision about the relationship between climate change and weather.”


Wall Street Journal

And some bad news for the alarmists…

Gulf of Mexico operators returning to work after Harvey


Offshore staff

NEW ORLEANS - About 9% of oil production and 13% of natural gas production remains shut-in in the Gulf of Mexico, according to the Bureau of Safety and Environmental Enforcement.


BSEE added that no damage reports from oil and gas operators have been received.


As offshore oil and gas operations return to normal, the industry continues to provide assistance for the onshore Hurricane Harvey relief efforts.

Hess Corp. has donated $1 million to the Hurricane Harvey Relief Fund. The company said that it will match every donation employees make in the coming weeks to relief efforts by the Hurricane Harvey Relief Fund, American Red Cross, and United Way of Houston.

Transocean says that it has contributed $100,000 to the American Red Cross and $100,000 to the Houston Food Bank. The company says that it will also match donations made to the relief efforts by its employees.

Statoil announced via social media that it has donated $250,000 to the Red Cross.

Weatherford International plc says that it has pledged $25,000 to Feeding Texas, the Texas Food Bank Network, and $25,000 to J.J. Watt’s Hurricane Harvey Relief Fund.

ExxonMobil says that it has increased its financial commitment for Harvey relief to up to $9.5 million, which includes a new employee and retiree donation match program and in-kind donations to the American Red Cross for recovery efforts in South Texas. The increased support builds on $1 million in previous contributions to the American Red Cross and United Way of Greater Houston.

Offshore Magazine

Posted on 09/04 at 04:52 PM
(1) TrackbacksPermalink

Wednesday, August 30, 2017
Harvey makes second landfall and heads northeast

By Joseph D’Aleo, WeatherBELL Analytics, LLC

TS Harvey’s center crossed the coast just west of Cameron, Louisiana, with most of the associated deep convection located over extreme southeastern Texas and western Louisiana early this morning.


NHC reports at 4 am CDT: Although the rain has ended in the Houston/Galveston area, the Beaumont/Port Arthur area was particularly hard hit overnight, with about 12.5 inches reported at the Jack Brooks Regional Airport since 7 pm CDT.

See the large area of 20” rains in southeast Texas to southwest Louisiana.


One gauge reported 51.88 inches. Harris County maintains a network of 156 rainfall and stream gauges. Most of the amounts in the city were in the 30 to low-40 inch range. The heaviest was in the southeast.


39 gauges were in major flood, and 11 more in moderate flood.


As Dr. Spencer pointed out, there have been many flood disasters in the Houston area, even dating to the mid-1800s when the population was very low. In December 1935 a massive flood occurred in the downtown area as the water level measured at Buffalo Bayou in Houston topped out at 54.4 feet.


The Buffalo Bayou gauge topped out with Harvey at around 39 feet on the 28th, dropped a bit, recovered to 37 feet, and has since been receding.


The three stations that exceeded the 48-inch record set by Amelia in 1978 put Harvey at the top of the list of tropical rainmakers in the lower 48 states, as WeatherBELL predicted several days ago. Six of the 10 storms on the list were in Texas, where storms often get trapped in summer or fall.


Here was Amelia in 1978, with its 48-inch 7-day total at Medina. The gauge density then was obviously lower, and we can’t know with certainty whether that storm, or others on the list, actually produced more.


See the storm when it came ashore as a CAT4.


It was the first landfalling hurricane, and major hurricane, of this decade in Texas. The last major was Bret in 1999. Rita and Ike came close in 2005 and 2008.


It was tied for 14th place by pressure (Klotzbach)


With Harvey we have had 7 landfalling hurricanes this decade. We would need 8 more over the rest of this season and in 2018 and 2019 to keep this decade from being the quietest on record.


Posted on 08/30 at 07:53 AM
(2) TrackbacksPermalink

Saturday, August 12, 2017
Why Climate Alarmist Reports Should Be Ignored Where They Use Bad Methodology and Data

Looks like the public agrees with Alan.

Scott Rasmussen’s Number of the Day
By Scott Rasmussen

August 21, 2017: Twenty-eight percent (28%) of Americans think that climate scientists understand the causes of global climate change “very well.” A Pew Research study found that only 19% believe that the climate scientists have a very good understanding of the best ways to address the issue.

In general, the study found that Americans trust climate scientists more than politicians on the topic. Two-thirds (67%) believe scientists should play a major role in addressing policy issues on the matter. Most (56%) also believe that energy industry leaders and the general public should each have a major say in such policy topics.

The Pew study, however, also found that people believe there are differences of opinion among the climate scientists. Only 27% believe that there is a consensus on the issue and that just about all climate scientists believe human behavior is mostly responsible for global climate change. Another 35% think more than half hold this view.

The survey also explored the degree of trust and confidence in those researching climate science. Thirty-six percent (36%) believe that, most of the time, scientists’ research findings are motivated by a desire to advance their own careers. Only 32% say that they mostly rely on the best scientific evidence. Twenty-seven percent (27%) believe that political views of the scientists generally influence their work.

Liberal Democrats tend to express high levels of confidence in the climate scientists and their motives. Conservative Republicans are often quite skeptical. Most other Americans have mixed views.


Alan Carlin August 9, 2017

Like other liberal news outlets, the New York Times has been busy printing unapproved internal Trump Administration material this year. On August 8, 2017 it printed a Draft Report that is part of a new National Climate Assessment. The report was prepared primarily during the Obama Administration by a Federal inter-agency group and is still residing on an outside server from an earlier public comment period. Its authors concluded, among other things, that “Many lines of evidence demonstrate that human activities, especially emissions of greenhouse (heat-trapping) gases, are primarily responsible for recent observed climate change.”

The problem is not that the draft was unavailable before; it is that the viewpoints expressed are neither new nor useful, representing a rather tired repetition of the usual climate alarmist ideology with only occasional updates. This is unfortunate since it is becoming ever clearer that the ideology has become scientifically indefensible and needs to be abandoned in favor of a new approach to climate science.

Perhaps the Most Basic Problem

Perhaps the most basic problem with this Draft Report, like most of the major Climate Industrial Complex (CIC) reports, is that it depends for its justification primarily on the IPCC’s bottom-up global climate models (as discussed in Section 4.3 of the Draft Report). The Draft Report shows that the climate alarmists have by no means given up their horrifically expensive and misguided crusade to reduce carbon dioxide (CO2) emissions, even though their very extensive attempt to justify it is hopeless.

Their conclusion is not only that global warming is primarily due to human activity, but also that temperatures will increase significantly because of increases in anthropogenic atmospheric CO2. Their basic methodology is based on the UN Intergovernmental Panel on Climate Change’s (IPCC’s) analyses conducted over many years. The Heartland Institute has gone to great effort to point out many of the problems and inconsistencies in the conclusions reached using these models. But it is increasingly clear why the IPCC has been having a hard time explaining the increasing divergence between their models and actual temperatures. One of the basic problems is that alarmists have always used a bottom-up approach in their methodology (which is to aggregate the results for individual geographic areas based on the application of subjective physical relationships between various physical effects). This approach cannot produce valid results no matter how much is spent on it, how often it is repeated, or how large the climate models used become. As Mike Jonas has recently written:

In this very uncertain world of climate, one thing is just about certain: No bottom-up computer model will ever be able to predict climate. We learned above [in the article this was excerpted from] that there isn’t enough computer power now even to model GCRs [galactic cosmic rays], let alone all the other climate factors. But the issue of computer model ability goes way beyond that. In a complex non-linear system like climate, there are squillions of situations where the outcome is indeterminate. That’s because the same influence can give very different results in slightly different conditions. Because we can never predict the conditions accurately enough - in fact we can’t even know what all the conditions are right now - our bottom-up climate models can never ever predict the future. And the climate models that provide guidance to governments are all bottom-up.

The bottom-up GCM was a bad approach from the start and should never have been paid for by the taxpayers. All that we have are computer models that were designed and then tuned to lead to the IPCC’s desired answers and have had a difficult time even doing that.

So not only are the results claiming that global temperatures are largely determined by atmospheric CO2 wrong, but the basic methodology is useless. Climate is a coupled, non-linear chaotic system, and the IPCC agrees that this is the case. It cannot be usefully modeled by using necessarily limited models which assume the opposite.

An Entirely New Approach Is Needed

Despite repeated claims by climate alarmists that climate science is settled, nothing could be further from the case. In fact, an entirely new approach is needed if much progress is to be made in characterizing and understanding the climate system. This approach must be a top-down rather than a bottom-up approach. To my knowledge, only one such study (and earlier versions thereof) exists taking this approach, which I will call the 2017 WCD report after the authors’ last names. And it appears to give plausible results. It says that CO2 does not have a significant effect on global temperatures and that global temperatures can be fully explained since about 1960 by entirely natural factors and do not require any human activity to explain what has occurred. This rules out many if not most of the Draft Report’s conclusions.

A second very recent report including two of the same authors as WCD 2017 concludes that the keepers of the official global surface temperature records have repeatedly “adjusted” them to the point that they are no longer representative of the underlying data. Accordingly, the authors argue that the data used in the Draft Report from surface temperature sources and the conclusions reached from using this data are too unreliable for policy use.

The Time Has Come to Abandon the IPCC’s Bottom-up Approach and Correct the Basic Data Used Before Further Expenditures Are Made

It is time to totally abandon the IPCC’s bottom-up climate models as an ultra expensive sunk cost and start over. The 2017 WCD report would be a good place to start in redoing the basic climate analyses. Until this is done, little progress is possible in many of the major issues in climate science, and no further expenditures should be made responding to climate alarmism until the new methodology has been thoroughly tested and the basic surface temperature data has been reconstituted in a useful form. The mistaken choice of methodology has ended up costing taxpayers tens of billions in research costs and has reportedly resulted in about $1.5 trillion per year for renewable and related construction, which needs to be written off too.

I recommend that the Trump Administration issue the Draft Report with an added section explaining how useless and biased the rest of the Draft Report is because it primarily relies on meaningless model results and unreliable surface temperature data. If such a combined report were issued it would be one of the first government reports anywhere to seriously question the IPCC’s results, and has long been needed. Scientific hypotheses and data that have never been rigorously tested are not fit to be used for public policy purposes, and particularly for those involving multi-trillion expenditures per year.

Posted on 08/12 at 03:20 PM
(1) TrackbacksPermalink

Committee for a Constructive Tomorrow (CFACT)

AMSU Global Daily Temps

Anthony Watts Surface Station Photographs

John Coleman’s Corner

Ross McKitrick Google Home Page

Global Warming Hoax

Energy Tribune

The Inhofe EPW Press Blog

Bald-Faced Truth

World Climate Report

Hall of Record

Science Bits

The New Zealand Climate Science Coalition


Climate Skeptic

Climate Debate Daily

The Climate Scam

Watts Up with That?

The Cornwall Alliance

Raptor Education Foundation

The Week That Was by Fred Singer

Where is Global Warming (Bruce Hall Collection)

I Love My Carbon Dioxide


Scientific Alliance

Climate Resistance

Blue Hill Observatory, Milton MA

James Spann’s Blog

Dr. Dewpoint on Intellicast

Dr. Roy Spencer

Climate Depot

Analysis Online


MPU Blog

CO2 Sceptics

John McLean’s Global Warming Issues

Digging in the Clay

The Weather Wiz

Reid Bryson’s Archaeoclimatology

Redneck USA

Wisconsin Energy Cooperative

Global Warming Skeptics

Web Commentary

APPINYS Global Warming

Global Warming Scare

Climate Change Fraud

Roy Spencer’s Nature’s Thermostat

Middlebury Community Network on The Great Global Warming Hoax

Blue Crab Boulevard

Tom Nelson Blogroll

Bob Carter’s Website

The Niyogi Lab at Purdue

Musings of the Chiefio

TWTW Newsletters

COAPS Climate Study US

Gore Lied

Climate Audit

Carbonated Climate

John Daly’s What the Stations Say

Gary Sharp’s It’s All About Time

Joanne Nova- The Skeptic’s Handbook

Science and Environmental Policy Project

Science and Public Policy Institute

Climate Science: Roger Pielke Sr. Research Group Weblog

Climate Police

Warwick Hughes

Marshall Institute Climate Change

Earth Changes

Bill Meck’s Blog

CO2 Science

Finland Lustia Dendrochronology Project

Craig James’ Blog

Weatherbell Analytics

Greenie Watch

Vaclav Klaus, Czech Republic President

Climate Cycle Changes

Intellicast Dr. Dewpoint

Demand Debate

Ice Age Now

Carbon Folly

Art Horn’s “The Art of Weather”

Tom Skilling’s Blog


Right Side News

Warmal Globing

The Heartland Institute

Climate Research News


Tropical Cyclone Blog of Ryan Maue COAPS

The Reference Frame - Lubos Motl’s weblog

Junk Science

The Resilient Earth

Accuweather Global Warming

Metsul’s Meteorologia