The Blogosphere
Monday, November 06, 2017
Early rebuttals of the scientifically flawed NCA

As one might expect from a UN-inspired report, especially one with such a weak team of Lead Authors, this one will set a new record among these reports for bad science. The key findings often do not reflect the material contained within, much as was the case with the UN reports and prior government-sponsored reports.

There are a few early responses.

2014 National Climate Assessment, a Self-Falsifying Prophecy

Guest post by David Middleton on WUWT

There has been some recent “buzz” about the upcoming Fourth National Climate Assessment (NC4), including some moonbat conspiracy theories that the Trump administration will try to suppress or otherwise interfere with the scientific integrity of the report.  The New York Times has already been forced to essentially retract such a claim in a recent article.

If NC4 actually builds upon 2014’s NC3, EPA Administrator Pruitt’s Red Team will have even more material to work with.

Fourth National Climate Assessment

Development of the Fourth National Climate Assessment (NCA4) is currently underway, with anticipated delivery in late 2018. Below you will find information related to NCA4, including a list of chapters, explanation of author roles, and opportunities to participate in the process.

What’s New in NCA4

NCA4 will build upon the successes of the Third National Climate Assessment. Find out more:


“NCA4 will build upon the successes of the Third National Climate Assessment”… What success?

Here’s a link to the NCA3 overview.

The first “sciencey” graphic is titled:  Projected Global Temperature Change.

image

And then I just enlarged the Epic Failure bits to get the Red Team’s QED:

image

----------------

Then there is Tony Heller’s attack on the temperature claims.

Very High Confidence Of Fraud In The National Climate Assessment

Katharine Hayhoe and her partners in crime have officially released their National Climate Assessment, which includes a graph claiming that “Record Warm Daily Temperatures Are Occurring More Often”.

image

The first thing I noticed is that the text in the report does not match the graph. They say:

The Dust Bowl era of the 1930s remains the peak period for extreme heat in the United States

Yet the graph right below it does not show the 1930s as being hot.

image

Unfortunately for Katharine and her band of climate fraudsters, I have software which does this calculation. The graph below is the correct version. Their graph is more or less correct after 1970 - but the pre-1970 data is completely fraudulent. They removed all of the hot weather from 1930 to 1954.  NOAA does not make adjustments to daily temperatures, so they can’t use that excuse.
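For readers who want to check this kind of claim themselves, here is a minimal sketch of how per-year record counts can be tallied from daily station data. This is not Heller’s actual software; the data layout is hypothetical, though GHCN-Daily is one real source of such observations.

# Sketch: count record-setting daily maximum temperatures per year.
# Hypothetical input: obs_by_calday maps (station, month, day) to a
# list of (year, tmax) observations, e.g. parsed from GHCN-Daily files.
from collections import defaultdict

def count_daily_records(obs_by_calday):
    # For each calendar day at each station, walk the years in order and
    # count a "record" whenever tmax exceeds every earlier year's value.
    records_per_year = defaultdict(int)
    for series in obs_by_calday.values():
        running_max = float("-inf")
        for year, tmax in sorted(series):
            if tmax > running_max:
                records_per_year[year] += 1
                running_max = tmax
    return dict(records_per_year)

obs = {("STN1", 7, 4): [(1936, 108), (1954, 106), (2012, 109)]}
print(count_daily_records(obs))  # {1936: 1, 2012: 1}

Note that the first year of any series always counts as a “record,” so an analysis like this needs a spin-up period, or a comparison of record highs against record lows, before the counts mean anything.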

image

The report claims:

Record Warm Daily Temperatures Are Occurring More Often

That is an outright lie.  Record warm daily maximum temperatures have decreased sharply since 1930 - the start date of their graph.

image

The number of record daily minimums has also decreased.

image

The US climate is getting milder, with fewer very hot or very cold days.

image

So why the big spikes in 2012 and 2016?

image

This is a classic divide-by-zero error. Ratios become unstable when the denominator becomes small, and the numbers are meaningless. No serious scientist would release a wildly flawed and dishonest graph like this, yet Katharine Hayhoe does so on a consistent basis.
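A minimal numeric illustration of that instability, using made-up counts of record highs and record lows rather than anything from the report:

# Sketch: why a highs-to-lows ratio blows up as the denominator shrinks.
for highs, lows in [(500, 250), (480, 60), (510, 5), (495, 1)]:
    print(f"highs={highs} lows={lows} ratio={highs / lows:.1f}")
# highs=500 lows=250 ratio=2.0
# highs=480 lows=60 ratio=8.0
# highs=510 lows=5 ratio=102.0
# highs=495 lows=1 ratio=495.0

Small changes in an already-small denominator swing the ratio wildly, which is why a few unusual years can produce huge spikes.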

See Tony’s video:

See a more detailed assessment by Judith Curry.

Posted on 11/06 at 08:21 PM


Monday, October 09, 2017
Misuse of the scientific method has led to peer review failures with significant implications

By Joseph D’Aleo, CCM, AMS Fellow

See this excellent video from Tony Heller challenging the climate mafia and their continuing adjustment of data. The scientific method is not being applied in the work of government agencies and their cohorts like RSS.

THE SCIENTIFIC METHOD

The scientific method is a well-established iterative process. It starts with a theory or hypothesis. The data needed to test it, and all possible factors involved, are identified and gathered. The data is processed and the results are rigorously tested. The data and methods are made available for independent replication. Reviewers of the proposed theory must have the requisite skills in the topic and in the proper statistical analysis of the data to judge its validity. If the theory passes the tests and replication efforts, a conclusion is made and it may be turned into a paper for publication. If it fails the tests, the hypothesis or theory must be rethought or modified.
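The paragraph above describes a loop, and it can be pictured schematically. In this sketch every function is a toy stub standing in for real scientific work, not part of any actual framework:

# Sketch: the iterative scientific method described above.
def gather_data(h): return {"observations": []}   # identify and gather all relevant factors
def test_rigorously(h, d): return h == "revised"  # process data and rigorously test
def replicated_independently(h, d): return True   # others redo it from shared data/methods
def rethink_or_modify(h): return "revised"        # failed tests force a rethink

def scientific_method(hypothesis):
    while True:
        data = gather_data(hypothesis)
        if test_rigorously(hypothesis, data) and replicated_independently(hypothesis, data):
            return f"publish: {hypothesis}"       # conclusion may become a paper
        hypothesis = rethink_or_modify(hypothesis)

print(scientific_method("initial"))  # fails once, is revised, then passes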

Astronomer Carl Sagan, Professor and Director of Cornell University’s Laboratory for Planetary Studies and host of the series Cosmos: A Personal Voyage, explained the scientific method in his 1995 book The Demon-Haunted World: Science as a Candle in the Dark and encouraged critical and skeptical thinking. He emphasized the importance of recognizing the difference between what is considered valid science and what is in reality pseudoscience.

Sagan, like fellow Cornell physicist and lecturer Richard Feynman, argued that when new ideas are offered for consideration, they should be tested by means of skeptical thinking and should stand up to rigorous questioning. Feynman lectured:

“If a theory or proposed law disagrees with experiment (or observation), it’s wrong. In that simple statement is the key to science. It doesn’t make any difference how beautiful your guess is, it doesn’t matter how smart you are, who made the guess, or what your name is.... If it disagrees with experiment, it’s wrong. That’s all there is to it.”

Sir Karl Popper, an Austrian-British philosopher and professor, is generally regarded as one of the greatest philosophers of science of the 20th century. Popper is known for his rejection of the classical inductivist view of the scientific method in favor of empirical falsification: a theory in the empirical sciences can never be proven, but it can be falsified, meaning that it can and should be scrutinized by decisive experiments.

image

It should be noted that a refutation of a previously accepted theory, even one that has been published and widely accepted, can follow the same route to review and publication, as Albert Einstein observed:

image

The peer review process is failing due to political and economic pressures that have altered the scientific method to virtually ensure a politically correct or economically fruitful theory can never fail.

When the tests fail, instead of rethinking the theory or including other factors, there is an alarming tendency to modify input data to more closely fit the theory or models.

image

Also, the authors and reviewers often do not have a proper understanding of all the factors involved, nor the mathematical skills needed to properly evaluate the results. And even if they do, the input data and methods are generally not made available to the reviewers for replication. In many cases, forecasts are made for decades or even centuries into the future, so true validation is not possible, a luxury those of us who must forecast in shorter time frames (days to seasons) do not enjoy.

Too often, the reviewers who then serve as final gatekeepers are not only incapable of this kind of rigorous review, they are also biased: they speed politically correct or economically beneficial work to publication while blocking, or at least ‘slow walking’, work that challenges the so-called consensus science or their own, often ideologically driven, beliefs.

As Dr. Michael Crichton wrote “Let’s be clear: the work of science has nothing whatever to do with consensus. Consensus is the business of politics. In science, consensus is irrelevant. What is relevant is reproducible results. The greatest scientists in history are great precisely because they broke with the consensus. (Galileo, Newton, Einstein, etc)”.

SCIENTIFIC METHOD FAILURE IN THE CLIMATE SCIENCES

So when greenhouse climate models fail, they don’t revisit the theory but instead try to find the right data to fit the model. All data today is adjusted with models, with the stated goals of addressing data errors, changes in location or instrumentation, changing station distribution, and filling in for missing data or station closures. Once this adjustment process starts, it becomes increasingly possible to find ways to mine the desired results from the data.

The climate models have diverged increasingly from balloon, satellite and surface reanalysis data sets over the last 20 years. The one model that tracks the observed temperatures is a Russian model that has roughly half the greenhouse forcing and improved ocean modeling.

image

John Christy (2017) has shown that models without greenhouse warming agreed closely with atmospheric (tropical) observations.

image

This kind of refutation should, if scientists abided by the scientific method, spark an effort to revisit the theory, but that is too politically incorrect. This kind of ideologically, politically or economically driven thinking is pervasive across the sciences (atmospheric and medical alike).

EVIDENCE THAT TRADITIONAL PEER REVIEW IS FAILING

There is increasing proof that the traditional journal peer review process is broken. This is true across both the medical and scientific fields.

See this example of one such falsified report, which the author worries is part of an epidemic of agenda-driven science by press release and falsification that has reached crisis proportions.

Other reports show an alarming number of papers having to be retracted.  Springer is retracting 107 papers from one journal after discovering they had been accepted with fake peer reviews (here).

Result-oriented corruption of peer review in climate science was proven by the Climategate emails.

In the journals, a small set of gatekeepers blocks anything that goes against the editorial biases of those journals. Conversely, these journals and their reviewers do not provide a thorough due-diligence review of work they tend to agree with ideologically. They are engaged in orthodoxy enforcement.

In an essay, “Has Science Lost Its Way?”, Michael Guillen, Ph.D., wrote about science’s reproducibility crisis.

For any study to have legitimacy, it must be replicated, yet only half of medical studies celebrated in newspapers hold water under serious follow-up scrutiny - and about two-thirds of the “sexiest” cutting-edge reports, including the discovery of new genes linked to obesity or mental illness, are later “disconfirmed.” Though erring is a key part of the scientific process, this level of failure slows scientific progress, wastes time and resources and costs taxpayers in excess of $28 billion a year, writes NPR science correspondent Richard Harris.

The single greatest threat to science right now comes from within its own ranks. Last year Nature, the prestigious international science journal, published a study revealing that “More than 70% of researchers have tried and failed to reproduce another scientist’s experiments, and more than half have failed to reproduce their own experiments.”

The inability to confirm research that was published in highly respected, peer-reviewed journals suggests something is very wrong with how science is being done.

The crisis afflicts even science’s most revered ‘facts,’ as cancer researchers C. G. Begley and Lee Ellis discovered. Over an entire decade they put fifty-three published “landmark” studies to the test; they succeeded in replicating only six - that’s an 11% success rate.

A major culprit, they discovered, is that many researchers cherry-picked the results of their experiments - subconsciously or intentionally - to give the appearance of success, thereby increasing their chances of being published.

“They presented specific experiments that supported their underlying hypothesis, but that were not reflective of the entire data set,” report Begley and Ellis, adding this shocking truth: “There are no guidelines that require all data sets to be reported in a paper; often, original data are removed during the peer review and publication process.”

Another apparent culprit is that - and it’s going to surprise most of you - too many scientists are actually never taught the scientific method. As graduate students, they take oodles of courses in their chosen specialty; but their thesis advisors never sit them down and indoctrinate them on best practices. Consequently, remarks University of Wisconsin-Madison biologist Judith Kimble: “They will go off and make it worse.”

This observation seems borne out by the Nature study, whose respondents said the three top weaknesses behind science’s reproducibility crisis are: 1) selective reporting, 2) pressure to publish, and 3) low statistical power or poor analysis. In other words, scientists need to improve on practicing what they preach, which is: 1) a respect for facts - all of them, not just the ones they like, 2) integrity, and 3) a sound scientific method.

The attendees of the so-called ‘Earth Day’ March for Science made a lot of noise about wanting more money and respect from the public and government - what group wouldn’t want that? But nary a whisper was heard from them or the media about science’s urgent reproducibility crisis. Leaving unspoken this elephant-sized question: If we aren’t able to trust the published results of science, then what right does it have to demand more money and respect, before making noticeable strides toward better reproducibility?

Michael Guillen, Ph.D., former Science Editor for ABC News, taught physics at Harvard and is the author of “The Null Prophecy”.

FEEDBACK ON OUR RESEARCH REPORTS

Although well received and widely distributed, our recent press release and research paper hit a raw nerve with alarmists. The research sought to validate the current estimates of Global Average Surface Temperatures (GAST) using the best available relevant data. The conclusive findings were that the three GAST data sets are not a valid representation of reality. In fact, the magnitude of their historical data adjustments, which removed their cyclical temperature patterns, is totally inconsistent with published and credible U.S. and other temperature data.

Thus, despite current claims of record setting warming, it is impossible to conclude from the NOAA, NASA and UK Hadley CRU GAST data sets that recent years have been the warmest ever.

Finally, since GAST data set validity is a necessary condition for EPA’s CO2 Endangerment Finding, it too is invalidated by these research findings. This means that EPA’s 2009 claim that CO2 is a pollutant has been decisively invalidated by this research.

We showed in prior research reports here and here how, even if you ignore the adjustments, the observed changes can be explained entirely by natural factors (ocean cycles, solar cycles and volcanism). If one considers the urban heat island contamination of surface data, temperatures may actually have been declining since the 1930s in cyclical fashion, very much in line with the record-high counts.

The media fact checkers, which often serve as enforcers of orthodoxy, could not meaningfully question the data or science presented, but challenged the claim that the work was ‘peer reviewed’ (in the sense that the peer review process has been defined today by the ‘advocacy’ journals, really ‘pal review’).

Our research reports were rigorously peer reviewed by top scientists. The reports follow the approach long used in industry, often for internal use. They were prepared by author teams with the requisite skills in proper data collection, a deep understanding of the scientific factors involved, and the statistical skills to evaluate what best explains the observed changes.

To abide by the scientific method, the work must be capable of being replicated. The highly qualified reviewers who endorsed it are capable of evaluating the work scientifically and statistically; their approval includes a willingness, even an eagerness, to endorse the work. The data and the methodology are available for others to replicate.

Our approach follows the long accepted application of the scientific method in a world where science is too politicized.


Posted on 10/09 at 07:30 AM


Monday, October 02, 2017
Chief science adviser attacks academic ‘arrogance’ on policy

Times Higher Education

The chief science adviser to the prime minister of New Zealand has accused scientists of displaying “hubris” and “arrogance” when they comment on government policy.

image

Sir Peter Gluckman, who also chairs the International Network for Science Advice to Governments, leveled a series of sharp criticisms at researchers and science organizations during an event in Brussels that debated the role of policy and evidence in a “post-fact” world.


He argued that scientists needed to appreciate that politicians made their decisions based on values as well as scientific evidence.

“Individual scientists, professional and scientific organizations too often exhibit hubris in reflecting on policy implications of science,” Sir Peter told delegates at “EU for facts: evidence for policy in a post-fact world”, held on 26 September.

“This arrogance can become the biggest enemy of science effectively engaging with policy - the policy decisions inevitably involve dimensions beyond science.”

Scientists needed to appreciate that political ideology, financial and diplomatic constraints, and “electoral contracts” also had to be taken into account by politicians, Sir Peter said. “It is important that [scientific] knowledge is provided [to policymakers] in a way that does not usurp the ability of policy process to consider these broader dimensions: otherwise trust in advice can be lost as it becomes perceived as advocacy,” he argued.

He also said that he avoided using the “somewhat arrogant” term “evidence-based policy”, preferring “evidence-informed” instead. Meanwhile, “too often academy reports are focused on academic demonstration rather than meeting policy needs or answering an unasked question”, he added.

Similar warnings have come from other figures in science. Last year, Jeremy Berg, the editor-in-chief of Science, said that academics have too often ventured into giving policy prescriptions rather than just explaining the evidence, for example in the area of climate change.

Although he named no names, Sir Peter also warned that “individual scientists” were now using their “scientific standing” to make claims “well beyond the evidence and their expertise”. Universities may also “over-hype” their science, he added.

In addition, the pressures of “performance measurement, bibliometrics, and the quest for societal and industrial impact” also have the potential to undermine public trust in science, he said, “due to perceived or actual conflicts of interest and the potential to affect the behaviour of individual scientists”.

At the same conference, Carlos Moedas, European commissioner for research, science and innovation, argued that to combat a “crisis of confidence” in science, there needed to be online “places of trust for scientific advice”, just as sites like Mayo Clinic or WebMD were trusted sources of medical advice.

Such sites would be “where citizens know that science is genuine. Where the process is explained. Where they can check the sources. Where they can access the data themselves,” he said.

“So I believe in the future there will be two types of internet. The one you trust and the one you don’t,” he added.

--------

Prager University has excellent short videos on the topic:

Posted on 10/02 at 03:33 PM


Sunday, September 24, 2017
New York Times and Arctic Ice

The New York Times did a story on Arctic ice this past week.

See in this video how Tony Heller responds to the story.

See in this Icecap story how the changes are cyclical, as Tony shows with data and news accounts, including from the New York Times.

We showed how water from the Atlantic and Pacific enters the Arctic underneath the floating ice. When the Atlantic and Pacific are in their warm modes, this leads to thinning ice and reduced summer coverage.

image

The UAF IARC showed how warm water in the Atlantic warms the Arctic and reduces ice.

image

We showed how the combined warmth and coolness of the Atlantic and Pacific correspond with Arctic temperatures.

image

In the record-setting (since satellite monitoring began in 1979) summer melt season of 2007, NSIDC scientists noted the importance of both oceans to Arctic ice.

“One prominent researcher, Igor Polyakov at the University of Alaska Fairbanks, points out that pulses of unusually warm water have been entering the Arctic Ocean from the Atlantic, which several years later are seen in the ocean north of Siberia. These pulses of water are helping to heat the upper Arctic Ocean, contributing to summer ice melt and helping to reduce winter ice growth.

Another scientist, Koji Shimada of the Japan Agency for Marine-Earth Science and Technology, reports evidence of changes in ocean circulation in the Pacific side of the Arctic Ocean. Through a complex interaction with declining sea ice, warm water entering the Arctic Ocean through Bering Strait in summer is being shunted from the Alaskan coast into the Arctic Ocean, where it fosters further ice loss.

Many questions still remain to be answered, but these changes in ocean circulation may be important keys for understanding the observed loss of Arctic sea ice.”

Dr. Willie Soon shows a much better correlation between solar TSI and Arctic temperatures than with CO2.

image

Posted on 09/24 at 01:49 PM


Monday, September 18, 2017
Don’t believe the hurricane hype machine

By Michael Sununu, Union Leader

LET US SET hyperbole aside. Climate change is NOT making hurricanes stronger or more frequent.

Statements to the contrary by climate scientists like Michael Mann and Tom Peterson are not based on objective analysis of historical data. Most of the claims by the alarmists are based on hypothetical model projections, an ill-informed understanding of weather, and a desire to ignore the historical precedents which indicate the current environment is no different than the past.

Alarmists claim flooding caused by Harvey was the worst ever. In fact, those areas of coastal Texas have seen similar flooding in the past. In 1978, tropical storm Amelia dumped 48 inches of rain on Texas, and the following year, Claudette inundated the state with 54 inches, with one location in Alvin, Texas, receiving 43 inches in a 24-hour period (a record).

These two storms occurred during the late 1970s’ cooler climate. In 1954, Hurricane Alice dropped huge amounts of rain in the Rio Grande Valley. In 1935, Harris County saw its worst floods, with waters reaching the second and third stories in Houston. Harvey was not unprecedented.

The forces behind these huge rain totals are local weather patterns that stall storm systems over the state. WeatherBell’s chief forecaster, Joe Bastardi, pointed out that a major trough extended south and trapped Harvey just onshore. This allowed the storm to suck up warm moisture from the Gulf and continue to drop it in the same area over several days. Thirty- to forty-inch rain totals are not a common occurrence, but they do occur every decade or so in Texas.

The second claim is that the U.S. is seeing stronger hurricanes than in the past, because climate change is raising hurricane intensity. Nothing could be further from the truth. Looking at the historical records, Roger Pielke Jr. has noted that in the 44-year period between 1926 and 1969, fourteen Category 4 or 5 hurricanes made landfall in the United States. Over the next 47 years, between 1970 and 2017, just four hit the United States, and that includes Harvey this year. We are seeing fewer major hurricanes hit the U.S., not more.
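Era counts like these are easy to reproduce once you have a landfall list. A minimal sketch follows; the sample entries are placeholders for illustration, not the actual NOAA/HURDAT landfall record:

# Sketch: count Category 4-5 U.S. landfalls inside a year window.
# Illustrative entries only; a real analysis would load NOAA's
# HURDAT-based continental U.S. landfall records.
landfalls = [(1926, 4), (1935, 5), (1969, 5), (1992, 5), (2004, 4), (2017, 4)]

def count_major(storms, start, end, min_cat=4):
    return sum(1 for year, cat in storms if start <= year <= end and cat >= min_cat)

print(count_major(landfalls, 1926, 1969))  # storms in the earlier window
print(count_major(landfalls, 1970, 2017))  # storms in the later window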

Others have noted that Irma was a very intense hurricane reaching very low pressures, and this is true. Irma reached as low as 914mb and wind speeds of 185 mph. Hurricane Wilma in 2005 holds the record for the lowest pressure in the Atlantic basin at 882mb. When you look at historical storms, Wilma was comparable to the Great 1935 Labor Day hurricane, which made landfall with a pressure of 892mb, the only storm to make U.S. landfall below 900mb. Since there were no Hurricane Hunter aircraft in that era, it is quite possible that the 1935 storm had pressures even lower than Wilma.

There is evidence that the Great Hurricane of 1780, which destroyed much of the eastern Caribbean, may have had windspeeds in excess of 200 mph. It destroyed every house and fort on Barbados, stripped the bark off trees, and killed over 22,000 people in the region.

Clearly, Mother Nature has always been very capable of generating storms far more intense than what we have seen recently.

Climate alarmists don’t want to talk about the uncomfortable fact we just went through an historic 12-year period without a major hurricane (Category 3 or higher) making landfall in the United States. We were blessed for more than a decade without a devastating hurricane wreaking havoc on our shores, so having two hurricanes making landfall this year, while unfortunate, should not be unexpected.

image

image

What is disappointing, unnecessary, and counterproductive is the attempt to somehow link carbon dioxide emissions to hurricane activity. Both the IPCC and NOAA have made clear statements that there is no scientific evidence that links CO2 levels with extreme weather, including tropical systems. Any linkage is purely hypothetical, and is more likely a political rather than a scientific determination.

In the real world, fossil fuels are what make our communities resilient. They provide us the concrete to reinforce our homes, the fuel to help move us away from danger, the materials to preserve and rebuild our infrastructure, and the electricity to bring back our communities when the storm passes. They power the rescue boats and aircraft, help deliver the food and water to stricken communities, and power the chainsaws that allow us to clear the debris.

Carbon dioxide is not powering these storms, but it does make our lives better both before they hit, and after they leave disaster in their wake.

Michael Sununu is a consultant with Sununu Enterprises LLC and lives in Newfields.

Posted on 09/18 at 06:41 AM


Sunday, September 17, 2017
Finally, some commonsense western fire policies

New DOI and DOA policy to cut overgrown, diseased, dead and burned trees is long overdue

Paul Driessen

President Trump promised to bring fresh ideas and policies to Washington. Now Interior Secretary Ryan Zinke and Agriculture Secretary Sonny Perdue are doing exactly that in a critically important area: forest management and conflagration prevention. Their actions are informed, courageous and long overdue.

Westerners are delighted, and I’ve advocated such reforms since my days on Capitol Hill in the 1980s.

As of September 12, amid this typically long, hot, dry summer out West, 62 major forest fires are burning in nine states, the National Interagency Fire Center reports. The Interior Department and Ag Department’s Forest Service have already spent over $2 billion fighting them. That’s about what they spent in all of 2015, previously the most costly wildfire season ever, and this season has another month or more to go. The states themselves have spent hundreds of millions more battling these conflagrations.

Millions of acres of forest have disappeared in smoke and flames - 1.1 million in Montana alone. All told, acreage larger than New Jersey has burned already. However, even this hides the real tragedies.

The infernos exterminate wildlife habitats, roast eagle and spotted owl fledglings alive in their nests, immolate wildlife that can’t run fast enough, leave surviving animals to starve for lack of food, and incinerate organic matter and nearly every living creature in the thin soils. They turn trout streams into fish boils, minus the veggies and seasonings. Future downpours and rapid snowmelts bring widespread soil erosion into streambeds. Many areas will not grow trees or recover their biodiversity for decades.

Most horrifically, the conflagrations threaten homes and entire communities. They kill fire fighters and families that cannot get away quickly enough, or get trapped by sudden walls of flames.

In 2012, two huge fires near Fort Collins and Colorado Springs, Colorado burned 610 homes, leaving little more than ashes, chimneys and memories. Tens of thousands of people had to be evacuated through smoke and ash that turned daytime into choking night skies. Four people died. A 1994 fire near Glenwood Springs, CO burned 14 young firefighters to death.

These are not “natural” fires of environmentalist lore, or “ordinary” fires like those that occur in state and privately owned and managed forests. Endless layers of laws, regulations, judicial decrees and guidelines for Interior and Forest Service lands have meant that most western forests have been managed like our 109 million acres of designated wilderness: they are hardly managed at all.

Environmentalists abhor timber cutting on federal lands, especially if trees might feed profit-making sawmills. They would rather see trees burn, than let someone cut them. They constantly file lawsuits to block any cutting, and too many judges are all too happy to support their radical ideas and policies.

Thus, even selective cutting to thin dense stands of timber, or remove trees killed by beetles or fires, is rarely permitted. Even fire fighting and suppression are often allowed only if a fire was clearly caused by arson, careless campers or other human action - but not if lightning ignited it. Then it’s allowed to burn, until a raging inferno is roaring over a ridge toward a rural or suburban community.

The result is easy to predict. Thousands of thin trees grow on acreage that should support just a few hundred full-sized mature trees. Tens of billions of these scrawny trees mix with 6.3 billion dead trees that the Forest Service says still stand in eleven western states. Vast forests are little more than big trees amid closely bunched matchsticks and underbrush, drying out in hot, dry western summers and droughts - waiting for lightning bolts, sparks, untended campfires or arsonists to start super-heated conflagrations.

Flames in average fires along managed forest floors might reach several feet in height and temperatures of 1,472 F (800 C), says Wildfire Today. But under extreme conditions of high winds and western tinderboxes, temperatures can exceed 2,192 F (1,200 C), flame heights can reach 165 feet (50 meters) or more, and fires can generate a critter-roasting 100,000 kilowatts per meter of fire front. Wood will burst into flame at 572 F. Aluminum melts at 1,220 degrees, silver at 1,762 and gold at 1,948 F!
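The paired temperature figures above follow from the standard conversion F = C x 9/5 + 32; a quick check:

# Sketch: verify the Celsius/Fahrenheit pairs quoted above.
def c_to_f(c):
    return c * 9 / 5 + 32

for c in (800, 1200):
    print(f"{c} C = {c_to_f(c):.0f} F")  # 800 C = 1472 F, 1200 C = 2192 F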

Most of this heat goes upward, but super-high temperatures incinerate soil organisms and organic matter in thin western soils that afterward can support only stunted, spindly trees for decades.

These fires also emit prodigious quantities of carbon dioxide, fine particulates and other pollutants - including mercury, which is absorbed by tree roots from rocks and soils that contain this metal, and then lofted into the sky when the trees burn.

Rabid greens ignore these hard realities - and divert discussions back to their favorite ideological talking points. The problem isn’t too many trees, they insist. It’s global warming and climate change. That’s why western states are having droughts, long fire seasons, and high winds that send flames past fire breaks.

Global warming, global cooling and climate change have been part of the Earth and human experience from time immemorial. Natural climate fluctuations brought the multi-decade Anasazi drought, the Dust Bowl and other dry spells to our western states. To suggest that this summer’s heat and drought are somehow due to mankind’s fossil fuel use and related emissions is deliberately delusional nonsense.

Neither these activists nor anyone in Al Gore’s climate chaos consortium can demonstrate or calibrate a human connection to droughts or fires. Rants, rhetoric and CO2-driven computer models do not suffice. And even if manmade (plant-fertilizing) carbon dioxide does play a role amid the powerful natural forces that have always controlled climate and weather, reducing US fossil fuel use would have zero effect.

China, India, Indonesia and Vietnam alone are building 590 new coal-fired power plants right now, on top of the hundreds they have constructed over the past decade. Overall, more than 1,600 new coal generators are planned or under construction in 62 countries. People in developing countries are also driving far more vehicles and making great strides in improving their health and living standards. They will not stop.

Western conflagrations jump fire breaks because these ferocious fires are fueled by the unprecedented increase in combustibles that radical green policies have created. These monstrous fires generate their own high winds and even mini tornados that carry burning branches high into the air, to be deposited hundreds of feet away, where they ignite new fires. It has nothing to do with climate change.

Remove some of that fuel - and fires won’t get so big, hot, powerful and destructive. We should also do what a few environmentalist groups have called for: manage more areas around buildings and homes - clearing away brush that federal agencies and these same groups have long demanded be left in place.

Finally, we should be using more of the readily available modern technologies like FireIce from GelTech Solutions. They can suppress and extinguish fires, and protect homes, much better than water alone.

The last bogus eco-activist claim is that “fire isn’t destruction; it’s renewal. It creates stronger, more diverse ecosystems.” That may be true in managed forests, timber stands in less tinder-dry states, and forests that have undergone repeated, non-devastating fires. For all the reasons presented above, it is not true for government-owned and mismanaged forests in our western states.

Over 50 million acres (an area equal to Minnesota) are at risk of catastrophic wildfires. Right now, we are spending billions of dollars that we don’t have, that we should not have to spend fighting these monstrous killer blazes, and that should instead be available to improve forests and parks and to fund other vital programs.

These forests could and should create jobs and generate revenues in states where far too many lands, timber, oil and minerals have been placed off limits - primarily by urban politicians, judges and radical activists who seem determined to drive people off these western lands, turn them into playgrounds for the wealthy, and roll back other Americans’ living standards and well-being. Cleaning out dead, diseased, burned, overgrown trees would bring countless benefits. It would make our forests healthy again.

Above all, the new Interior-Agriculture approach would demonstrate that Rural Lives Matter.

Paul Driessen is senior policy analyst for the Committee For A Constructive Tomorrow (http://www.CFACT.org), and author of Eco-Imperialism: Green power - Black death and other books on the environment.

----------

Chris Mooney, WAPO’s alarmist science editor, asked for a comment about my being considered for the Science Advisory Board of Pruitt’s EPA.

When I was working on a doctoral traineeship grant at NYU in Air Resources in the 1970s, we had real air and water pollution issues. We as a nation, with the EPA in its early days, did a commendable job cleaning up our air and water with reasonable cleanup measures on power plants and automobiles. We have more than met our goal of reducing levels of criteria pollutants below the reasonable standards set years ago. Carbon pollution (particulates) is a problem in China and India, but we in the U.S. have reduced particulate levels by 50% over the last few decades and are now well below the aggressive standards we set. We rarely see air pollution advisories today, something that was very common decades ago.

This hyper-focus on controlling CO2 in recent decades is immoral - harmful to our nation, its people and their future. CO2 is a beneficial trace gas (0.04% by volume). It is a plant fertilizer that has helped us increase crop yields 3 to 5 fold. The Endangerment Finding being used by the EPA (and courts) to regulate CO2 emissions has been invalidated by real-world data and needs to be scrapped and redone using good data and science, not failed models. Scientists and econometricians I worked with have produced solid research reports showing that natural factors are responsible for all the cyclical changes, and that claims about unprecedented changes and extremes are not supportable.

Europe, Australia and the green-agenda states here, including California and the northeast RGGI states, are paying the price for pushing the green agenda, with electricity costs 2 to 6 times higher than most other states pay. This affects the poor and middle class the most (a hidden tax) and drives out industry, which costs jobs (Spain reached 27.5% unemployment before it stopped subsidies, having lost 4 real jobs for every temporary green job created). More than 25% of Britons, many of them pensioners, are in what is called energy poverty, having to choose between heating and eating. In Europe and Australia, bad policies have led to power blackouts or brownouts and a rush to build coal plants to make up for the intermittent and unreliable wind and solar output.

There is still a need for environmental protection. We need to refocus the EPA on ensuring there are no more Animas River or Flint water events. We need to help deal with the issues of mold and water pollution in areas hit by heavy flooding like we saw with Harvey and Irma. There is work to be done in waste management and in ensuring ground water is safe. We need to work with other government departments to develop saner forest management that reduces wildfire risks and their smoke pollution, and reasonable spring and summer streamflow and runoff usage policies to benefit agriculture and the cities that need clean water.

Note I had alluded to the need to deal with the forestry and water supply issues. Reporters don’t like it when you send a response in writing, as that makes it harder for them to paraphrase your response in a way that aligns with their message. He quickly called to throw other questions at me. I will report how it goes. A similar thing happened last week with an environmental newsletter trying to bash candidate scientists who could affect their green agenda, using some out-of-context and false quotes. Even the Washington Examiner, a usually fair and balanced source of information, had an article with several inaccuracies. The author on LinkedIn pointed to Richard Branson and Arianna Huffington as influences, which explains a lot. Friends, the fake science reporters are everywhere.

Posted on 09/17 at 10:28 AM


Thursday, September 14, 2017
A Global Warming Red Team Warning: Do Not Strive For Consensus With The Blue Team

Dr. Roy Spencer

Now that the idea of a global warming Red Team approach to help determine what our energy policy should be is gaining traction, it is important that we understand what that means to some of us who have been advocating it for over 10 years - and also what it doesn’t mean.

The Red Team approach has been used for many years in private industry, DoD, and the intelligence community to examine very costly decisions and programs in a purposely adversarial way...to ask, what if we are wrong about a certain program or policy change? What might the unintended consequences be?

In such a discussion we must make sure that we do not conflate the consensus on a scientific theory with the need to change energy policy, as is often done. (Just because we know that car wrecks in the U.S. cause 40,000 deaths a year doesn’t mean we should outlaw cars; and I doubt human-caused climate change has ever killed anyone).

While science can help guide policy, it certainly does not dictate it.

In the case of global warming and the role of our carbon dioxide emissions, the debate has too long been dominated by a myopic view that asserts the following 5 general points as indisputable. I have ordered them generally from scientific to economic.

1) global warming is occurring, will continue to occur, and will have dangerous consequences

2) the warming is mostly, if not totally, caused by our CO2 emissions

3) there are no benefits to our CO2 emissions, either direct (biological) or indirect (economic)

4) we can reduce our CO2 emissions to a level that we avoid a substantial amount of the expected damage

5) the cost of reducing CO2 emissions is low enough to make it worthwhile (e.g. mandating much more wind, solar, etc.)

ALL of these 5 points must be essentially true for things like the Paris Agreement (which President Trump has now withdrawn us from...for the time being) to make much sense.

But I would argue that each of the five points can be challenged, and not just with “fake science”. There is peer-reviewed and published analysis in science and economics that would allow one to contest each one of the five claims.

The Red Team Approach: It’s NOT a Redo of the Blue Team

John Christy and I are concerned that the Red Team approach, if applied to global warming, will simply be a review of the U.N. IPCC science on global warming. We are worried that it will only address the first two points (warming will continue, and it is mostly caused by CO2). Heck, even *I* believe we will continue to see modest warming, and that it might well be at least 50% due to CO2.

But a Red Team reaffirming those points does NOT mean we should “do something” about global warming.

To fully address whether we should, say, have regulations to reduce CO2 emissions, the Red Team must address all 5 of the “consensus” claims listed above, because that is the only way to determine if we should change energy policy in a direction different from that which the free market would carry it naturally.

The Red Team MUST address the benefits of more CO2 to global agriculture, “global greening” etc.

The Red Team MUST address whether forced reductions in CO2 emissions will cause even a measurable effect on global temperatures.

The Red Team MUST address whether the reduction in prosperity and increase in energy poverty are permissible consequences of forced emissions reductions to achieve (potentially unmeasurable) results.

The membership of the Red Team will basically determine the Team’s conclusions. It must be made up of adversaries to the Blue Team “consensus”, which has basically been the U.N. IPCC. If it is not adversarial in membership and in mission, it will not be a real Red Team.

As a result, the Red Team must not be allowed to be controlled by the usual IPCC-affiliated participants.

Only then can its report be considered an independent, adversarial analysis to be weighed along with the IPCC report (and other non-IPCC reports) to help guide U.S. energy policy.

Posted on 09/14 at 08:01 AM


Tuesday, September 05, 2017
Powerful CAT5 IRMA a major threat to Florida

By Joseph D’Aleo, CCM, AMS Fellow

A look back at Irma and Harvey:

Update: Irma will make landfall in south Florida Sunday as a major hurricane. Models which had teased a track along the eastern parts of the state are now taking aim at the Keys and then the entire peninsula.

image

In its devastating journey through the northern Leeward Islands and Virgin Islands as a CAT5 storm with winds estimated at 185 mph, it ranked second for wind and 10th for lowest pressure.

image

image

It has since weakened to a CAT4 with winds of 155 mph and a central pressure of 925mb. After moving along the Cuban coast, it will turn north and likely intensify again.

Join us on Weatherbell.com to see all the details in posts and videos.

-------------------------

The name Irma is derived from the German ‘Ermen’, which means ‘whole’. Drop the ‘w’: in terms of pressure, Irma has become a literal ‘hole’ in the atmosphere. It has intensified to a powerful, very dangerous CAT5 hurricane with a central pressure of 929mb and winds of 175 mph.

Here is the latest water vapor image.

image

Here is the projected 5-day track, skirting the islands on its way west.

image

It is forced west by strong high pressure in the Atlantic, a ‘king ridge’ as my meteorological buddy Al Lipson calls it. It won’t make its turn north (something all tropical storms look to do at the first opportunity) until the ridge weakens or the storm reaches the western end of the ridge. This is the first and often the biggest challenge in hurricane forecasting.

image

See also the cold trough to the north, from the Great Lakes into the Northeast. Note how that trough moves out into the Atlantic and the ridge collapses this weekend.

image

See how, in the lower levels, the trough to the north is a cold one, while the low that is Irma is warm-core. At 18,000 feet it is colder than -20C in the northern trough but warmer than +3C over Irma. Above-freezing temperatures at 18,000 feet are confined to the strongest heat waves and the stronger hurricanes.

image

See the US GFS model take Irma into Florida with pressures in the 880s mb. The strongest Atlantic Basin storm on record was Wilma in 2005, with an 882mb central pressure in the Gulf. It weakened a bit, to CAT3, by the time it slammed Florida.

image

image

See how the ensemble members of the GFS mainly target Florida. It should be noted that some models and ensemble members of other models turn the storm north before Florida, while others take it west of Florida before turning north.

Did you ever miss an exit on the highway, or the street you wanted to turn on? You have to go on to the next exit, or the next street that runs the right way. Remember Katrina, which was rolling southwestward faster than the models predicted; they had it turning just west of Florida toward Apalachicola. It went out into the central Gulf before turning, and the rest was history.

image

Just as Harvey ended the record nearly 12-year major hurricane drought for the U.S., if Irma makes landfall in Florida it will end that state’s record hurricane drought, in place since Wilma in 2005.

image

Please take this storm very seriously and consider your options and follow local emergency management directives. Join us at Weatherbell.com where our team is doing frequent updates and videos.

Posted on 09/05 at 08:13 AM


Monday, September 04, 2017
Roger Pielke Jr.: “The Hurricane Lull Couldn’t Last”

COMMENTARY
The Hurricane Lull Couldn’t Last

image

The U.S. hadn’t been hit by a Category 3 or stronger storm since Katrina in 2005. We were overdue.
By Roger Pielke Jr.  Aug. 31, 2017 7:09 p.m. ET

Activists, journalists and scientists have pounced on the still-unfolding disaster in Houston and along the Gulf Coast in an attempt to focus the policy discussion narrowly on climate change. Such single-issue myopia takes precious attention away from policies that could improve our ability to prepare for and respond to disasters. More thoughtful and effective disaster policies are needed because the future will bring many more weather disasters like Hurricane Harvey, with larger impacts than those of the recent past.

For many years, those seeking to justify carbon restrictions argued that hurricanes had become more common and intense. That hasn’t happened. Scientific assessments, including those of the Intergovernmental Panel on Climate Change and the U.S. government’s latest National Climate Assessment, indicate no long-term increases in the frequency or strength of hurricanes in the U.S. Neither has there been an increase in floods, droughts and tornadoes, though heat waves (Icecap Note: not really) and heavy precipitation have become more common.

Prior to Harvey, which made landfall as a Category 4 storm, the U.S. had gone a remarkable 12 years without being hit by a hurricane of Category 3 strength or stronger. Since 1970 the U.S. has only seen four hurricanes of Category 4 or 5 strength. In the previous 47 years, the country was struck by 14 such storms. President Obama presided over the lowest rate of hurricane landfalls - 0.5 a year - of any president since at least 1900. Eight presidents dealt with more than two a year, but George W. Bush (18 storms) is the only one to have done so since Lyndon B. Johnson. The rest occurred before 1960.

Without data to support their wilder claims, climate partisans have now resorted to shouting that every extreme weather event was somehow “made worse” by the emission of greenhouse gases. Earlier this week, New York Times columnist David Leonhardt directed researchers “to shed some of the fussy over-precision about the relationship between climate change and weather.”

[...]

Wall Street Journal

And some bad news for the alarmists…

image
Gulf of Mexico operators returning to work after Harvey

09/01/2017

Offshore staff

NEW ORLEANS - About 9% of oil production and 13% of natural gas production remains shut-in in the Gulf of Mexico, according to the Bureau of Safety and Environmental Enforcement.

[...]

BSEE added that no damage reports from oil and gas operators have been received.

[...]

As offshore oil and gas operations return to normal, the industry continues to provide assistance for the onshore Hurricane Harvey relief efforts.

Hess Corp. has donated $1 million to the Hurricane Harvey Relief Fund. The company said that it will match every donation employees make in the coming weeks to relief efforts by the Hurricane Harvey Relief Fund, American Red Cross, and United Way of Houston.

Transocean says that it has contributed $100,000 to the American Red Cross and $100,000 to the Houston Food Bank. The company says that it will also match donations made to the relief efforts by its employees.

Statoil announced via social media that it has donated $250,000 to the Red Cross.

Weatherford International plc says that it has pledged $25,000 to Feeding Texas, the Texas Food Bank Network, and $25,000 to J.J. Watt’s Hurricane Harvey Relief Fund.

ExxonMobil says that it has increased its financial commitment for Harvey relief to up to $9.5 million, which includes a new employee and retiree donation match program and in-kind donations to the American Red Cross for recovery efforts in South Texas. The increased support builds on $1 million in previous contributions to the American Red Cross and United Way of Greater Houston.

Offshore Magazine

Posted on 09/04 at 04:52 PM


Wednesday, August 30, 2017
Harvey makes second landfall and heads northeast

By Joseph D’Aleo, WeatherBELL Analytics, LLC

TS Harvey’s center crossed the coast just west of Cameron, Louisiana, with most of the associated deep convection located over extreme southeastern Texas and western Louisiana early this morning.

image

NHC reported at 4 am CDT: Although the rain has ended in the Houston/Galveston area, the Beaumont/Port Arthur area was particularly hard hit overnight, with about 12.5 inches reported at the Jack Brooks Regional Airport since 7 pm CDT.

See the large area of 20” rains from southeast Texas to southwest Louisiana.

image

See that one gauge reported 51.88”. Harris County maintains a network of 156 rainfall and stream gauges. Most of the amounts in the city were in the 30 to low-40 inch range. The heaviest totals were in the southeast.

image

39 gauges were at major flood stage, and 11 more showed moderate flooding.

image

As Dr. Spencer pointed out, there have been many flood disasters in the Houston area, even dating to the mid-1800s when the population was very low. In December of 1935 a massive flood occurred in the downtown area as the water level measured at Buffalo Bayou in Houston topped out at 54.4 feet.

image

The Buffalo Bayou gauge topped out with Harvey at around 39 feet on the 28th, dropped a bit, recovered to 37 feet, and has since been receding.

image

The three stations that exceeded the 48-inch record set by Amelia in 1978 put Harvey at the top of the list of tropical rainmakers in the lower 48 states, as WeatherBELL predicted several days ago. Six of the 10 storms on the list were in Texas, where storms often get trapped in summer or fall.

image

Here was Amelia in 1978, with its 48-inch 7-day total at Medina. The gauge density then was obviously lower, and we can’t know with certainty whether that storm, or others on the list, actually produced more.

image

See the storm when it came ashore as a CAT4.

image

It was the first landfalling hurricane, and major hurricane, in Texas this decade. The last major was Bret in 1999. Rita and Ike came close in 2005 and 2008.

image

It was tied for 14th place by pressure (Klotzbach).

image

With Harvey, we have had 7 landfalling hurricanes this decade. We would need 8 more over the rest of this season and in 2018 and 2019 to keep this decade from being the quietest on record.

image

Posted on 08/30 at 07:53 AM


Saturday, August 12, 2017
Why Climate Alarmist Reports Should Be Ignored Where They Use Bad Methodology and Data

Looks like the public agrees with Alan.

Scott Rasmussen’s Number of the Day
By Scott Rasmussen

August 21, 2017: Twenty-eight percent (28%) of Americans think that climate scientists understand the causes of global climate change “very well.” A Pew Research study found that only 19% believe that the climate scientists have a very good understanding of the best ways to address the issue.

In general, the study found that Americans trust climate scientists more than politicians on the topic. Two-thirds (67%) believe scientists should play a major role in addressing policy issues on the matter. Most also believe that energy industry leaders (56%) and the general public (56%) should have a major say in such policy topics.

The Pew study, however, also found that people believe there are differences of opinion among the climate scientists. Only 27% believe that there is a consensus on the issue and that just about all climate scientists believe human behavior is mostly responsible for global climate change. Another 35% think more than half hold this view.

The survey also explored the degree of trust and confidence in those researching climate science. Thirty-six percent (36%) believe that, most of the time, scientists’ research findings are motivated by a desire to advance their own careers. Only 32% say that they mostly rely on the best scientific evidence. Twenty-seven percent (27%) believe that political views of the scientists generally influence their work.

Liberal Democrats tend to express high levels of confidence in the climate scientists and their motives. Conservative Republicans are often quite skeptical. Most other Americans have mixed views.

----------

Alan Carlin August 9, 2017

Like other liberal news outlets, the New York Times has been busy printing unapproved internal Trump Administration material this year. On August 8, 2017, it printed a Draft Report that is part of a new National Climate Assessment. The report was prepared primarily during the Obama Administration by a Federal inter-agency group and is still residing on an outside server from an earlier public comment period. The authors concluded, among other things, that “Many lines of evidence demonstrate that human activities, especially emissions of greenhouse (heat-trapping) gases, are primarily responsible for recent observed climate change.”

The problem is not that the viewpoints expressed are new or useful, or that the draft was not already available; rather, they represent a rather tired repetition of the usual climate alarmist ideology with only occasional updates. This is unfortunate, since it is becoming ever clearer that the ideology has become scientifically indefensible and needs to be abandoned in favor of a new approach to climate science.

Perhaps the Most Basic Problem

Perhaps the most basic problem with this Draft Report, like most of the major Climate Industrial Complex (CIC) reports, is that it depends for its justification primarily on the IPCC’s bottom-up global climate models (as discussed in Section 4.3 of the Draft Report). The Draft Report shows that the climate alarmists have by no means given up their horrifically expensive and misguided crusade to reduce carbon dioxide (CO2) emissions, even though their very extensive attempt to justify it is hopeless.

Their conclusion is not only that global warming is primarily due to human activity, but also that temperatures will increase significantly because of increases in anthropogenic atmospheric CO2. Their basic methodology rests on the UN Intergovernmental Panel on Climate Change’s (IPCC’s) analyses conducted over many years. The Heartland Institute has gone to great effort to point out many of the problems and inconsistencies in the conclusions reached using these models. But it is increasingly clear why the IPCC has been having a hard time explaining the increasing divergence between their models and actual temperatures. One of the basic problems is that alarmists have always used a bottom-up methodology (which aggregates the results for individual geographic areas based on the application of subjective physical relationships between various physical effects). This approach cannot produce valid results no matter how much is spent on it, how often it is repeated, or how large the climate models grow. As Mike Jonas has recently written:

In this very uncertain world of climate, one thing is just about certain: No bottom-up computer model will ever be able to predict climate. We learned above [in the article this was excerpted from] that there isn’t enough computer power now even to model GCRs [galactic cosmic rays], let alone all the other climate factors. But the issue of computer model ability goes way beyond that. In a complex non-linear system like climate, there are squillions of situations where the outcome is indeterminate. That’s because the same influence can give very different results in slightly different conditions. Because we can never predict the conditions accurately enough - in fact we can’t even know what all the conditions are right now - our bottom-up climate models can never ever predict the future. And the climate models that provide guidance to governments are all bottom-up.

The bottom-up GCM was a bad approach from the start and should never have been paid for by the taxpayers. All that we have are computer models that were designed and then tuned to lead to the IPCC’s desired answers and have had a difficult time even doing that.

So not only are the results claiming that global temperatures are largely determined by atmospheric CO2 wrong; the basic methodology is useless. Climate is a coupled, non-linear, chaotic system - the IPCC itself agrees that this is the case - and it cannot be usefully modeled with necessarily limited models that assume the opposite.
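The sensitivity Jonas describes can be seen in even the simplest non-linear system. The sketch below uses the logistic map, a textbook chaos demonstration (it is not any climate model's actual code): two runs that differ only in the fifteenth decimal place of their starting value bear no relation to each other within a few dozen steps.

```python
# Minimal sketch of sensitivity to initial conditions, using the logistic
# map - a standard chaos demo, NOT any climate model's actual code.

def logistic_map(x0, r=3.9, steps=60):
    """Iterate x -> r*x*(1-x), which is chaotic for r near 4."""
    xs = [x0]
    for _ in range(steps):
        xs.append(r * xs[-1] * (1.0 - xs[-1]))
    return xs

run_a = logistic_map(0.400000000000000)
run_b = logistic_map(0.400000000000001)  # perturbed in the 15th decimal place

for step in (10, 30, 50):
    print(f"step {step:2d}: a={run_a[step]:.6f}  b={run_b[step]:.6f}  "
          f"|a-b|={abs(run_a[step] - run_b[step]):.6f}")
```

By step 50 the two trajectories have completely diverged, which is the behavior behind the claim that the same influence can give very different results in slightly different conditions.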

An Entirely New Approach Is Needed

Despite repeated claims by climate alarmists that climate science is settled, nothing could be further from the truth. In fact, an entirely new approach is needed if much progress is to be made in characterizing and understanding the climate system. That approach must be top-down rather than bottom-up. To my knowledge, only one such study (and earlier versions thereof) exists, which I will call the 2017 WCD report after the authors’ last names. And it appears to give plausible results. It says that CO2 does not have a significant effect on global temperatures, and that global temperatures since about 1960 can be fully explained by entirely natural factors, with no human activity required to explain what has occurred. This rules out many if not most of the Draft Report’s conclusions.

A second very recent report including two of the same authors as WCD 2017 concludes that the keepers of the official global surface temperature records have repeatedly “adjusted” them to the point that they are no longer representative of the underlying data. Accordingly, the authors argue that the data used in the Draft Report from surface temperature sources and the conclusions reached from using this data are too unreliable for policy use.

The Time Has Come to Abandon the IPCC’s Bottom-up Approach and Correct the Basic Data Used Before Further Expenditures Are Made

It is time to totally abandon the IPCC’s bottom-up climate models as an ultra-expensive sunk cost and start over. The 2017 WCD report would be a good place to start in redoing the basic climate analyses. Until this is done, little progress is possible on many of the major issues in climate science, and no further expenditures should be made in response to climate alarmism until the new methodology has been thoroughly tested and the basic surface temperature data has been reconstituted in a useful form. The mistaken choice of methodology has cost taxpayers tens of billions in research and has reportedly resulted in about $1.5 trillion per year in renewable and related construction, which needs to be written off too.

I recommend that the Trump Administration issue the Draft Report with an added section explaining how useless and biased the rest of the Draft Report is, because it relies primarily on meaningless model results and unreliable surface temperature data. Such a combined report would be one of the first government reports anywhere to seriously question the IPCC’s results, and it has long been needed. Scientific hypotheses and data that have never been rigorously tested are not fit for public policy purposes, particularly those involving multi-trillion-dollar expenditures per year.

Posted on 08/12 at 03:20 PM
(1) TrackbacksPermalink


Friday, August 04, 2017
Bill Nye: The Real Message We Should Pay Attention To

By Joe Bastardi, Patriot Post

There was a minor uproar over a recent Bill Nye comment that is summed up in this article: ”Bill Nye: Climate Change Scientists Need To Wait For Older People To Die.”

image

But let’s look at this for what it really reveals.

First of all, Bill is stating a fact. Many in the “resistance” to climate change are Bill’s age and older. But that generation was brought up differently from the current one, much of which rose through academia. We were taught to question authority. We were also encouraged to reject groupthink. Perhaps it had to do with Eisenhower’s farewell speech, whose warning against the military-industrial complex was a big deal when I was growing up in the ‘60s and ‘70s, when it was being used as a rallying cry against our involvement in Vietnam. But those of us in our formative years then, who are now in the generation Bill is talking about, also took note of the other part of Eisenhower’s speech.

Let me borrow from this Wikipedia link. The article’s account of the speech’s legacy attests to my assertion about its importance in relation to Vietnam:

Although it was much broader, Eisenhower’s speech is remembered primarily for its reference to the military-industrial complex. The phrase gained acceptance during the Vietnam War era and 21st-century commentators have expressed the opinion that a number of the fears raised in his speech have come true.

The part referenced in particular was this:

In the councils of government, we must guard against the acquisition of unwarranted influence, whether sought or unsought, by the military-industrial complex. The potential for the disastrous rise of misplaced power exists and will persist. We must never let the weight of this combination endanger our liberties or democratic processes. We should take nothing for granted. Only an alert and knowledgeable citizenry can compel the proper meshing of the huge industrial and military machinery of defense with our peaceful methods and goals, so that security and liberty may prosper together.

Here is one of the greatest generals of our nation warning against the military-industrial complex, and many took it to heart.

But we of that generation also knew about the second part of his warning.

Akin to, and largely responsible for the sweeping changes in our industrial-military posture, has been the technological revolution during recent decades.

In this revolution, research has become central; it also becomes more formalized, complex, and costly. A steadily increasing share is conducted for, by, or at the direction of, the Federal government....

The prospect of domination of the nation’s scholars by Federal employment, project allocation, and the power of money is ever present and is gravely to be regarded.

Yet in holding scientific discovery in respect, as we should, we must also be alert to the equal and opposite danger that public policy could itself become the captive of a scientific-technological elite.

Ike was right.

Here is where Nye is correct. He understands that the people who were brought up in that form of Americanism, who believed the individual should question authority, are getting older and will not be around when the new vanguard takes over. He believes himself enlightened, and I suspect others like Al Gore think they are simply leading the new wave that will replace the old. But instead of attacking Nye and making it seem as if he has a death wish for his opponents, why don’t people actually look at the facts of what he is saying and what that actually means for things like critical thought and skepticism? Those things are essential not only to the scientific method but also to man’s ability to use his free will to better himself. The bottom line is that Nye’s statement does not identify a problem with Nye; it identifies a problem with what has happened over the post-Vietnam generations. Looked at more deeply, Nye and the climate issue reveal a problem that strikes at the core of what has led this nation to where it is.

Bottom line: Nye is right about the inevitable result. It does not make him right about CO2 being the climate control knob. But Nye is not the demon here; he is more a messenger of the very changes that Eisenhower warned us about in his speech. And what is apparent is that the generation which followed that speech took his word to heart over one thing but went the opposite way on another.

Some may be tempted to think I am going soft on Nye. I am simply evaluating what he said objectively. I would suggest that instead of tearing at the messenger we look at the message. For in his message lies the real danger, not to the people who are aging, but to the very methodology essential for those who follow to keep building on the successes that solid foundational skepticism and freedom make possible.

Joe Bastardi is chief forecaster at WeatherBELL Analytics, a meteorological consulting firm, and contributor to The Patriot Post on environmental issues.


Posted on 08/04 at 11:14 AM
(1) TrackbacksPermalink


Tuesday, August 01, 2017
Press Release: New Research Report Confirms Invalidation of the EPA’s Endangerment Finding

By Joseph D’Aleo, CCM, AMS Fellow

Note that our press release and research papers hit a raw nerve with alarmists, who did not really meaningfully argue the data or science but instead challenged the claim that the work was ‘peer reviewed’ (in the sense that they have defined and controlled the peer review process through what have become advocacy journals).

Traditional peer review can often be compared to the TSA boarding-pass approval process. If you are not on a no-fly list, then after a review of a photo ID and/or passport you get a stamp and move on. With journal submissions, if you have the politically correct credentials, you get the stamp and are published. The journal ‘no-fly’ list, it appears, includes those deemed to be skeptics. In the review there is often at least one ‘gatekeeper’ reviewer responsible for ensuring you don’t board the plane, or in this case get your paper into the journal, no matter how impressive your CV or how content-rich your paper.

Our research reports are not traditional journal articles. The reports follow the approach frequently used in industry, often for internal purposes. They were prepared by highly qualified authors using the best available data and understanding of the scientific factors, analyzed properly by the very best statisticians/econometricians. The reviewers who endorsed them were chosen for being highly qualified to evaluate the work. The individuals quoted in fact-checker reviews, such as those on Snopes, are simply enforcers of orthodoxy, unable or unwilling to understand the processes applied and the science.

You must know that the traditional journal peer review process is broken. It suffers from a lack of robustness in statistical analysis and a widespread inability to replicate results. This is true in both the medical and the scientific literature. See examples here and here.

This story in Forbes by Henry Miller says: “A number of empirical studies show that 80-90% of the claims coming from supposedly scientific studies in major journals fail to replicate.”

Another recent paper in Nature showed that 70% of the papers in medical journals reported studies that could not be replicated, many not even by the original authors. See here an example of one such falsified report, which the author worries is part of an epidemic of agenda-driven science-by-press-release and falsification that has reached crisis proportions.

Other reports show an alarming number of papers having to be retracted. For example, here Springer is retracting 107 papers from one journal after discovering they had been accepted with fake peer reviews.

Result-oriented corruption of peer review in climate science was proven by the Climategate emails.

In the journals there is a small set of gatekeepers who block anything that goes against the journals’ editorial biases. Conversely, these journals and their reviewers do not give a thorough due-diligence review to work they tend to agree with ideologically. They are engaged in orthodoxy enforcement.

Indeed, Henry Miller wrote: “Another worrisome trend is the increasing publication of the results of flawed ‘advocacy research’ that is actually designed to give a false result that provides propaganda value for activists and can be cited long after the findings have been discredited.” A prime example of this is the hideously flawed but endlessly repeated “97% of climate scientists” paper by Cook and Lewandowsky. EPA’s own Inspector General found that EPA’s Endangerment Finding was never properly reviewed, yet it is the basis of all EPA GHG regulations, which have imposed hundreds of billions in costs on the U.S. economy.

The scientific method requires that the data used be made available and that the work be capable of being replicated. This should be required by all journals; as shown above, in virtually all cases it is not. Peer review has become pal review, with gatekeepers who block alternate, unbiased data analyses and presentations but rush through new papers that support their ideology or view of the science.

In our research reports, we identify the reviewers, who have lent their names to the conclusions, and we provide full access to the data, along with instructions on the analytical methods used, so that others can work with it and either refute or replicate the findings.

Our team chose to apply the same research report procedures used in industry: assemble the most qualified authors with the skills required to compile the data and rigorously perform the correct analysis; have them draft a report; and share the draft with a team of experts chosen for their expertise in the field to provide feedback. Almost no journals require that, and their failure and rejection numbers speak for themselves.

Wegman et al. suggested that one of the common failures in climate papers is the lack of necessary statistical expertise. For our research reports we assembled the most highly qualified data experts, econometricians/statisticians and meteorologists/climatologists to draft the research project and perform the rigorous statistical/econometric analyses, and we then submitted their work to the best qualified scientists/econometricians for review. Attempts to discredit this report are of course now being made, because it raises critically important questions about the quality and trustworthiness of the global surface temperature data sets.

The facts and statistical reasoning of this paper cannot be refuted merely by carping about peer review. Instead, demonstration of a factual or logical error is required.

----------

Original Post:

This release and research study was covered by Michael Bastasch in the Daily Caller and picked up by Drudge.

Stated simply, our new research findings, building on the previous work, totally debunk EPA’s claim that CO2 is a pollutant that must therefore be regulated. They do so by very clearly demonstrating that the Global Average Surface Temperature (GAST) data, quoted all the time as setting new surface temperature records, have been purposefully adjusted in a manner such that they are now basically meaningless numbers. Continued reliance on this manipulated GAST data supports CO2 regulatory actions that very negatively impact the poor, not only in the U.S. but worldwide. There is no scientific basis for this widespread regulation.

PRESS RELEASE

On the Validity of NOAA, NASA and Hadley CRU Global Average Surface Temperature Data & The Validity of EPA’s CO2 Endangerment Finding Abridged Research Report June 2017

New Research Report

Just released: A peer reviewed Climate Science Research Report has proven that it is all but certain that EPA’s basic claim that CO2 is a pollutant is totally false. All research was done pro bono.

The objective of this research was to test the hypothesis that Global Average Surface Temperature (GAST) data are sufficiently credible estimates of global average temperatures such that they can be relied upon for climate modeling and policy analysis purposes. The relevance of this research is that the validity of EPA’s CO2 Endangerment Finding requires GAST data to be a valid representation of reality.

In this research report, past changes in the previously reported historical data are quantified. It was found that each new version of GAST has nearly always exhibited a steeper warming linear trend over its entire history. And this was nearly always accomplished by each entity systematically removing the previously existing cyclical temperature pattern. This was true for all three entities providing GAST data: NOAA, NASA and Hadley CRU.

As a result, this research sought to validate the current estimates of GAST using the best available relevant data. The conclusive finding was that the three GAST data sets are not a valid representation of reality. In fact, the magnitude of the historical data adjustments that removed their cyclical temperature patterns is totally inconsistent with published and credible U.S. and other temperature data.

Thus, despite current claims of record setting warming, it is impossible to conclude from the NOAA, NASA and Hadley CRU GAST data sets that recent years have been the warmest ever.

Finally, since GAST data set validity is a necessary condition for EPA’s CO2 Endangerment Finding, the Finding too is invalidated by these research results. This means that EPA’s 2009 claim that CO2 is a pollutant has been decisively invalidated by this research.
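The trend comparison described above can be sketched in a few lines of code. The series below are invented stand-ins for successive dataset versions, not actual NOAA, NASA or Hadley CRU values; they only illustrate the mechanics of fitting a least-squares line to each archived version and comparing slopes.

```python
import numpy as np

# Invented stand-ins for successive "versions" of an annual series,
# NOT actual NOAA, NASA or Hadley CRU data.
years = np.arange(1900, 2017)
t = years - years[0]
cycle = 0.15 * np.sin(2 * np.pi * t / 65.0)  # a multidecadal cyclical pattern

versions = {
    "v1": 0.004 * t + 1.0 * cycle,  # earliest version: trend plus full cycle
    "v2": 0.005 * t + 0.5 * cycle,  # revision: half the cycle removed
    "v3": 0.006 * t + 0.1 * cycle,  # revision: cycle mostly removed
}

for name, series in versions.items():
    slope = np.polyfit(years, series, 1)[0]  # least-squares linear trend
    print(f"{name}: {slope * 100:.2f} C per century")
```

Each successive slope comes out steeper as the cyclical component shrinks, which is the pattern the report says it found when it quantified the archived versions.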

---------

The press release and research report were covered in the Daily Caller today here. In the story, Michael Bastasch writes:

Sam Kazman, an attorney with the Competitive Enterprise Institute (CEI), said the study added an “important new piece of evidence to this debate” over whether to reopen the endangerment finding. CEI petitioned EPA to reopen the endangerment finding in February.

“I think this adds a very strong new element to it,” Kazman told TheDCNF. “It’s enough reason to open things formally and open public comment on the charges we make.”

Since President Donald Trump ordered EPA Administrator Scott Pruitt to review the Clean Power Plan, there’s been speculation the administration would reopen the endangerment finding to new scrutiny.

---------

Icecap Note: One of the reasons the temperature measures are flawed is the great uncertainties involved. See details here. See more detail here and an earlier very detailed working document here.

NCDC Climate Director Tom Karl, whose 1988 paper defined the UHI adjustment for the first version of USHCN, wrote with Kukla and Gavin in a 1986 paper on urban warming:
“... the urban growth inhomogeneity is serious and must be taken into account when assessing the reliability of temperature records.” Inexplicably, the UHI adjustment Karl argued for was removed in USHCNv2. Many of us believe the global warming depicted is largely urban warming, as urban heat is blended into the more representative rural station data through “homogenization”.

Recall that this was the third Research Report in the series. The first two research efforts (see link here) set out to test for the existence of a “Tropical Hot Spot” and the validity of EPA’s CO2 Endangerment Finding. Both dealt carefully and properly with econometric simultaneous-equation parameter estimation issues in the two separate structural analyses that were carried out. And both efforts involved the same three authors. Each analyzed the same Tropical, Contiguous U.S. and Global temperature data sets.

“The objective of this research was to determine whether or not a straightforward application of the ‘proper mathematical methods’ would support EPA’s basic claim that CO2 is a pollutant. Stated simply, their claim is that GAST is primarily a function of four explanatory variables: Atmospheric CO2 Levels (CO2), Solar Activity (SA), Volcanic Activity (VA), and a coupled ocean-atmosphere phenomenon called the El Nino-Southern Oscillation (ENSO).” This research failed to find that the steadily rising atmospheric CO2 concentrations have had a statistically significant impact on any of the 14 temperature data sets that were analyzed. The tropospheric and surface temperature measurements that were analyzed were taken by many different entities using balloons, satellites, buoys and various land-based techniques. Needless to say, if, regardless of data source, the analysis results are the same, the findings should be considered highly credible.
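The significance test being described can be sketched as a simple regression of temperature on the four explanatory variables, checking the p-value on the CO2 coefficient. The sketch below uses invented data and plain OLS; the report itself describes more involved simultaneous-equation econometric methods, so this is only the shape of the question, not the report's actual analysis.

```python
import numpy as np
import statsmodels.api as sm

# Sketch with invented data (NOT the report's 14 data sets), using plain OLS
# rather than the simultaneous-equation methods the report describes.
rng = np.random.default_rng(42)
n = 55  # annual observations, e.g. 1960-2014

co2  = 320 + 1.6 * np.arange(n) + rng.normal(0, 1, n)  # steadily rising CO2 (ppm)
sa   = rng.normal(0, 1, n)                             # solar activity index
va   = -np.abs(rng.normal(0, 0.3, n))                  # volcanic forcing (cooling)
enso = rng.normal(0, 1, n)                             # ENSO index

# Invented "temperature" series driven here only by ENSO and volcanoes.
temp = 0.10 * enso + 0.20 * va + rng.normal(0, 0.10, n)

X = sm.add_constant(np.column_stack([co2, sa, va, enso]))
fit = sm.OLS(temp, X).fit()
for name, p in zip(["const", "CO2", "SA", "VA", "ENSO"], fit.pvalues):
    print(f"{name:5s} p-value = {p:.3f}")
```

Because CO2 plays no role in generating the invented series, its coefficient should come out statistically insignificant here; with real data the answer turns entirely on the specification and the data sets used, which is where the econometric argument lies.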

The bottom line is that the failure of the real-world data to support EPA’s three lines of evidence in the Endangerment Finding invalidates it, along with all regulations imposed on the basis of it.

image
http://dailycaller.com/2017/07/05/exclusive-study-finds-temperature-adjustments-account-for-nearly-all-of-the-warming-in-climate-data/

Here is the actual global data that gets incorporated into the models, which are run 4 times daily. There are no adjustments; the data is based on 6-hourly forecasts updated with new observations. We are coming off the El Nino warm period.

image
Enlarged

See how, since 2005, warm spikes have occurred with El Ninos and dips with La Ninas.

image
Enlarged

See the video post from WeatherBELL this Sunday showing that summer heat has been declining for decades!

See this post that shows how GISS has been manipulated.

Posted on 08/01 at 01:54 AM
(1) TrackbacksPermalink


Monday, July 31, 2017
NOAA : Third Fakest June On Record

By Tony Heller, the Deplorable Climate Science Blog

image
Enlarged

This included record heat in South Sudan and the Central African Republic and a very hot Eurasia.

image
Global Climate Report - June 2017 | State of the Climate | National Centers for Environmental Information (NCEI)
Enlarged

Never mind that they don’t actually have any thermometer readings in South Sudan or the Central African Republic (“Gray areas represent missing data”). And never mind that their actual thermometer readings in Eurasia showed a significant percentage of below-normal temperatures.

image
Enlarged

And never mind that satellites actually measured temperatures in South Sudan and the Central African Republic and found that they were just about average.

image
RSS / MSU Data Images / Monthly
Enlarged

image
UAH MSU June Anomaly
Enlarged
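The gray-areas complaint comes down to how a global average is computed when grid cells have no observations. Below is a minimal sketch of an area-weighted mean on a plain 5-degree grid with some cells missing; this is not NOAA's actual gridding or infilling code, just the arithmetic at issue.

```python
import numpy as np

# Minimal sketch (assumption: a plain 5-degree grid; NOT NOAA's actual
# gridding or infilling code). Computes a global mean over observed cells.
rng = np.random.default_rng(1)
lats = np.arange(-87.5, 90, 5.0)
lons = np.arange(2.5, 360, 5.0)
anom = rng.normal(0, 1, (lats.size, lons.size))  # fake anomalies, deg C

anom[16:20, 30:40] = np.nan  # pretend a tropical band has no stations

weights = np.cos(np.deg2rad(lats))[:, None] * np.ones(lons.size)
observed = ~np.isnan(anom)

mean_obs = np.nansum(np.where(observed, anom, 0) * weights) / np.sum(weights * observed)
print(f"area-weighted mean over observed cells only: {mean_obs:+.3f} C")
```

Whether missing cells are simply skipped, as here, or infilled from distant neighbors can move the headline number, and the infilling choice is exactly what is being objected to.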

NOAA still believes they are tasked with keeping the global warming scam alive, so they make up fake data to bump temperatures up. Then they get Zeke to tell the press that they actually adjust temperatures down.

It is time to drain the climate swamp.

See also Adjusting Measurements to Match the Models Part 1: Surface Air Temperatures by Roger Andrews

Posted on 07/31 at 06:09 AM
(1) TrackbacksPermalink


Sunday, July 02, 2017
The Santer Clause

Guest Post by John McLean

When the IPCC’s in a hole and doesn’t have a paper to cite, who’s it gonna call?

(All together) BEN SANTER!

image

Santer, Wigley and others, including several IPCC authors, fixed it for the 1995 report with a “miracle” last-minute paper that claimed to have solid evidence of the human influence on climate. The paper had been submitted but had not even reached the review stage when it was included in the IPCC report. At the instigation of the IPCC Working Group I head, John Houghton, the whole pivotal chapter was revised to accommodate it. And all this happened after the second expert review but before government representatives got together to decide what should be said.

About 18 months later the paper was finally published, citing the IPCC report that cited it, and was laughed off the stage. Never mind; it had served its purpose of manipulating opinion about manmade warming and of convincing the newly formed UNFCCC that it didn’t need its own subsidiary organization to fiddle science in support of the UNFCCC’s claims; the IPCC was perfectly capable of doing that.

Roll forward about 20 years. The IPCC’s 2013 report showed (Text Box 9.2) that climate models were rubbish at predicting average global temperatures, with 111 of 114 climate model runs predicting greater warming for 1998 to 2012 than the HadCRUT4 temperature data indicated - a trend that was in fact statistically indistinguishable from zero.
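“Statistically indistinguishable from zero” has a concrete meaning: the confidence interval around the fitted trend includes zero. The sketch below shows the test on an invented flat-plus-noise series (not HadCRUT4 itself), and it ignores autocorrelation, which widens the real uncertainty further.

```python
import numpy as np
from scipy import stats

# Sketch with invented numbers (NOT HadCRUT4): is a 1998-2012 trend
# statistically distinguishable from zero? Autocorrelation is ignored.
years = np.arange(1998, 2013)
temps = 0.40 + np.random.default_rng(7).normal(0, 0.09, years.size)  # flat + noise

res = stats.linregress(years, temps)
tcrit = stats.t.ppf(0.975, years.size - 2)  # two-sided 95% critical value
ci = tcrit * res.stderr
print(f"trend = {res.slope*10:+.3f} +/- {ci*10:.3f} C/decade (95% CI)")
print("indistinguishable from zero" if abs(res.slope) < ci else "nonzero trend")
```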

What 5AR didn’t make clear is that climate models are run both with and without greenhouse gases, and the IPCC attributes the difference between the two sets of output to manmade warming. (It’s a completely specious argument unless it can be proven that climate models are 100% accurate when it comes to algorithmically including every climate forcing, which of course they are not. The comparison in fact shows nothing more than the sensitivity of the models to the inclusion of greenhouse gases.)

With climate models poor at making predictions, it also follows that they are poor at estimating the influence of greenhouse gases on climate. If the public becomes aware of this, the ground is cut from beneath the UNFCCC’s claims, which means the Paris Climate Agreement will be seen as the farce it really is, and all that rearrangement of the global economy to suit UN socialists won’t take place.

There is simply no way that IPCC 6AR can be allowed to continue to cast doubt on climate models, because doing so might mean the end of both the IPCC and the UNFCCC, not to mention the incomes and reputations of so-called climate science experts taking a sharp nose-dive.

So who’s the IPCC gonna call? Ben Santer!

This time around the paper has been published so that it complies with rules set down after the 1995 fiasco and can be cited. Being published of course doesn’t mean that it’s any good.

One of its key sentences is: “None of our findings call into question the reality of long-term warming of Earth’s troposphere and surface, or cast doubt on prevailing estimates of the amount of warming we can expect from future increases in (greenhouse gas) concentrations.”

I’m going to call this the Santer Clause, because the last half of it is about as real as Santa Claus.

Even the first half is interesting, because anyone can shift the goalposts and start the trend in whatever year supports their argument. Select the year carefully and you’ll find that temperatures have risen since then; select another year and they’re flat; select another and temperatures have fallen.
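That start-year sensitivity is easy to demonstrate. The series below is invented (not a real temperature record): flat on average with one large 1998 El Nino-style spike, so the sign of the fitted trend depends on where you start.

```python
import numpy as np

# Invented series (NOT a real temperature record): flat on average, with a
# large 1998 El Nino-style spike, to show how the chosen start year flips
# the sign of the fitted trend.
rng = np.random.default_rng(3)
years = np.arange(1990, 2013)
temps = 0.40 + 0.25 * (years == 1998) + rng.normal(0, 0.03, years.size)

for start in (1990, 1998, 1999, 2005):
    sel = years >= start
    slope = np.polyfit(years[sel], temps[sel], 1)[0]
    print(f"trend starting {start}: {slope * 10:+.3f} C/decade")
```

The exact numbers depend on the noise, but the pattern is structural: start the fit on the spike and you get apparent cooling; start a year later and the trend is essentially flat.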

The other important sentence in the Santer et al. paper is: “We conclude that model overestimation of tropospheric warming in the early twenty-first century is partly due to systematic deficiencies in some of the post-2000 external forcings used in the model simulations.” So it’s not the climate models that are wrong; it’s the data put into them - in other words, the weather.

Talk about climate denial.

There’s no concession that a more plausible explanation is that climate models are nonsense, as IPCC 5AR showed, and that for the 1980s and 1990s the output of the models looked approximately correct because greenhouse gases were exaggerated while the real drivers of climate, natural forces and internal variability, were underplayed.

El Nino events have become less frequent since the late 1990s, and their dominance over La Nina events has weakened, meaning that warming and cooling episodes are tending to balance and temperature trends remain flat.

The gap between what the models predict and what the data shows would be smaller if the algorithms in the models were corrected. Of course that’s unlikely to happen, because the whole notion of significant manmade warming would implode and the IPCC and UNFCCC would disappear. The IPCC will now cite this Santer fantasy to try to ensure that doesn’t happen.

It’s a sobering thought that if the implosion doesn’t happen now, and the disconnect between belief and reality continues to increase, then it’s probably only a matter of time before countries start fudging temperature data to make it show warming that isn’t happening. They have millions or even billions of dollars at stake if the myth collapses, and surely that’s too big a carrot to give up without a fight.

When the reputation of climate science ends up in the gutter as a result of all the nonsense let’s just hope it’s not Ben Santer who’s called to fix it.

See also John’s IPCC Review “Prejudiced authors, Prejudiced findings” here.

Posted on 07/02 at 04:17 AM
(2) TrackbacksPermalink

