Cold Storage
Jul 21, 2014
Joe Bastardi: Media Just ‘Want to Be Popular’ on Climate Change

Joe Bastardi

Climate alarmists sometimes like to claim skeptical scientists don’t exist, but they do, and one meteorologist had a lot to say on the subject.

In an interview with the MRC’s Business and Media Institute, well-known meteorologist Joe Bastardi dissected and criticized major aspects of the climate change alarmism movement. Drawing on his knowledge of weather and climate history, Bastardi said that “extreme weather” events the media talk about so much are commonplace and the result of normal variability. He also attacked basic arguments about CO2, scientific consensus and alarmist media bias.

Bastardi contended that climate alarmism is “ludicrous” and “not about science.” He didn’t mince words, blasting climate alarmists as living in a “loony world.” He also criticized news media that “simply follow what the majority thinks because they want to be popular” instead of understanding the issue.

After many years working with AccuWeather, Bastardi is now a weather forecaster for Weatherbell Analytics LLC, where he is tasked with making accurate and objective forecasts for private sector clients. He has appeared on many news broadcasts, including Fox News, CBS and ABC.

Bastardi specializes in understanding the history of weather and climate and said he used this understanding to accurately predict both Hurricane Sandy and Hurricane Arthur ahead of official warnings. In contrast, he said alarmists “don’t know what happened yesterday” or “know anything about what happened in the past.”

As Bastardi pointed out, he has to make accurate predictions in order to be paid and has no political motivation.

Bastardi asserted that alarmist scientists cannot actually be objective, because they are given grants explicitly to study global warming. He called upon alarmists to answer just one question. “What would it take for you to change? What do you need to see?” If this question cannot be answered, he argued, then there is nothing empirical behind their arguments that man is causing catastrophic climate change.

Addressing the science, Bastardi noted that the entire world adds only 1.8 ppm of CO2 to the atmosphere in a year, and he called the notion that this minuscule amount causes extreme weather “ludicrous.” Instead, he said, alarmists are just “taking every weather event they can find” and connecting it to climate change. He urged people to ask themselves whether they believe that amount of carbon dioxide could have the impact climate alarmists claim it does.

He also criticized the so-called consensus of alarm among climate scientists, saying “it is easy to have consensus when someone will pay you to have one.” Bastardi was very firm in the belief that the climate movement was “not about science” but had become a movement “from people who make profits” off the hysteria.

In his view, the alarmists’ goal was the “restructuring of the way our society works,” and the consequences if their economic recommendations are adopted would be “suicide for our way of life.” Furthermore, he pointed out that the United States alone cannot make an impact on global emissions, as it releases only 10 percent of the world’s carbon dioxide. He joked, “What are we going to do? Go to war with the rest of the globe [and force them to cut emissions]?”

Jul 14, 2014
Confessions of a Computer Modeler

By Robert J. Caprara, July 8, 2014, 7:15 p.m., The Wall Street Journal


Any model, including those predicting climate doom, can be tweaked to yield a desired result. I should know.

The climate debate is heating up again as business leaders, politicians and academics bombard us with the results of computer models that predict costly and dramatic changes in the years ahead. I can offer some insight into the use of computer models for public-policy debates, and a recommendation for the general public.

After earning a master’s degree in environmental engineering in 1982, I spent most of the next 10 years building large-scale environmental computer models. My first job was as a consultant to the Environmental Protection Agency. I was hired to build a model to assess the impact of its Construction Grants Program, a nationwide effort in the 1970s and 1980s to upgrade sewer-treatment plants.

The computer model was huge - it analyzed every river, sewer treatment plant and drinking-water intake (the places in rivers where municipalities draw their water) in the country. I’ll spare you the details, but the model showed huge gains from the program as water quality improved dramatically. By the late 1980s, however, any gains from upgrading sewer treatments would be offset by the additional pollution load coming from people who moved from on-site septic tanks to public sewers, which dump the waste into rivers. Basically the model said we had hit the point of diminishing returns.

When I presented the results to the EPA official in charge, he said that I should go back and “sharpen my pencil.” I did. I reviewed assumptions, tweaked coefficients and recalibrated data. But when I reran everything the numbers didn’t change much. At our next meeting he told me to run the numbers again.

After three iterations I finally blurted out, “What number are you looking for?” He didn’t miss a beat: He told me that he needed to show $2 billion of benefits to get the program renewed. I finally turned enough knobs to get the answer he wanted, and everyone was happy.

Was the EPA official asking me to lie? I have to give him the benefit of the doubt and assume he believed in the value of continuing the program. (Congress ended the grants in 1990.) He certainly didn’t give any indications otherwise. I also assume he understood the inherent inaccuracies of these types of models. There are no exact values for the coefficients in models such as these. There are only ranges of potential values. By moving a bunch of these parameters to one side or the other you can usually get very different results, often (surprise) in line with your initial beliefs.
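Caprara’s point about coefficient ranges can be made concrete with a toy sketch. This is not his EPA model; the formula and every number in it are invented for illustration. The idea is just that when each coefficient is only known as a defensible range, evaluating the model at different ends of those ranges yields very different “benefits.”

```python
# Toy illustration of the coefficient-range argument: a hypothetical
# "benefits" model whose inputs are only known as plausible ranges.
# All numbers are invented; this is not the actual EPA model.

def program_benefits(plants_upgraded, benefit_per_plant, decay_factor):
    """Total program benefit in dollars, discounted by a decay factor."""
    return plants_upgraded * benefit_per_plant * decay_factor

# Each coefficient has a defensible low and a defensible high value.
low  = program_benefits(plants_upgraded=5000, benefit_per_plant=150_000, decay_factor=0.6)
high = program_benefits(plants_upgraded=5000, benefit_per_plant=400_000, decay_factor=0.95)

print(f"low estimate:  ${low:,.0f}")   # $450,000,000
print(f"high estimate: ${high:,.0f}")  # $1,900,000,000
```

Both runs use values an analyst could justify, yet the answers differ by more than a factor of four, which is exactly how one “turns knobs” toward a target number.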

I realized that my work for the EPA wasn’t that of a scientist, at least in the popular imagination of what a scientist does. It was more like that of a lawyer. My job, as a modeler, was to build the best case for my client’s position. The opposition will build its best case for the counter argument and ultimately the truth should prevail.

If opponents don’t like what I did with the coefficients, then they should challenge them. And during my decade as an environmental consultant, I was often hired to do just that to someone else’s model. But there is no denying that anyone who makes a living building computer models likely does so for the cause of advocacy, not the search for truth.

Surely the scientific community wouldn’t succumb to these pressures like us money-grabbing consultants. Aren’t they laboring for knowledge instead of profit? If you believe that, boy do I have a computer model to sell you.

The academic community competes for grants, tenure and recognition; consultants compete for clients. And you should understand that the lines between academia and consultancy are very blurry as many professors moonlight as consultants, authors, talking heads, etc.

Let’s be clear: I am not saying this is a bad thing. The legal system is adversarial and for the most part functions well. The same is true for science. So here is my advice: Those who are convinced that humans are drastically changing the climate for the worse and those who aren’t should accept and welcome a vibrant, robust back-and-forth. Let each side make its best case and trust that the truth will emerge.

Those who do believe that humans are driving climate change retort that the science is “settled” and those who don’t agree are “deniers” and “flat-earthers.” Even the president mocks anyone who disagrees. But I have been doing this for a long time, and the one thing I have learned is how hard it is to convince people with a computer model. The vast majority of your audience will never, ever understand the math behind it. This does not mean people are dumb. They usually have great BS detectors, and when they see one side of a debate trying to shut down the other side, they will most likely assume it has something to hide, has the weaker argument, or both.

Eventually I got out of the environmental consulting business. In the 1990s I went into a completely different industry, one that was also data intensive and I thought couldn’t be nearly as controversial: health care. But that’s another story.

Mr. Caprara is chief methodologist for PSKW LLC, which provides marketing programs for pharmaceutical firms.

Alan Carlin, formerly of the EPA, was given a whistleblower award for his challenge to the EPA Endangerment Finding. See his award after Pat Michaels’ keynote address.




Jul 11, 2014
Data Set Changes Make It Hard To Tell Real Story

By Joseph D’Aleo, CCM

In the story below in Icecap in the News on “Welcome back to the 1950s and soon the 1960s and 1970s and then 1800” I showed how temperatures correlate well with natural cycles. 

At Weatherbell, Joe Bastardi and I use data sets in our work to correlate with the many natural cycles as measured in teleconnections. We generally use the reanalysis data sets from NCEP and the climate division data from NOAA.

image

The most reliable data sets are likely the satellite sets because of very nearly global coverage and the lack of contamination from local heat sources. The average of the two satellite sets (RSS and UAH) shows a step change with the super El Nino and little change post 1998.
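The satellite average referred to here is simply the arithmetic mean of the two anomaly series, month by month. A minimal sketch of that calculation follows; the anomaly values are made-up placeholders, not actual RSS or UAH data.

```python
# Averaging two lower-troposphere anomaly series, month by month.
# The values below are invented placeholders, not real RSS/UAH data.
rss = [0.12, 0.20, 0.35, 0.18]
uah = [0.10, 0.16, 0.31, 0.14]

satellite_avg = [round((r + u) / 2, 3) for r, u in zip(rss, uah)]
print(satellite_avg)  # [0.11, 0.18, 0.33, 0.16]
```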

image

But the NOAA land/ocean temperature data sets keep changing. Ryan reassures us the reanalysis data set, which is based on hourly observations, is not adjusted. Climate division data, though, is a different story. So when we do a composite for a set of years after June, we will get a different result. There are three major surface data centers: NOAA in Asheville, NASA Goddard in New York City, and Hadley in the UK.

Here I’ll focus on NOAA. NOAA is the source of the base data that NASA and Hadley use before adding their own adjustments and additional data (NASA adds Antarctic and extrapolated Arctic data; Hadley uses its own ocean data).

The first USHCN 1221 station data set was released in 1990 with a UHI adjustment. It was generally regarded as the world’s best because of the long history for most stations and the stability of the network.

image

In the 1999 USHCN plot, the 1930s was the warmest decade and 1934 the warmest year (1.1F warmer than the super El Nino of 1997/98). Hansen showed as much in this plot of the NOAA data set on the GISS web site. The peak-to-peak trend of the 5-year mean was down.

I praised NOAA NCDC for its efforts to get it right, although a decade later we discovered that siting in many cases increasingly did not meet standards, thanks to a move to ASOS with sensors near tarmac, and to changing technology that required cabling the sensors to the weather office, often putting them near buildings, driveways, parking lots, air conditioning exhausts or other heat sources. Numerous sites were placed at heat-generating waste treatment plants. Even the GAO scolded NOAA because over 40% of its stations did not meet the minimum siting criteria in NOAA’s own specifications. The UHI adjustment offset some of these biases.

Most of the original correlation studies I did used this data set. But the inconsistency of its warming relative to the global data set (GHCN), which showed much more warming late in the century, put political pressure on NCDC to make the two data sets more consistent. Instead of trying to find the metadata to add UHI adjustments to the GHCN, the changes were made to the US data set.

They removed the UHI adjustment, replacing it with algorithms that detect previously undocumented inhomogeneities (such as station moves), a revised time-of-observation adjustment, and a new final step of homogenization (blending data among stations). Two versions later, the plot shows the changes made to the annual values in degrees F. The warming from 1930 to 1990 increased 0.8F due to the adjustments alone: the adjustments cooled the 1920s to 1950s and warmed the record after the 1960s.
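The net effect of cooling the early record and warming the late record is to steepen the fitted linear trend. A toy least-squares comparison makes this visible; the annual anomalies and adjustment values below are invented for illustration and are not the actual USHCN adjustments.

```python
# Toy illustration: cooling the early record and warming the late record
# steepens a least-squares trend. All values are hypothetical (deg F).

def linear_trend(years, values):
    """Ordinary least-squares slope, in degrees per year."""
    n = len(years)
    mean_x = sum(years) / n
    mean_y = sum(values) / n
    num = sum((x - mean_x) * (y - mean_y) for x, y in zip(years, values))
    den = sum((x - mean_x) ** 2 for x in years)
    return num / den

years      = list(range(1930, 1991, 10))              # 1930, 1940, ..., 1990
raw        = [0.5, 0.3, 0.1, 0.0, 0.2, 0.3, 0.4]
adjustment = [-0.4, -0.3, -0.2, 0.0, 0.1, 0.2, 0.4]   # cool early, warm late
adjusted   = [r + a for r, a in zip(raw, adjustment)]

print(f"raw trend:      {linear_trend(years, raw) * 10:+.3f} F/decade")
print(f"adjusted trend: {linear_trend(years, adjusted) * 10:+.3f} F/decade")
```

In this toy case the adjustment series adds 0.8F between 1930 and 1990, turning a slightly negative raw trend into a clearly positive one, which is the shape of the change described above.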

image

UHI mainly affects minimum temperatures, so maximum temperatures are a better indicator of the true trends. Even the bias-corrected maxima show the recent peak no higher than in the 1930s.

image

image

The minima (and in the end the mean) clearly show the recent warmth, which is consistent with UHI.

image

The global data set, meanwhile, has also undergone changes since V1; V3 and V1 are plotted here. The reasons include changing station sets and application of the same kinds of adjustments.

image

The difference is even more remarkable than for the US (1.2F or so). In other words, most of the apparent warming in the data set since the late 1800s is in the adjustments. Note this plot shows the CHANGE made to the global mean temperature, NOT the global mean temperature itself.

image

The Latest Change - US state data.

Locally we used the NCDC USHCN Climate at a Glance in talks. I gave one such invited talk in Maine to a local group a few years ago.

Here was the Maine plot of annual temperatures as of the last time I downloaded it. It showed no trend since 1895. The warmest year was 1913, with a bookend spike a century later in 2010. Both followed high-latitude volcanism: 1913 followed the 1912 Novarupta eruption in Alaska, and 2010 followed a series of eruptions in Alaska and Iceland. High-latitude volcanoes produce more high-latitude blocking (Oman 2004), especially in years of very low solar geomagnetic activity. These blocks bring a maritime flow into southeast Canada and Maine, making the winters and springs less snowy and cold.

image

This spring, NCDC announced a new version of the US data set, replacing the USHCN used at Climate at a Glance with a new GHCN-based climate division alternative. Maine suddenly has a 0.23F/decade warming trend, and 1913 is now shown 5F colder and not even close to 2010.

image

We can still access the original raw data in places. The raw rural data for Farmington, Maine, near one of the big ski areas, is very instructive: it shows a nice sine wave in sync with the 60-70 year PDO/AMO cycles, looking more like USHCN v1 and TMAX. The station was discontinued after 2006. I’ll let you speculate why.

image

Concord, NH is an urban area, but the airport is a well-known cold spot compared to Manchester, which is often the warmest station in the state. The sensors must be well placed.

I did a winter plot of Concord temperatures since 1868/69 and found little change over the entire period of record - again, this is raw archived data downloaded from the PWM web site.

image

The record highs and lows are also not adjusted and look more like USHCN v1, with 23 record highs set in the 1930s, 38 before 1960, and more record cold than record highs since the 1940s.

image

The number of 90F readings at all USHCN stations (raw, unadjusted) shows a similar downtrend.

image

Keep these in mind when you hear NOAA give a ranking for a month or year, or hear speeches about why POTUS, through the EPA, is taking strong action.

Jul 06, 2014
The Daydream and the Nightmare

Peggy Noonan

Obama isn’t doing his job. He’s waiting for history to recognize his greatness.

I don’t know if we sufficiently understand how weird and strange, how historically unparalleled, this presidency has become. We’ve got a sitting president who was just judged in a major poll to be the worst since World War II. The worst president in 70 years! Quinnipiac University’s respondents also said, by 54% to 44%, that the Obama administration is not competent to run the government. A Zogby Analytics survey asked if respondents are proud or ashamed of the president. Those under 50 were proud, while those over 50, who have of course the longest experienced sense of American history, were ashamed.

We all know the reasons behind the numbers. The scandals that suggest poor stewardship and, in the case of the IRS, destructive political mischief. The president’s signature legislation, which popularly bears his name and contains within it the heart of his political meaning, continues to wreak havoc in marketplaces and to be unpopular with the public. He is incapable of working with Congress, the worst at this crucial aspect of the job since Jimmy Carter, though Mr. Carter at least could work with the Mideast and produced the Camp David Accords. Mr. Obama has no regard for Republicans and doesn’t like to be with Democrats. Internationally, small states that have traditionally been the locus of trouble (the Mideast) are producing more of it, while large states that have been more stable in their actions (Russia, China) are newly, starkly aggressive.

That’s a long way of saying nothing’s working. Which I’m sure you’ve noticed.

But I’m not sure people are noticing the sheer strangeness of how the president is responding to the lack of success around him. He once seemed a serious man. He wrote books (ghost writer - Bill Ayers), lectured on the Constitution. Now he seems unserious, frivolous, shallow. He hangs with celebrities, plays golf. His references to Congress are merely sarcastic: “So sue me.” “They don’t do anything except block me and call me names. It can’t be that much fun.”

In a truly stunning piece in early June, Politico’s Carrie Budoff Brown and Jennifer Epstein interviewed many around the president and reported a general feeling that events have left him - well, changed. He is “taking fuller advantage of the perquisites of office,” such as hosting “star-studded dinners that sometimes go on well past midnight.” He travels, leaving the White House more in the first half of 2014 than any other time of his presidency except his re-election year. He enjoys talking to athletes and celebrities, not grubby politicians, even members of his own party. He is above it all.

On his state trip to Italy in the spring, he asked to spend time with “interesting Italians.” They were wealthy, famous. The dinner went for four hours. The next morning his staff were briefing him for a “60 Minutes” interview about Ukraine and health care. “One aide paraphrased Obama’s response: ‘Just last night I was talking about life and art, big interesting things, and now we’re back to the minuscule things on politics.’”

Minuscule? Politics is his job.

When the crisis in Ukraine escalated in March, White House aides wondered if Mr. Obama should cancel a planned weekend golf getaway in Florida. He went. At the “lush Ocean Reef Club,” he reportedly told his dinner companions: “I needed this. I needed the golf. I needed to laugh. I needed to spend time with friends.”

You get the impression his needs are pretty important in his hierarchy of concerns.  Read more at the Wall Street Journal.

Jul 02, 2014
The data games - the transition from real data to model/data hybrids

By Joseph D’Aleo, CCM

You probably have been following the saga about USHCN data fabrication/estimation at WattsUpWiththat summarized here and illuminated by Judith Curry.

Anthony finds himself agreeing with Steve Goddard, aka Tony Heller, who deserves the credit for the initial findings, including:

image

Anthony continues: “Paul Homewood deserves the credit for taking the finding and establishing it in a more comprehensible way that opened closed eyes, including mine, in this post entitled Massive Temperature Adjustments At Luling, Texas

image

Along with that is his latest follow up, showing the problem isn’t limited to Texas, but also in Kansas. “

It appears in summary:

Approximately 40% of the data has been estimated, even though a lot of good data is in hand. The data isn’t making the migration from the RAW to the FINAL USHCN file due to some error in the data flag.

Also, there’s the issue of “zombie weather stations.” Closed stations like Marysville, CA (which closed due to my exposé in 2007) are still reporting data in the FINAL USHCN file because the FILNET program is “infilling” them with estimated data based on surrounding stations.

image

Since 80% of the network has compromised siting, the data used to infill is compromised.
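Infilling of this kind is conceptually a weighted average of neighboring stations. The sketch below uses simple inverse-distance weighting to show the general idea; FILNET’s actual procedure is more elaborate, and the distances and temperatures here are hypothetical.

```python
# Simplified idea behind "infilling" a closed station from its neighbors:
# an inverse-distance weighted average. FILNET's real method is more
# elaborate; the distances and temperatures below are hypothetical.

def infill(neighbors):
    """neighbors: list of (distance_km, temperature) pairs for nearby stations."""
    weights = [1.0 / d for d, _ in neighbors]                    # closer = heavier
    total = sum(w * t for w, (_, t) in zip(weights, neighbors))
    return total / sum(weights)

# Three hypothetical nearby stations reporting for a closed site.
estimate = infill([(10.0, 58.0), (20.0, 60.0), (40.0, 62.0)])
print(f"infilled estimate: {estimate:.1f} F")  # infilled estimate: 59.1 F
```

The mechanism also shows why compromised siting matters: any warm bias in the neighboring stations flows directly into the estimate for the closed one.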

It’s a real mess.

Anthony adds: “So far it’s just the USA for this error; we don’t know about GHCN yet.”

ICECAP NOTE: I have posted that I downloaded the Maine state annual temperatures from NCDC Climate at a Glance in 2013 for a talk, and they showed no warming since 1895 (the trend was shown as -0.03F/decade). After NOAA announced the transition to the CLIMDIV version of USHCN at the end of this brutal winter, I decided to download the new plot. The new CLIMDIV data was supposed to resolve issues with recent station moves, transitions to airports and new MMTS technology, and UHI and siting problems (improvements late in the record), yet we were very surprised to see the biggest changes in the early part of the record. 1913 went from the warmest year in the record to the middle of the pack, a cooling of close to 5F. The long-term average dropped over 1F. The long-term trend rose to +0.23F/decade, the largest of the 50 states.

image

image

Basing government energy and tax policy on corrupted data ensures nothing but pain for only government gain.

Please help continue our work which has been almost entirely pro bono. Use the donate button in the left column. Any amount is appreciated.

image
