Frozen in Time
Jul 29, 2014
Introducing the Open Atmospheric Society


At the Heartland Institute ICCC9, Anthony Watts announced the launch of a new society that has been in the works for many months.

Welcome to The Open Atmospheric Society, known as “The OAS”.

We give you a voice where other societies may not.

The OAS is an international membership society for the purpose of studying, discussing, and publishing about topics in atmospheric and related earth sciences, including but not limited to meteorology, hydrology, oceanography, and climatology. It is open to anyone with an interest at the associate level, but student and full memberships are also offered.

The purpose of the society is to foster quality atmospheric science and atmospheric science communications through outreach, member education, member publishing, and electronic media. The differences are highlighted on the web site home page.

Our motto, verum in luce, means “truth in the light”.

Open science - a transparent online peer review process. Publishing peer reviewer comments (not names) will illuminate the process.

Open membership - Associate members (anyone who has an interest in atmospheric science) can join at a basic rate, providing interdisciplinary membership.

Professional full membership will require a degree in atmospheric sciences or related earth disciplines, or three published papers in these subjects.

Student members get a reduced rate, similar to associate members, with the option of elevation to full membership.

Open journal - The Journal of the OAS will be free to read by the public.

Author account - each author and co-author will have accounts for collaboration, submitting papers, making edits, and responding to reviewers.

Strict submission requirements that no other journal asks for upfront: technical submissions to the Journal by members must include all source data, software/code, procedures, and documentation to ensure that the paper’s experiment or analysis can be reproduced by external reviewers.

Emphasis on reasonable publication turnaround, generally three months or less.

Press releases will be sent with each publication; author assistance is offered in their preparation.

Video production assistance for authors to explain papers and post to the journal page with your paper.

Organizational activity will be conducted entirely online - this means no costly brick and mortar infrastructure, no costly postal mailing of journals, and no need for warehousing paper files and publications.

Online meetings conducted via Skype for organizational purposes.

Nomination/Voting for officers and other issues conducted online.

Monthly email newsletters and special online webcasts.

To learn what The OAS is all about, go to the main page:

To join, please go to The OAS Membership Portal

The initial setup of The OAS was made possible by a grant from Stephen and Dr. Mary Graves. We thank them for their foresight and generosity. 

Jul 24, 2014
Get Ready for the New England Power Shortage

William Tucker,

Governors are already meeting in emergency session.

In 1980, under the first administration of Governor Jerry Brown, California decided it wasn’t going to build any more power plants but would follow Amory Lovins’ “soft path,” opting instead for conservation and renewable energy. By 2000, with the new digital economy sucking up electricity, a drought in the Pacific Northwest cut hydropower output and the state found itself facing the Great California Electrical Shortage.

You know what happened next. For weeks the Golden State struggled to find enough electricity to power its traffic lights. Brownouts and blackouts cascaded across the state while businesses fired up smoke-belching diesel generators to keep the lights on. Governor Gray Davis finally got booted out of office but the state didn’t rescue itself until it threw up 12,000 megawatts of new natural gas plants.

At that point California officials decided that the whole thing had been engineered by Enron and other out-of-state merchant providers, and the charges and lawsuits flew. No Democrat ever learned a lesson. The state is now 60 percent dependent on natural gas for its electricity - twice the national average - and its electric bills are almost twice those of surrounding states. Industry is headed for the door.

So how have California’s liberal counterparts on the East Coast managed to avoid the same fate? You’d think a region that could produce Elizabeth Warren and Bernie Sanders plus legions of college students trained to hate fossil fuels would have no trouble pursuing the same green dreams. Well, it’s about to happen. In the next few years New England will be facing a full-scale power shortage.

Last week the governors of the six New England states met in an emergency session at Bretton Woods, New Hampshire, to discuss what to do about the pending crisis. Significantly, they asked the premiers of five of Canada’s provinces to attend. That makes sense because if the region is going to get electricity from anywhere it is probably going to be from north of the border.

In a hell-bent campaign to rid itself of any form of dirty, messy “non-renewable” energy, New England has been closing down coal and oil plants for the last decade. In 2000, 18 percent of New England’s electricity came from coal and 22 percent from oil. Today it’s 3 percent coal and 1 percent oil. Meanwhile, natural gas - the fuel that everybody loves until you have to drill for it - has risen from 15 percent to a starkly vulnerable 52 percent, just behind California.

There’s only one problem. New England doesn’t have the pipelines to bring in the gas. Nor is anyone going to be allowed to build them. Connecticut and Massachusetts are only a short distance from eastern Pennsylvania, where fracking for natural gas has leapfrogged the Keystone State into third place for overall energy production. Yet a proposal by Spectra Energy of Houston to expand its existing pipeline from Stony Point, New York, has already met fierce resistance from people who want nothing more to do with fossil fuels, and construction is highly unlikely.

It’s not as if it’s not needed. Last winter, when record low temperatures hit, there just wasn’t enough gas to go around. Utilities that service home heating have long-term contracts and get first dibs. You can’t stockpile gas the way you stockpile coal, so power plant operators were left bidding against each other for what was left. Prices skyrocketed from $4 per million BTU to an unbelievable $79 per million BTU, and electricity prices spiked to ten times their normal level. Just to put things in perspective, during the first four months of last winter, New England spent $5.1 billion on electricity. In the whole of 2012, it had spent only $5.2 billion.

And that’s just the beginning. New England is now limping along with 33,000 megawatts of electrical capacity, which barely meets its needs. At one auction last winter, the New England Independent Systems Operator, which manages the grid, came up 145 megawatts short - an almost unheard-of occurrence. Yet in the next two years the region will be closing down one-tenth of its capacity in a bid to rid itself of anything that does not win favor with environmentalists. First to go will be the last of four coal plants at Salem Harbor, which can no longer meet the EPA’s new regulatory requirements. Next Brayton Point, the largest remaining coal plant, will be retired for the same reason. Finally, a continual barrage of protests and legislative attacks has persuaded Mississippi-based Entergy to close the Vermont Yankee Nuclear Station and “let the Yankees freeze in the dark,” as they used to say in Texas and Louisiana. The reactor provided 75 percent of Vermont’s electricity and 4 percent of the power for the region, carbon-free.

“It’s going to be very tricky for New England over the next three to four years,” says Gordon van Welie, CEO of the Independent Systems Operator of New England, which runs the grid. Van Welie begged the region not to close down Vermont Yankee and Brayton Point, but who listens to anyone who understands electricity anymore? Interestingly, New England only got through last winter by regularly importing 1,400 megawatts from Indian Point, the two nuclear plants on the Hudson in neighboring New York. Says New Hampshire energy consultant William P. Short III, “Without Indian Point, New England would have been toast.” As you might expect, New York Governor Andrew Cuomo and most of the state’s Democratic politicians are trying to close down Indian Point as well.

Naturally, all this is falling hardest on people who hold blue-collar jobs. The Gorham Paper and Tissue Company in New Hampshire was forced to reduce production and lay off workers in the depths of last winter. The Great Northern Paper Co. in Maine laid off 200 workers and closed down for four months. In fact, if you want to know why we have “income inequality” and a “disappearing middle class,” look no further than the class warfare being waged on American industry by upper-educated elites snugly ensconced in the digital economy or sitting in Washington writing regulations telling everybody else what to do. “We’re going to have an economy that operates only nine months of the year,” complains Maine Governor Paul LePage, the only Republican in the region.

So where will New England be getting its electricity? Almost daily the newspapers are filled with stories about how the region is “going green” and about to enter the delightful world of “clean energy.” It’s sheer fantasy. No one has the slightest notion of what it would entail. You would have to cover half of the Green Mountains with windmills to recover the power lost at Vermont Yankee and even then it would only work when the wind is blowing. Almost as soon as the news came about the closing of Vermont Yankee, one company proposed building a power plant that burned wood in the wilds of western Massachusetts. However, someone soon discovered that burning wood produces smoke and carbon dioxide as well. It was quickly shouted down. It’s probably just as well. At one point Massachusetts drew up plans to harvest wood for electricity and discovered it would soon strip the state of its forests.

So the only “clean energy” left in New England these days is hydroelectricity - generated in Canada. The Canadians are indeed developing huge dams in James Bay and are eager to sell to Americans. But that means building transmission lines down from the north and everyone is opposed to that as well. Northeast Utilities, which services much of New England, has been trying to build a Northern Pass transmission corridor since 2009 but environmental groups insist the lines be buried underground. Two documentary films - a standard item these days - have already been made opposing the project. Meanwhile, environmentalists have become so ambitious and well funded that they have bought up land and property rights in northern New Hampshire just to block its path. Plans to bury just eight miles of the 187-mile route have ballooned costs from $200 million to $1.4 billion and the project is years from completion - if ever.

So what is likely to happen? Another cold winter is certain to bring skyrocketing prices and possible brownouts. New Englanders already pay 45 percent higher electric bills than the rest of the country, and that figure can only grow. The first region of the country to industrialize is about to drive away the last of its blue-collar workshops.

One thing the region will not run short of, however, is political bluster. When the New England States Committee on Electricity (NESCOE) tried to broker a deal to have ratepayers finance a collectively owned gas pipeline, the volunteer organization was lambasted for “conspiring with industry to produce profits” and “failing to consider all the renewable alternatives.” When the crisis finally arrives this winter or next, you can be sure Vermont’s socialist Senator Bernie Sanders will be at the head of the pack, braying that the whole thing has been caused by “speculators.” Or bad planning by power companies - never bad policies by politicians or the EPA, aka the Employment Prevention Agency.

Jul 21, 2014
Joe Bastardi: Media Just ‘Want to Be Popular’ on Climate Change

Joe Bastardi

Climate alarmists sometimes like to claim skeptical scientists don’t exist, but they do, and one meteorologist had a lot to say on the subject.

In an interview with the MRC’s Business and Media Institute, well-known meteorologist Joe Bastardi dissected and criticized major aspects of the climate change alarmism movement. Drawing on his knowledge of weather and climate history, Bastardi said that “extreme weather” events the media talk about so much are commonplace and the result of normal variability. He also attacked basic arguments about CO2, scientific consensus and alarmist media bias.

Bastardi contended that climate alarmism is “ludicrous” and “not about science.” He didn’t mince words, blasting climate alarmists as living in a “loony world.” He also criticized news media that “simply follow what the majority thinks because they want to be popular” instead of understanding the issue.

After many years working with AccuWeather, Bastardi is now a weather forecaster for Weatherbell Analytics LLC, where he is tasked with making accurate and objective forecasts for private sector clients. He has appeared on many news broadcasts, including Fox News, CBS and ABC.

Bastardi specializes in understanding the history of weather and climate and said he used this understanding to accurately predict both Hurricane Sandy and Hurricane Arthur ahead of official warnings. In contrast, he said alarmists “don’t know what happened yesterday” or “know anything about what happened in the past.”

As Bastardi pointed out, he has to make accurate predictions in order to be paid and has no political motivation.

Bastardi asserted that alarmist scientists cannot actually be objective, because they are given grants explicitly to study global warming. He called upon alarmists to answer just one question. “What would it take for you to change? What do you need to see?” If this question cannot be answered, he argued, then there is nothing empirical behind their arguments that man is causing catastrophic climate change.

Addressing the science, Bastardi noted that the entire world adds only 1.8 ppm of CO2 to the atmosphere in a year, and he called the notion that this minuscule amount causes extreme weather “ludicrous.” Instead, he said alarmists are just “taking every weather event they can find” and connecting it to climate change. He urged people to ask themselves if they believe that amount of carbon dioxide could have the impact that climate alarmists claim it does.

He also criticized the so-called consensus of alarm among climate scientists, saying “it is easy to have consensus when someone will pay you to have one.” Bastardi was very firm in the belief that the climate movement was “not about science” but had become a movement “from people who make profits” off the hysteria.

In his view, the alarmists’ goal was the “restructuring of the way our society works,” and the consequences if their economic recommendations are adopted are “suicide for our way of life.” Furthermore, he pointed out that the United States alone cannot make an impact on global emissions, as it releases only 10 percent of the world’s carbon dioxide. He joked, “What are we going to do? Go to war with the rest of the globe [and force them to cut emissions]?”

Jul 14, 2014
Confessions of a Computer Modeler

By Robert J. Caprara July 8, 2014 7:15 p.m.  THE WALL STREET JOURNAL


Any model, including those predicting climate doom, can be tweaked to yield a desired result. I should know.

The climate debate is heating up again as business leaders, politicians and academics bombard us with the results of computer models that predict costly and dramatic changes in the years ahead. I can offer some insight into the use of computer models for public-policy debates, and a recommendation for the general public.

After earning a master’s degree in environmental engineering in 1982, I spent most of the next 10 years building large-scale environmental computer models. My first job was as a consultant to the Environmental Protection Agency. I was hired to build a model to assess the impact of its Construction Grants Program, a nationwide effort in the 1970s and 1980s to upgrade sewer-treatment plants.

The computer model was huge - it analyzed every river, sewer treatment plant and drinking-water intake (the places in rivers where municipalities draw their water) in the country. I’ll spare you the details, but the model showed huge gains from the program as water quality improved dramatically. By the late 1980s, however, any gains from upgrading sewer treatments would be offset by the additional pollution load coming from people who moved from on-site septic tanks to public sewers, which dump the waste into rivers. Basically the model said we had hit the point of diminishing returns.

When I presented the results to the EPA official in charge, he said that I should go back and “sharpen my pencil.” I did. I reviewed assumptions, tweaked coefficients and recalibrated data. But when I reran everything the numbers didn’t change much. At our next meeting he told me to run the numbers again.

After three iterations I finally blurted out, “What number are you looking for?” He didn’t miss a beat: He told me that he needed to show $2 billion of benefits to get the program renewed. I finally turned enough knobs to get the answer he wanted, and everyone was happy.

Was the EPA official asking me to lie? I have to give him the benefit of the doubt and assume he believed in the value of continuing the program. (Congress ended the grants in 1990.) He certainly didn’t give any indications otherwise. I also assume he understood the inherent inaccuracies of these types of models. There are no exact values for the coefficients in models such as these. There are only ranges of potential values. By moving a bunch of these parameters to one side or the other you can usually get very different results, often (surprise) in line with your initial beliefs.
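The knob-turning described above can be sketched in a few lines. This is a hypothetical toy model, not the EPA’s: the coefficient names, ranges, and formula are all invented, purely to show how moving each parameter within its plausible range swings the bottom line.

```python
# Toy benefits model: linear in a few uncertain coefficients, each
# known only to within a range (values and names are invented).
COEFF_RANGES = {
    "uptake_rate":      (0.2, 0.6),   # fraction of plants upgraded
    "benefit_per_unit": (1.0, 4.0),   # $B of benefit per unit of uptake
    "offset_factor":    (0.1, 0.5),   # losses from the septic-to-sewer shift
}

def benefits(coeffs):
    """Toy model: benefits = uptake * value - offset losses, in $B."""
    return coeffs["uptake_rate"] * coeffs["benefit_per_unit"] * 10 \
           - coeffs["offset_factor"] * 5

# push every coefficient to one end of its range, then the other
low  = benefits({k: lo for k, (lo, hi) in COEFF_RANGES.items()})
high = benefits({k: hi for k, (lo, hi) in COEFF_RANGES.items()})
print(f"benefits range: ${low:.1f}B to ${high:.1f}B")
```

With just three coefficients, nudging each to a defensible extreme moves the answer from $1.5B to $21.5B - more than an order of magnitude, every point of it “in line with” some plausible set of inputs.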

I realized that my work for the EPA wasn’t that of a scientist, at least in the popular imagination of what a scientist does. It was more like that of a lawyer. My job, as a modeler, was to build the best case for my client’s position. The opposition will build its best case for the counter argument and ultimately the truth should prevail.

If opponents don’t like what I did with the coefficients, then they should challenge them. And during my decade as an environmental consultant, I was often hired to do just that to someone else’s model. But there is no denying that anyone who makes a living building computer models likely does so for the cause of advocacy, not the search for truth.

Surely the scientific community wouldn’t succumb to these pressures like us money-grabbing consultants. Aren’t they laboring for knowledge instead of profit? If you believe that, boy do I have a computer model to sell you.

The academic community competes for grants, tenure and recognition; consultants compete for clients. And you should understand that the lines between academia and consultancy are very blurry as many professors moonlight as consultants, authors, talking heads, etc.

Let’s be clear: I am not saying this is a bad thing. The legal system is adversarial and for the most part functions well. The same is true for science. So here is my advice: Those who are convinced that humans are drastically changing the climate for the worse and those who aren’t should accept and welcome a vibrant, robust back-and-forth. Let each side make its best case and trust that the truth will emerge.

Those who do believe that humans are driving climate change retort that the science is “settled” and those who don’t agree are “deniers” and “flat-earthers.” Even the president mocks anyone who disagrees. But I have been doing this for a long time, and the one thing I have learned is how hard it is to convince people with a computer model. The vast majority of your audience will never, ever understand the math behind it. This does not mean people are dumb. They usually have great BS detectors, and when they see one side of a debate trying to shut down the other side, they will most likely assume it has something to hide, has the weaker argument, or both.

Eventually I got out of the environmental consulting business. In the 1990s I went into a completely different industry, one that was also data intensive and I thought couldn’t be nearly as controversial: health care. But that’s another story.

Mr. Caprara is chief methodologist for PSKW LLC, which provides marketing programs for pharmaceutical firms.

Alan Carlin, formerly of the EPA, was given a whistleblower award for his challenge to the EPA Endangerment Finding. See his award, presented after Pat Michaels’ keynote address.


Jul 11, 2014
Data Set Changes Make It Hard To Tell the Real Story

By Joseph D’Aleo, CCM

In the story below in Icecap in the News, “Welcome back to the 1950s and soon the 1960s and 1970s and then 1800,” I showed how temperatures correlate well with natural cycles.

At Weatherbell, Joe Bastardi and I use data sets in our work to correlate with the many natural cycles as measured in teleconnections. We generally use the reanalysis data sets from NCEP and the climate division data from NOAA.


The most reliable data sets are likely the satellite sets because of very nearly global coverage and the lack of contamination from local heat sources. The average of the two satellite sets (RSS and UAH) shows a step change with the super El Nino and little change post 1998.
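The step-change claim is easy to check numerically: average the two satellite series year by year and compare the mean anomaly before and after the El Nino. A minimal sketch, with invented numbers standing in for the RSS and UAH anomalies:

```python
# Invented annual anomalies (stand-ins for RSS and UAH; not real data).
rss = {1995: -0.1, 1996: 0.0, 1997: 0.1, 1999: 0.25, 2000: 0.2, 2001: 0.3}
uah = {1995: -0.2, 1996: 0.0, 1997: 0.0, 1999: 0.15, 2000: 0.2, 2001: 0.2}

# average the two records into one series
avg = {y: (rss[y] + uah[y]) / 2 for y in rss}

# compare the mean anomaly before and after the 1997/98 super El Nino
pre  = [v for y, v in avg.items() if y < 1998]
post = [v for y, v in avg.items() if y > 1998]
step = sum(post) / len(post) - sum(pre) / len(pre)
print(f"step across 1997/98: {step:+.2f}C")
```

A flat-but-offset result like this (a jump in the mean with little trend on either side) is what distinguishes a step change from steady warming.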


But the NOAA land/ocean temperature data sets keep changing. Ryan reassures us the reanalysis data set, which is based on hourly observations, is not adjusted. Climate division data, though, is a different story: when we redo a composite for a set of years after June, we will get a different result. There are three major surface data centers: NOAA in Asheville, NASA Goddard in New York City, and Hadley in the UK.

Here I’ll focus on NOAA. NOAA is the source of the base data that NASA and Hadley use before adding their own adjustments and additional data (NASA adds Antarctic and extrapolated Arctic data; Hadley adds its own ocean data).

The first USHCN data set, with 1,221 stations, was released in 1990 with a UHI adjustment. It was generally regarded as the world’s best because of the long history of most stations and the stability of the network.


The 1999 USHCN plot showed the 1930s as the warmest decade and 1934 as the warmest year (1.1F warmer than the super El Nino year of 1997/98). Hansen showed as much in this plot of the NOAA data set on the GISS web site. The peak-to-peak trend of the 5-year mean was down.

I praised NOAA NCDC for its efforts to get it right, although a decade later we discovered that siting in many cases increasingly did not meet standards, thanks to a move to ASOS with sensors near tarmac, and to changing technology requiring cabling of the sensors to the weather office, often putting the sensors near buildings, driveways, parking lots, air-conditioning exhausts, or other heat sources. Numerous sites were placed at heat-generating waste treatment plants. Even the GAO scolded NOAA because over 40% of its stations failed to meet the minimum criteria in NOAA’s own specifications. The UHI adjustment offset some of these biases.

Most of the original correlation studies I did used this data set. But the inconsistency of its warming relative to the global data set (GHCN), which showed much more warming late in the century, put political pressure on NCDC to make the two data sets more consistent. Instead of trying to find the metadata to add UHI adjustments to the GHCN, the changes were made to the US adjustments.

They removed the UHI adjustment, replacing it with algorithms to find previously undocumented inhomogeneities (station moves), a change to the time-of-observation adjustment, and a new final homogenization step (blending of data between stations). Two versions later, the data set shows the changes made to the annual values in degrees F. The adjustments alone increased the warming from 1930 to 1990 by 0.8F: they cooled the 1920s to 1950s and warmed the years after the 1960s.
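The warming contributed by the adjustments alone can be isolated by differencing the trends of the raw and adjusted versions of the same series. A minimal sketch with invented numbers - a flat raw record plus a hypothetical adjustment ramp - rather than actual USHCN values:

```python
def linear_trend(years, values):
    """Ordinary least-squares slope, in degrees per year."""
    n = len(years)
    my = sum(years) / n
    mv = sum(values) / n
    num = sum((y - my) * (v - mv) for y, v in zip(years, values))
    den = sum((y - my) ** 2 for y in years)
    return num / den

years = list(range(1930, 1991))
raw   = [52.0 for _ in years]        # toy raw record: perfectly flat
# hypothetical adjustment ramping from 0 to +0.8F across 1930-1990
adjusted = [t + 0.8 * (y - 1930) / 60 for y, t in zip(years, raw)]

delta = linear_trend(years, adjusted) - linear_trend(years, raw)
print(f"trend added by adjustments: {delta * 60:.2f}F over 60 years")
```

Since the raw series here is flat by construction, all of the recovered trend is adjustment; with real station data the same differencing separates adjustment-induced trend from whatever trend is in the raw record.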


UHI mainly affects minimum temperatures, so maximum temperatures are a better indicator of the true trends. Even bias-corrected maxima show the recent peak no higher than the 1930s.



The minima (and in the end the mean) clearly show the recent warmth, which is consistent with UHI.


The global set, meanwhile, has undergone changes too since V1. V3 and V1 are plotted here. The reasons include changing station sets and application of the same kinds of adjustments.


The difference is even more remarkable than in the US (1.2F or so). In other words, most of the apparent warming in the data set since the late 1800s is in the adjustments. Note the plot shows the CHANGE made to the global mean temperature, NOT the global mean temperature itself.


The Latest Change - US state data.

Locally we used the NCDC USHCN Climate at a Glance in talks. I gave one such invited talk in Maine to a local group a few years ago.

Here is the Maine plot of annual temperatures from the last time I downloaded it. It showed no trend since 1895. The warmest year was 1913, with a bookend spike a century later in 2010. Both followed high-latitude volcanism: 1913 followed the 1912 Novarupta eruption in Alaska, and 2010 followed a series of eruptions in Alaska and Iceland. These high-latitude volcanoes produce more high-latitude blocking (Oman 2004), especially in years of very low solar geomagnetic activity. These blocks mean a maritime flow into southeast Canada and Maine, making the winters and springs less snowy and cold.


This spring, NCDC announced a new version of the US data set, replacing the USHCN data used at Climate at a Glance with a new GHCN-based Climate Division alternative. Maine suddenly has a 0.23/decade warming trend, and 1913 was 5F colder and not even close to 2010.


We can still access the original raw data in places. The raw rural data for Farmington, Maine, near one of the big ski areas, is very instructive: it shows a nice sine wave in sync with the 60-70 year PDO/AMO cycles, looking more like USHCN v1 and TMAX. The station was discontinued after 2006. I’ll let you speculate why.
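A cycle like that can be quantified by regressing the series on sine and cosine terms at an assumed period (here 65 years, in the PDO/AMO range). The record below is synthetic, generated to contain exactly such a cycle so the fit can be checked; real station data would of course be noisier:

```python
import math

def fit_cycle(years, temps, period=65.0):
    """Least-squares amplitude of a fixed-period cycle, via projection
    onto sin/cos regressors (exact when the record spans whole periods)."""
    n = len(years)
    w = 2 * math.pi / period
    mean = sum(temps) / n
    a = sum(math.sin(w * y) * (t - mean) for y, t in zip(years, temps)) * 2 / n
    b = sum(math.cos(w * y) * (t - mean) for y, t in zip(years, temps)) * 2 / n
    return mean, math.hypot(a, b)

# synthetic 130-year record: 42F mean plus a 0.5F-amplitude 65-year cycle
years = list(range(1877, 2007))
temps = [42.0 + 0.5 * math.sin(2 * math.pi * y / 65.0) for y in years]
mean, amp = fit_cycle(years, temps)
print(f"mean {mean:.1f}F, cycle amplitude {amp:.2f}F")
```

The recovered amplitude (0.50F here) is the size of the swing attributable to the assumed cycle; applied to a real record, a non-trivial amplitude at a 60-70 year period is what "in sync with the PDO/AMO" looks like numerically.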


Concord, NH is an urban area, but the airport is a well-known cold spot compared to Manchester, which is often the warmest station in the state. The sensors must be well placed.

I did a winter plot of Concord temperatures since 1868/69 and found little change over the entire period of record - again, this is raw archived data downloaded from the PWM web site.


The record highs and lows are also not adjusted and look more like USHCN v1, with 23 record highs in the 1930s, 38 before 1960, and more record lows than record highs since the 1940s.


The number of 90F readings at all USHCN stations (raw, unadjusted) shows a similar downtrend.


Keep these in mind when you hear NOAA give a ranking of a month or year, or hear speeches about why POTUS, through the EPA, is taking strong action.
