U.S. justices to hear challenge to Obama on climate change
Tue, Oct 15 11:06 AM EDT
By Lawrence Hurley
WASHINGTON (Reuters) - In a blow to the Obama administration, the Supreme Court on Tuesday agreed to hear a challenge to part of the U.S. Environmental Protection Agency’s first wave of regulations aimed at tackling climate change.
By agreeing to hear a single question of the many presented by nine different petitioners, the court set up its biggest environmental dispute since 2007.
That question is whether the EPA correctly determined that its decision to regulate greenhouse gas emissions from motor vehicles necessarily required it also to regulate emissions from stationary sources.
The EPA regulations are among President Barack Obama’s most significant measures to address climate change. The U.S. Senate in 2010 scuttled his effort to pass a federal law that would, among other things, have set a cap on greenhouse gas emissions.
By Steve Goreham
Originally published in The Washington Times
As part of his climate change initiative announced in June, President Obama declared, “Today I’m calling for an end of public financing for new coal plants overseas unless they deploy carbon capture technologies, or there’s no other viable way for the poorest countries to generate electricity.” Restrictions on financing will reduce the supply and increase the cost of electrical power in developing nations, prolonging global poverty.
The World Bank has followed President Obama’s lead, announcing a shift in policy in July and stating that it will provide “financial support for greenfield coal power generation projects only in rare circumstances.” For decades, the bank has provided hundreds of millions of dollars in funding to coal-fired projects throughout the developing world.
Also in July, the Export-Import Bank denied financing for the proposed Thai Binh Two coal-fired power plant in Vietnam after “careful environmental review.” While 98 percent of the population of Vietnam has access to electricity, Vietnamese consume only about 1,100 kilowatt-hours per person per year, about one-twelfth of United States usage. Electricity consumption grew 34 percent in Vietnam from 2008 to 2011. The nation needs more power and international funds for coal-fired power projects. But western ideologues try to prevent Vietnam from using coal.
The President, the World Bank, and the Export-Import Bank have accepted the ideology of Climatism, the belief that mankind is causing dangerous climate change. By restricting loans to poor nations, they hope to stop the planet from warming. But what is certain is that their new policies will raise the cost of electricity in poor nations and prolong global poverty.
In most markets, coal is the lowest-cost fuel for producing electricity. According to the International Energy Agency (IEA), world coal and peat usage increased from 24.6 percent of the world’s primary energy supply in 1973 to 28.8 percent of supply in 2011. By comparison, electricity generated from wind and solar sources supplied less than 1 percent of global needs in 2011.
The cost of electricity from natural gas rivals that of coal in the United States, thanks to the hydrofracturing revolution. But natural gas remains a regional fuel. Natural gas prices in Europe are double those in the US and prices in Japan are triple US prices. Until the fracking revolution spreads across the world, the lowest cost fuel for electricity remains coal.
Despite our President’s endorsement, carbon capture technologies are far from a proven solution for electrical power. According to the US Department of Energy, carbon capture adds 70 percent to the cost of electricity. In addition, huge quantities of captured carbon dioxide must be transported and stored underground, adding additional cost. No utilities are currently using carbon capture on a commercial scale.
Coal usage continues to grow. Global coal consumption grew 2.5 percent from 2011 to 2012, the fastest growing hydrocarbon fuel. In 2011, coal was the primary fuel for electricity production in Poland (95%), South Africa (93%), India (86%), China (84%), Australia (72%), Germany (47%), the United States (45%), and Korea (44%). Should we now forbid coal usage in developing nations?
President Obama has stated, “...countries like China and Germany are going all in in the race for clean energy.” But China and Germany are huge coal users and usage is increasing in both nations. More than 50 percent of German electricity now comes from coal as coal fills the gap from closing nuclear plants. Today, China consumes more than 45 percent of the world’s total coal production.
ICECAP NOTE: The enviro push for renewable wind and solar has caused electricity prices to skyrocket (double) and has left 300,000 homes in Germany without electricity because they cannot afford to pay the price. That was in the winter, one of the worst in a century. Are you ready for your electricity and gasoline prices to double as the Obama administration has promised? That is on top of the doubling of your cost for health care that the demagogue party has forced on us while promising lower costs and that we could keep our doctors. All this for a failed theory that CO2 will have catastrophic effects, when there have been none for 17 years as CO2 has increased 1%.
Today, more than 1.2 billion people do not have access to electricity. Hundreds of millions of others struggle with unreliable power. Power outages interrupt factory production, students walk to airports to read under the lights, and schools and hospitals lack vital electrical power.
Electricity is the foundation of a modern industrialized nation. Lack of electricity means poverty, disease, and shortened lifespans. Foolish climate policies lock chains on developing nations.
Steve Goreham is Executive Director of the Climate Science Coalition of America and author of the new book The Mad, Mad, Mad World of Climatism: Mankind and Climate Change Mania.
“It occurred to me...” every saying has a contradictory saying except “ignorance is bliss”. If an honest man is wrong, after demonstrating that he is wrong, he either stops being wrong or he stops being honest.
T. H. Huxley said,
“The great tragedy of science - the slaying of a beautiful hypothesis by an ugly fact.”
The tragic fact is that global temperature has declined slightly for 17 years while CO2 levels increased. The Intergovernmental Panel on Climate Change (IPCC) hypothesis said that if CO2 increased temperature would increase. The hypothesis is slain.
Instead of acknowledging the hypothesis is wrong, as science requires, its defenders advance bizarre explanations, none of which bear examination. According to the IPCC, what is happening can’t happen. They claimed over 90 percent certainty in their results and planned to increase that certitude to 95 percent in their next Report (AR5).
Defenders are making ludicrous and contradictory claims to explain what is happening. They said they were 90+ percent certain warming since 1950 was due to human CO2 with natural causes of little or no consequence. Now they’re saying the lack of warming of the last 17 years is because of natural variability and decreasing solar activity.
The sad thing is leaked emails revealed they knew all along that the evidence doesn’t support what they were saying. In October 2009 Kevin Trenberth, a major architect in the IPCC deceptions, wrote,
“Well I have my own article on where the heck is global warming?...The fact is that we can’t account for the lack of warming at the moment, and it is a travesty that we can’t. The CERES data published in the August BAMS 09 supplement on 2008 shows there should be even more warming: but the data are surely wrong. Our observing system is inadequate.”
Notice his explanation in the last sentence. He acknowledged this paucity of data in 1995, following the release of a study on weather data by the National Research Council. He said,
“It’s very clear we do not have a climate observing system. This may be a shock to many people who assume that we do know adequately what’s going on with the climate, but we don’t.”
The amount of data has decreased since that time. Despite this, he worked with the IPCC building computer models that are totally dependent on the amount and accuracy of the data, and he signed the Bali Declaration that said in part,
“The 2007 IPCC report, compiled by several hundred climate scientists, has unequivocally concluded that our climate is warming rapidly, and that we are now at least 90% certain that this is mostly due to human activities.”
Take note of the signatures, because one day we must hold these people accountable.
Trenberth was among the first to defend against the contradictory evidence. He did what was standard practice in the cover-ups: provide a response for the media. It is particularly helpful if the response cannot be verified, or is so remote or arcane that nobody can definitively reject it. He and fellow NCAR employee John Fasullo published a paper suggesting the heat was being stored in the deep oceans. The limitations of this claim are already apparent. As Anthony Watts writes,
“My question is: show me why some years the deep ocean doesn’t mask global warming.”
Another claim, also made by a senior IPCC member, says 17 years is inadequate to determine anything; a minimum of 30 years is required. The 30-year claim is another of the diversions created in climatology for one purpose but misused for another. Years ago, when people were trying to reconstruct conditions for periods in the past, the World Meteorological Organization (WMO) decided that a modern period of instrumental record was required for comparison. It became known as the 30-year Normal. Sadly, it was misused and became a stand-in for the total instrumental record. For example, in most cases when they say it was above normal today, they are talking about being above the 30-year Normal.
They chose 30 years because 30 is considered a statistically representative sample size (small n) for any population (large N). The WMO calculated the first 30-year Normal for 1931-1960 because that was the first period for which they considered they had adequate instrumental data. They have updated the Normal ever since on the assumption that the record has improved, which is false. The latest covers the period 1981 to 2010. The problem is that they have reduced the number of stations since 1960, and especially after 1990.
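The 30-year Normal described above is simply a baseline average that later observations are compared against. A minimal sketch of the computation, with invented annual temperatures (the function name and all values are illustrative only):

```python
# Sketch of a WMO-style 30-year "Normal" and an anomaly against it.
# All temperature values here are invented for illustration only.

def thirty_year_normal(annual_means):
    """Average of exactly 30 consecutive annual mean temperatures."""
    if len(annual_means) != 30:
        raise ValueError("a 30-year Normal requires exactly 30 years of data")
    return sum(annual_means) / 30.0

# Hypothetical annual mean temperatures (deg C) for 1981-2010,
# with a slight invented trend:
record = [10.0 + 0.02 * i for i in range(30)]

normal = thirty_year_normal(record)   # the baseline "Normal"
today_mean = 10.9                     # invented current observation
anomaly = today_mean - normal         # what "above normal" refers to

print(f"1981-2010 Normal: {normal:.2f} C")
print(f"Anomaly vs Normal: {anomaly:+.2f} C")
```

When a weather report says a day was "above normal," it is this kind of anomaly against the current 30-year baseline, not against the full instrumental record.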
Apart from those limitations, the length of the record is also a diversion. All of the IPCC projections have temperatures increasing. IPCC science assumes the temperature must increase if CO2 increases. According to their science, regardless of the length of the record, a decline is virtually impossible. But this is where their duplicity catches up with them. The Science Report outlines the serious limitations of their work, but the Summary for Policymakers (SPM) tells the generally uninformed media and public a very different story. The scientists knew of this duplicity; some, as Lead Authors, were involved in both.
They were the ones that convinced the media and the public of the certainty of their science. As David Wojick, UN IPCC expert reviewer, explained:
“...What is systematically omitted from the SPM are precisely the uncertainties and positive counter evidence that might negate the human interference theory. Instead of assessing these objections, the Summary confidently asserts just those findings that support its case. In short, this is advocacy, not assessment.”
The IPCC process, methods and science are complete failures. They cannot be restructured because they began with a deliberately narrow definition of climate change. The IPCC must be eliminated and national weather bureaus, who make up most of the membership, should stop doing research. Research by a bureaucrat is almost guaranteed to be political, nowhere is that more evident than in the IPCC failures. It is exposed by the ugly fact that destroyed their hypothesis.
UPDATE: Jim Steele has posted an analysis on Watts Up With That supporting the notion that urbanization, bad siting, and land use are responsible for much of the warming of the last century, and that significant cooling early in the century has elevated the apparent warming of recent decades.
The EPA admits to the Urban Heat Island and suggests it will augment AGW by increasing the warmth in cities. They never imagine that maybe man-made warming is mostly UHI, bad siting, and land use changes. Here is what the EPA says about UHI:
Heat island refers to urban air and surface temperatures that are higher than in nearby rural areas. Many cities and suburbs have air temperatures that are 2 to 10 degrees Fahrenheit (1 to 6 degrees Celsius) warmer than the surrounding natural land cover.
Figure 1 shows a city’s heat island profile. It demonstrates how urban temperatures are typically lower at the urban-rural border than in dense downtown areas. The graphic also shows how parks, open land, and bodies of water can create cooler areas.
Figure 1. Heat island profile. (Source: U.S. EPA)
The remotely sensed image of Sacramento, California in Figure 2 illustrates the heat island phenomenon. In the aerial photo (left), the white areas, mostly rooftops, are about 140 degrees Fahrenheit (60 degrees Celsius) and the dark areas, primarily vegetative areas or water, are approximately 85-96 degrees Fahrenheit (29-36 degrees Celsius).
Figure 2. Thermally-sensed image of Sacramento. (Source: U.S. EPA)
The hottest spots are the buildings, seen as white rectangles of various sizes. In the thermal image (right), Sacramento’s rail yard is the orange area east of the Sacramento River, which flows from top to bottom. Red and yellow areas indicate hot spots and generally correspond with urban development, while blue and green areas are cool and generally correspond to the natural environment.
Cities in cold climates may actually benefit from the wintertime warming effect of heat islands. Warmer temperatures can reduce heating energy needs and may help melt ice and snow on roads. In the summertime, however, the same city may experience the negative effects of heat islands.
Causes of heat islands
The reason the city is warmer than the country comes down to a difference between the energy gains and losses of each region. There are a number of factors that contribute to the relative warmth of cities:
During the day in rural areas, the sunlight absorbed near the ground evaporates water from the vegetation and soil. Thus, while there is a net solar energy gain, this is compensated to some degree by evaporative cooling. In cities, where there is less vegetation, the buildings, streets and sidewalks absorb the majority of solar energy input.
Because city pavements are largely nonporous (except for the potholes), runoff is greater in cities and less water is retained. Thus, evaporative cooling is less, which contributes to the higher air temperatures.
Waste heat from city buildings, cars and trains is another factor contributing to the warm cities. Heat generated by these objects eventually makes its way into the atmosphere. This heat contribution can be as much as one-third of that received from solar energy.
The thermal properties of buildings add heat to the air by conduction. Tar, asphalt, brick and concrete are better conductors of heat than the vegetation of the rural area.
The canyon structure that tall buildings create enhances the warming. During the day, solar energy is trapped by multiple reflections off the buildings while the infrared heat losses are reduced by absorption.
The urban heat island effects can also be reduced by weather phenomena. The temperature difference between the city and surrounding areas is also a function of the synoptic scale winds. Strong winds reduce the temperature contrast by mixing together the city and rural air.
The urban heat island may also increase cloudiness and precipitation in the city, as a thermal circulation sets up between the city and surrounding region.
Heat islands can occur year-round during the day or night. Urban-rural temperature differences are often largest during calm, clear evenings. This is because rural areas cool off faster at night than cities, which retain much of the heat stored in roads, buildings, and other structures. As a result, the largest urban-rural temperature difference, or maximum heat island effect, is often three to five hours after sunset.
NOTE: See how the daytime high temperatures are similar in city and rural areas, while nighttime readings are very different, with the cities retaining heat.
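The day/night asymmetry in the note above can be shown with a toy calculation. All station values below are invented; the point is only the contrast between the daytime and nighttime urban-rural gaps:

```python
# Toy illustration of the day/night urban heat island asymmetry.
# All temperatures (deg F) are invented; the point is only that the
# urban-rural gap is small for daytime highs and large for nighttime lows.

urban = {"tmax": 92.0, "tmin": 78.0}   # hypothetical city station
rural = {"tmax": 91.0, "tmin": 68.0}   # hypothetical nearby rural station

day_gap = urban["tmax"] - rural["tmax"]    # daytime highs: nearly uniform
night_gap = urban["tmin"] - rural["tmin"]  # nighttime lows: city holds heat

print(f"Daytime urban-rural gap:   {day_gap:.1f} F")
print(f"Nighttime urban-rural gap: {night_gap:.1f} F")
```

In this sketch the daytime gap is 1 F while the nighttime gap is 10 F, mirroring the maximum heat island effect appearing a few hours after sunset.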
In response, NOAA removed the UHI adjustment, replacing it with algorithms that are supposed to detect previously undocumented site changes, and with homogenization. The result was much more in line with the global data.
NOAA NCDC presentation showing that, as the EPA noted, most of the changes are in the nighttime readings, where the cities hold onto the daytime heat. During the day, rural and urban areas come much more into line. The trend in Tmax, according to NOAA, is weak, with the PDO/AMO/TSI 60-70 year cycle remaining.
Want an extreme example of UHI? Take Des Moines, IA, where the official station at the airport sits between two runways inside the urban area and is always warmer than the surroundings, even stations within but nearer the edge of the metro area. At 9:31 pm on Tuesday, August 27, 2013, it was still 88F in DSM although it was as cool as 72F at surrounding stations.
The real climate signal is in the well-sited stations in rural areas. Instead of using these stations to adjust down urban-heat-contaminated stations like DSM, the process of homogenization spreads the warmth to the good stations. See the analysis by Dr. Ed Long, formerly of NASA, that demonstrates this here. On the Tmax issue, Dick McNider has been advocating using Tmax in lieu of Tmin or, therefore, Tmean. See most recently
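The objection here, that homogenization blends urban-contaminated trends into well-sited stations rather than correcting the urban stations downward, can be caricatured with a deliberately naive pairwise blend. Real homogenization algorithms are far more elaborate (changepoint detection against many neighbors); the trend values below are invented:

```python
# Deliberately naive "homogenization": replace each of two neighboring
# station trends with their common mean. With one UHI-contaminated urban
# station, the rural trend is pulled upward instead of the urban trend
# simply being corrected downward. Trend values (deg F/decade) are invented.

rural_trend = 0.05   # hypothetical well-sited rural station
urban_trend = 0.35   # hypothetical urban-contaminated station

def naive_blend(a, b):
    """Blend two neighboring station trends toward their common mean."""
    mean = (a + b) / 2.0
    return mean, mean

adj_rural, adj_urban = naive_blend(rural_trend, urban_trend)
print(f"Rural trend: {rural_trend:.2f} -> {adj_rural:.2f}")  # raised
print(f"Urban trend: {urban_trend:.2f} -> {adj_urban:.2f}")  # lowered
```

This is a sketch of the complaint, not of NOAA's actual pairwise algorithm; it only shows how averaging with a contaminated neighbor raises a clean station's trend.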
I believe that UHI and land use change are a major component of the observed warming trend. Multidecadal cycles in the sun and oceans account for most all of the rest.
BTW, go here and see how Central Park NYC shows what a mess the UHI and NOAA’s versioning of data have been, and then follow it through the GISS window into station data to convince yourself there is no way, Jose, we could hope to estimate global changes to a precision of 0.1F. In the words of John von Neumann, father of the computer and of algorithms, “There’s no sense in being precise when you don’t even know what you’re talking about.”
The answer is the fruit of my labor, not the object of it. Because of that, you’ll look for anything to come up with the correct answer, not just a predetermined one where your self-esteem depends on it. Joe Bastardi
The last two weekends, I have featured perspectives on the climate debate from:
an engineer (Mike Haseler)
a physicist (Pierre Darriulat)
This week I feature a perspective from a weather forecaster, Joe Bastardi, which was published in the Patriot Post. I’ve received permission from Joe Bastardi to reproduce this in full:
I would love to debate Dr. Michael Mann. He’s a professor at Pennsylvania State University, and I’m a Penn. State Grad (Meteo. 1978). Enough people know me, as well as him, so we could charge a modest admission, fill Eisenhower Auditorium at PSU, and give all the money back to the PSU meteorology department whom I still love dearly in spite of my outcast status on the anthropogenic global warming issue.
But Dr. Mann would probably want no part of debating me on the main drivers of weather and climate given I have no higher degrees. C’mon, a BS in meteorology from PSU against this:
Alma mater: University of California, Berkeley, Yale University
This would be a blow out. What chance would I have?
Let me be clear: Dr. Mann’s resume, along with anyone who receives a PhD in the physical sciences, impresses me. I have read almost everything Dr. Mann has written and, because of that, I understand where he’s coming from. But there are things that are lacking if one is pursuing the right answer, and that’s the methodology one learns in putting together a forecast, as to how to weigh factors in determining what’s going to happen. One has to examine all of what his opponent has, not close his eyes to anything that might challenge his ideas.
For instance, while I’ve read almost everything Dr. Mann has written, how many times has he had hands on experience in making a forecast that has to verify? It’s laughable to think, as a private sector meteorologist whose livelihood depends on being right, that one can separate climate from weather. I realized a long time ago that being able to recognize current patterns from understanding the past (it was drilled into me by my father, a degreed meteorologist) was essential to making a good forecast. The fact many climatologists downplay the relationship, or say they’re different, shows me they don’t know what they’re talking about. In other words, I do what they do, but they don’t do what I do. I read what they write, but they won’t stop to look at the other side.
Perhaps it’s like something we sometimes see in sports, the curse of talent. Most of these people are very smart. I went to school with future PhDs and could see that in the classroom, they were like my wrestling coaches at PSU, guys that were great doing what came natural to them. However, my wrestling coach used to stress that when you’re used to having everything come to you, it’s very hard to change and step up your level. Consequently, you’ll get beat on your weakest point and what you don’t know, and that’s where the methodology in forecasting comes in to the climate debate.
You see, in what I do, one must weigh factors and decide which ones are most important. Additionally, one gets used to challenges that can never really be seen in research. How so? Suppose someone gives you a grant to study global warming. Can you come back and say, “My research says there’s no global warming”? You have been given a grant to produce a result; how can you possibly justify that result if it’s the result that would cost nothing to come up with in the first place?
In my line of work, getting paid (having clients) depends on the correct result. The client doesn’t say, “I want a cold winter, here’s the money, forecast it.” The client asks for a forecast that gives him an edge. If you are right, the client renews; if not, it’s bye bye. But there’s no up-front money that looks for a set result. This means the forecaster does not care whether it’s warm or cold, just that he gets the right answer, whatever that may be. This is not the case in the AGW branch of academia. Research grants come with the cause du jour; just try getting a grant to disprove global warming (actually, you don’t need one; it’s easy to refute it just by understanding what’s happened before).
That said, regarding the climate debate, what factors am I looking at to come up with my conclusion? To me, this is a big forecast, and the simple answer is: It’s hard to fathom that CO2 can cause anything beyond its assigned “boxed in” value to temperatures because of all that’s around it. It comes down to the sun, the oceans and stochastic events over a long period of time with action and reaction, versus a compound comprising .04% of the atmosphere and 1/100th of greenhouse gasses.
But unless you work every day in a situation where you are reminded you can be wrong, you don’t have appreciation for the methodology of challenge and response you need to be right!
Then there’s another big problem: What if you have all this knowledge, you’ve taken a stand on this, and it’s your whole life? How can you possibly be objective? The climate debate and past weather events are needed building blocks for my product. That product involves a challenge each day. In the case of a PhD on the AGW side, they believe the idea is the product. Destroy the idea, you destroy the product; destroy the product, you destroy the person. Therefore, it’s personal. Your whole life, all the fawning students, the rock star status, is all gone. I would hate to be in that position. Each day I get up, and there it is, the weather, challenging me. The answer is the fruit of my labor, not the object of it. Because of that, you’ll look for anything to come up with the correct answer, not just a predetermined one where your self-esteem depends on it.
So these giants of science have a fundamental problem, and it runs contrary to their nature. In the end, the very talent and brilliance of a lot of these people may be what blinds them to what it takes to truly pursue the truth.
JC comment: Bastardi raises a critical point, regarding the issue of forecasting as it relates to climate science. Until recently, the public and policy makers were content to consider projections of future climate that depended only on scenarios of future greenhouse emissions. Since the climate models and observations agreed during the last quarter of the 20th century as portrayed in highly confident attribution analyses, these scenario projections were treated by many as forecasts, including the IPCC, who expected a temperature increase of 0.2C/decade in the first few decades of the 21st century.
The growing divergence of climate model simulations and observations in the 21st century is leading to the growing realization among scientists, policy makers, and the public that other factors are important in determining climate on decadal and multidecadal timescales. The IPCC dismisses these as unpredictable internal climate variability, unpredictable solar variability, and unpredictable volcanic activity. Well, this is good enough only for scientists interested solely in the CO2 impact on climate, but not for the public and policy makers (who are paying the bills for all this climate research) and want to know how the climate will actually evolve over the 21st century.
Here is an analogy from my personal experience. My company CFAN started making hurricane forecasts in 2007 for a major oil company, who wanted advance knowledge (better than market and NOAA) of Gulf hurricane activity. We had devised a scheme that predicted the formation of hurricanes from African Easterly Waves, up to a week in advance. We had some major successes in our first season, notably our forecasts for Hurricane Dean and TS Erin (which I understand made them a lot of money in natural gas trading), but we completely failed (along with everybody else) to predict the formation of Hurricane Humberto. Humberto formed near the Texas coast and rapidly intensified. This was not picked up by our prediction scheme, since Humberto did not form from an African Easterly Wave. Telling our clients that this kind of hurricane just isn’t predictable wasn’t going to be good enough. So we embarked on a research project to figure out what kind of predictability there was for this type of storm, and developed a probabilistic warning scheme with different scenarios for it.
The point is this: climate modeling needs to move toward actually predicting future climate variability and change. The initialized decadal forecasts are a step in the right direction, but we need scenarios of future volcanic and solar activity as well (not to mention more research to figure out the sun-climate connections). Having climate modelers work on the seasonal climate forecast problem, and watching their forecasts fail to verify, would be invaluable experience for the modelers making the production runs for CMIP/IPCC.
And finally, a remark about Bastardi’s invitation to Mann to debate, which is captured in these tweets:
Michael E. Mann @MichaelEMann16 Oct
#JoeBastardi (http://www.desmogblog.com/joe-bastardi ) and #AnthonyWatts (http://sourcewatch.org/index.php?title=Anthony_Watts...): The best that #ClimateChange #Denial has to offer!
Joe Bastardi @BigJoeBastardi16 Oct
@MichaelEMann Anytime you would like to debate me and have proceeds go to PSU met, set it up. Eisenhower Auditorium will see who knows what
Rich Fraser @richmanwisco16 Oct
@BigJoeBastardi @MichaelEMann To debate Bastardi would be granting him the false balance that he craves but does not deserve.Retweeted by Michael E. Mann
Based upon Mann’s retweet, I don’t expect him to debate Bastardi. I note that if a Georgia Tech alumnus wanted to debate me or otherwise meet me, I would offer to go to lunch with them to discuss. In fact, in response to this article in the Georgia Tech Alumni magazine, I was invited to lunch by an alum to discuss climate change. There were four of us at lunch. Toward the end of the lunch, the alum admitted that the invite was intended as sort of an ambush, intended to trip me up as they presented all sorts of skeptical arguments. He said that they were delighted to have such an open and honest discussion about the issue, and that they learned a lot from talking with me. The following week, the School of Earth and Atmospheric Sciences received a check from the alum for $10K.
So I encourage Mann to at least meet with Bastardi to discuss. But imagine a public debate or discussion or Q&A between Mann and Bastardi. That would be an event I would pay to see (well I wouldn’t travel to Penn State, but I would pay to watch it on the internet). Since Mann has only joined the Penn State faculty within the last decade, there are generations of Meteorology alums who have not been exposed to his wisdom. He could hold a book signing etc. Sounds like this event could be a real winner for Penn State in terms of alumni relations and fund raising.
Oct 07, 2013
Snow season off to a roaring start
The winter snow season is off to a roaring start. A record fall snowstorm hit the Black Hills Friday and Saturday with up to 48 inches of snow at Deadwood and winds that reached 71 mph at Ellsworth AFB.
Lead, SD, had 43.5 inches; Lead averages 197.5 inches of snow a year. As for daily records, this will rank as the 4th largest on record, behind 1973, 2006, and 2008. It is the biggest early October storm since 1982. See the clean-up in this storm chaser video.
The snow is increasing rapidly in North America and Asia and is running well ahead of last year. Recall that the arctic ice increased almost 60% over 2012.
Some are trying to blame the snow and winter cold on the lack of arctic ice. Of course, trying to blame last year’s late winter and spring blizzards on a lack of ice that had returned by the previous October should raise a question in any truly objective mind. The early snows following the recovery of the arctic ice, and the coldest summer on record in the arctic according to DMI, call that theory into question. As you know, we attribute the cold winters and increased hemispheric snows of recent winters to the +AMO and negative PDO (with help from low solar activity and high-latitude volcanoes). The positive AMO favors a negative AO and NAO, which delivers the cold and snow.
UPDATE: See a full scientific evisceration of the National Geographic story here by Dr. Don Easterbrook. See also Nils-Axel Morner’s addition to Don’s post here.
In the September 2013 issue of National Geographic, the feature story is on rising sea levels and how they are changing our coastlines. It shows a Statue of Liberty half submerged. Over at least the past decade, the magazine has adopted the failing climate change advocacy position prevalent in today’s mainstream media. It has become more science fiction than science fact. That is sad, because it was once a very popular, balanced, informative, and trustworthy magazine. A lot of the hype on sea level rise, including talk by Mayor Bloomberg of the need to spend $20 billion to protect the city from rising seas and storms, is based on faulty data.
The entire environmental movement is based on flawed theories and models. Whenever the data disagrees with the models, they assume the data is wrong and adjust it to fit their projections.
The late, great Caltech physicist Richard Feynman said of the scientific method that if data or experiments don't support your theory, then no matter how beautiful the theory is or how smart you are, the theory is wrong.
For example, global temperatures stopped warming close to 17 years ago and have cooled since 2002, even as CO2 has increased over 11%. None of the climate models used by the UN showed this hiatus. Claims of the warmest decade and very high-ranking months and years rest entirely on our government's manipulation of climate data. In 1999, NASA's James Hansen observed that for U.S. annual temperatures in the 20th century, the 1930s was clearly the warmest decade and 1934 the warmest year; 1934 was a full 1.1F warmer than the spike of the super El Nino of 1998 in NOAA's/NASA's prize U.S. data set, which adjusted for urban contamination.
That was inconvenient, since their global data set, which was not adjusted for urbanization, showed significant warming over the same period. In 2007 NOAA removed the urban adjustment and changed other processing steps. The result: 1998 became 0.2F warmer than 1934, a swing of 1.3F. One data set that was not altered, the all-time state record highs and lows, tells a very different story, more like that depicted in 1999: 39 of the 50 all-time state record highs occurred before 1960, and the most, 23, occurred in the 1930s. More state record cold marks than warm have been set since the 1940s.
NOAA NCDC data compiled by Dr. John Christy for senate testimony in August 2012
The government loves to reinvent statistics. Does anyone really believe our real unemployment number is 7.3%? We get there by not counting people hopelessly unable to find employment. The CPI each month implies inflation is under control, but it excludes 'volatile food and energy'. Sure, boxer-short prices are not rising at an alarming rate, but gasoline is double what it was when Obama took office, and a tank of heating oil may soon require a second mortgage. We shoppers all experience sticker shock when we go to buy a package of chopped meat. Sadly, this hits the poor and middle class the hardest, since food and energy are what they spend the most money on. Europe went through this same green madness and is now abandoning it. That is not to say corporations don't manipulate data when big money is involved (think Enron, MF Global, Bernie Madoff, and big pharma), but this is bigger and worse because it affects everyone.
But not the National Geographic, which has abandoned once-honest science for junk-science advocacy. Let's look at the facts.
Sea levels rise and fall as ocean temperatures rise and fall (causing expansion and contraction of the water) and as water is locked up in or increased in the major ice sheets in Antarctica and Greenland or during major glacial periods on the continents.
During the last glacial period, an ice sheet as much as 2 miles thick covered much of northern Europe and Asia, and in North America stretched over Canada and the northern United States, down to New York and south of Chicago. When the interglacial began and the ice retreated, meltwater caused a rapid sea level rise of 360 feet. In the last 8,000 years, sea level rise has slowed to a crawl.
This figure shows sea level rise since the end of the last glacial episode based on data from Fleming et al. 1998, Fleming 2000, & Milne et al. 2005.
It likely varied with the cooling and warming periods that occur naturally. Global sea levels temporarily rise faster when the oceans enter their warmer multidecadal phases and during major El Ninos, and slow when the ocean basins cool and during major La Ninas.
Antarctica has been growing for 30 years, locking up more of the planet's water in its ice sheet. Greenland was 4C (about 7F) warmer than today's level during the last interglacial without melting. With the Atlantic soon to head into its multidecadal cold mode and the sun into a 200-year slumber, cooling is likely to increase ice in Greenland. We are already seeing more snow in winter on land: 4 of the top 5 snowiest years for the hemisphere have occurred in the last 6 years.
Renowned oceanographic expert Nils-Axel Morner has studied sea level and its effects on coastal areas for some 45 years. Recently retired as director of the Paleogeophysics and Geodynamics Department at Stockholm University, Morner is past president (1999-2003) of the INQUA Commission on Sea Level Changes and Coastal Evolution, and leader of the Maldives Sea Level Project. In a 2010 paper in 21st Century Science and Technology, Morner said:
“While the IPCC and its boy scouts present wilder and wilder sea level predictions for the near future, the real observational facts demonstrate that sea level has remained virtually stable for the last 40-50 years.”
This is in sharp contrast with the model projections, similar to the way temperatures are defying the models.
Measuring sea level change at any location with tide gauges is complicated by the fact that in many places the land is sinking or rising. The mean of all 159 NOAA tide gauge sites gives a rate of 0.5 mm/year to 0.6 mm/year (Burton 2010). When you exclude the sites in uplifted and subsided areas, you are left with 68 sites of reasonable stability. These give a present rate of sea level rise on the order of 1.0 (+/- 1.0) mm/year (about 4 inches per century). Morner noted that most tide gauges are installed on unstable harbor constructions or landing piers; tide-gauge records are therefore bound to exaggerate sea level rise.
A paper by Wenzel et al. (2010), using a neural network on tide gauge records, found a mean sea level rise of 1.56 +/- 0.25 mm/year from 1900 to 2006. They found that sea level changes are dominated by oscillations with periods of about 50-75 years, which fits nicely with the 60-year ocean oscillations.
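To see why a multidecadal oscillation complicates trend extraction from tide gauges, here is a minimal sketch using synthetic data (the 1.0 mm/yr trend and a 60-year cycle are taken from the figures above; the amplitude and noise level are illustrative assumptions, not real NOAA records):

```python
import numpy as np

# Synthetic tide-gauge series: a 1.0 mm/yr linear trend plus a 60-year
# oscillation and measurement noise (all values illustrative).
rng = np.random.default_rng(0)
years = np.arange(1900, 2007)
trend_mm_per_yr = 1.0
oscillation = 15 * np.sin(2 * np.pi * (years - 1900) / 60)  # multidecadal cycle
level = trend_mm_per_yr * (years - 1900) + oscillation + rng.normal(0, 5, years.size)

# An ordinary least-squares fit over the full century comes close to the
# underlying rate, but the incomplete 60-year cycles bias the slope slightly.
slope, intercept = np.polyfit(years, level, 1)
print(f"fitted trend: {slope:.2f} mm/yr")
```

A fit over only a few decades, sampling just the rising or falling half of the cycle, can miss the true rate by far more, which is why the period analyzed matters so much in these disputes.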
Satellite altimetry offers a reconstruction of sea level changes over the entire ocean surface. The technology, though, has also produced disappointing results for the alarmist community, as it has not shown the rise they expected.
The Topex/Poseidon and later Jason missions recorded variations of the ocean surface with high resolution. Having applied all the needed technical corrections, Menard (2000) and Aviso (2000) presented a linear trend of 1.0 mm/year from 1992 to 2000. However, the rise came mostly from a spike caused by ocean warming during the super El Nino of 1997/98. Eliminating that spike gave a change of 0 +/- 10 mm; the record provides no indication of any rise over the period covered (Morner 2004, 2007a, 2007c).
Morner presented a trend analysis that treated the 1997 El Nino peak (yellow) as a separate event superimposed on the long-term trend. It shows stability over the first 5 years (blue) and possibly over the whole period covered (from Morner 2004, 2007c).
The IPCC in 2007 was actually conservative (an average of 15 inches) in its projections compared with others, who projected rises of 3 feet up to 20 feet or more. Recall that in the movie An Inconvenient Truth, Al Gore used a crane to demonstrate how high 20 feet was. It appears he could instead have stood on the Manhattan phone book.
But even the IPCC played data games. The IPCC combined tide gauge and altimetry data and, to its alarm, found no change. So it made an 'adjustment' to the data using the sea level change from one of four Hong Kong gauges, the only one showing a rise, which indicates the land there was likely subsiding. At the Moscow global warming meeting in 2005, in answer to Morner's criticisms of this "correction," one member of the British IPCC delegation said, "We had to do so, otherwise there would not be any trend."
Satellite altimetry data likewise shows no consistent upward trend. To resolve this dilemma, corrections were applied, including a glacial isostatic adjustment that assumes the land is still rebounding from the retreat of the glaciers 10,500 years ago.
So what does all this data suggest for the National Geographic scenario of a 200 foot rise?
Of course, if the recent sea level rise has been zero and continues so, it will never reach the level depicted in the National Geographic. If it is Morner's 1 mm/year, which the tide gauges suggest once the 1997/98 El Nino spike is excluded, it would take 65,200 years. If the 7 inches of rise of the last century repeats, it would take 36,629 years. If the mid-range 15 inches predicted by the IPCC in AR4 survives into the Fifth Assessment and were to verify, it would take 17,112 years.
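The arithmetic behind these timescales can be checked directly. A minimal sketch, assuming the National Geographic scenario implies a total rise of about 65,200 mm (roughly 214 feet, the figure consistent with the 65,200-year result at 1 mm/yr); small differences from the numbers quoted above come from rounding the per-century rates:

```python
# Years required to accumulate the depicted rise at each assumed rate.
MM_PER_INCH = 25.4
total_rise_mm = 65_200  # assumed total rise implied by the magazine scenario

rates_mm_per_yr = {
    "Morner tide-gauge rate (1 mm/yr)": 1.0,
    "20th-century rate (7 in/century)": 7 * MM_PER_INCH / 100,
    "IPCC AR4 mid-range (15 in/century)": 15 * MM_PER_INCH / 100,
}
for label, rate in rates_mm_per_yr.items():
    print(f"{label}: {total_rise_mm / rate:,.0f} years")
```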
Considering that we are already well past 10,000 years into the current interglacial, and that interglacials have historically lasted 10,000-15,000 years, under none of these scenarios would we ever see Lady Liberty swimming. In fact, we are more likely to be able to walk to the statue over a frozen sea, or find it encased in ice.
Los Angeles Times endorses censorship with ban on letters from climate skeptics
By Professor J. Scott Armstrong
Published October 18, 2013
Censorship of skeptic global warming views by the press has been going on for many years. This week, Paul Thornton, letters editor for the Los Angeles Times, announced that the paper will "no longer publish letters from climate change deniers," as reported by Poynter.org.
Thornton says, “Simply put, I do my best to keep errors of fact off the letters page; when one does run, a correction is published. Saying ‘there’s no sign humans have caused climate change’ is not stating an opinion, it’s asserting a factual inaccuracy.”
Really? Is this kind of censorship good public-service policy for the Los Angeles Times?
It is a good policy for the global warming alarmist movement because those who are more knowledgeable about climate change are more likely to dismiss the alarm as unfounded.
It is not so good for citizens who would otherwise benefit from the freedom to make up their own minds after being exposed to different arguments and diverse evidence.
Is such censorship good business for newspapers and other mass media? Given that most people in the U.S. do not believe that there is a global warming problem, this seems doubtful.
One-sided coverage loses readers who do not share the editorial viewpoint. Aristotle suggested that persuasiveness is higher when both sides of an issue are presented. Later research found that Aristotle’s suggestion only works when one can rebut the other side.
Failing that, it is best to try to prevent the other side from being heard.
If persuasion is the goal, and not science, then it is sensible for the warming alarmists to avoid two-sided discussions.
In our study of situations that are analogous to the current alarm over global warming, Kesten Green and I identified 26 earlier movements based on scenarios of manmade disaster (including the global cooling alarm in the 1960s). None of them were based on scientific forecasts. And yet, governments imposed costly policies in response to 23 of them.
In no case did the forecast of major harm come true.
Will it be different this time?
Isn’t it important for the public to be informed about scientific evidence on the issue? And because the alarm is based on the fear of future harm, shouldn’t the public insist on scientific forecasts?
The UN’s Intergovernmental Panel on Climate Change (IPCC) uses models that provide computer scenarios, not forecasts, of dangerous manmade global warming.
When we assessed the scenarios as if they were forecasts of what would actually happen, Kesten Green and I found that they violated 72 of 89 relevant scientific forecasting principles.
Would you go ahead with your flight, if you overheard two of the ground crew discussing how the pilot had skipped 80 percent of the pre-flight safety checklist?
For rational policy-making and regulating, scientific forecasts are necessary.
We are astonished that there is only one published peer-reviewed paper that claims to provide scientific forecasts of long-range global mean temperatures. The paper is a 2009 article in the International Journal of Forecasting by Kesten Green, Willie Soon, and me.
When we tested our forecasts against the IPCC scenarios using data from 1850 to the present, our forecasts, based on a model that adhered to scientific principles, were more accurate over all forecast horizons from 1 to 100 years. They were especially more accurate for long-term forecasts.
For example for forecasts 91 to 100 years ahead, the IPCC forecast errors were over 12 times larger than our forecast errors. Perhaps that qualifies as relevant evidence for citizens. And it would be “news” for 99% of them. Yet our forecasts received virtually no mass media coverage. Meanwhile, non-scientific climate-scare “forecasts” regularly get widespread attention from the mass media.
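The benchmarking style described above, comparing a simple "no change" forecast against a fixed warming-trend forecast, can be sketched with synthetic data (this is an illustration of the method only; the series, the noise level, and the 0.03 C/yr scenario rate are assumptions, not the actual data or results of the 2009 paper):

```python
import numpy as np

# Hypothetical temperature series: a small trend plus interannual noise.
rng = np.random.default_rng(1)
n_years = 100
actual = 0.005 * np.arange(n_years) + rng.normal(0, 0.1, n_years)

# Benchmark 1: "no change" forecast (persistence of the starting value).
no_change_forecast = np.full(n_years, actual[0])
# Benchmark 2: a fixed warming-trend forecast at an assumed 0.03 C/yr.
trend_forecast = actual[0] + 0.03 * np.arange(n_years)

# Mean absolute error over all horizons for each forecast.
mae_no_change = np.mean(np.abs(actual - no_change_forecast))
mae_trend = np.mean(np.abs(actual - trend_forecast))
print(f"no-change MAE: {mae_no_change:.3f} C, trend MAE: {mae_trend:.3f} C")
```

When the underlying series drifts much more slowly than the projected trend, the no-change benchmark wins by a wide margin, and the gap grows with the forecast horizon.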
Want to bet which forecast of global mean temperatures is going to be correct? Mr. Gore did not want to bet against me in 2007 when he was warning that the world was at a climate "tipping point." That was a wise decision on his part. Scientific forecasting methods tend to be more accurate than political forecasting methods.
Fortunately, with many mass media outlets attempting to influence people by using censorship, citizens are able to turn to alternative sources of information and argument on the Internet to inform their decisions. And many have. The polls provide evidence that the alarmist case is so weak that even with widespread censorship, citizens are not persuaded.
Professor J. Scott Armstrong teaches at the Wharton School of Business at the University of Pennsylvania in Philadelphia. He is a founder of the two major journals on forecasting methods, author of Long-Range Forecasting, editor of the Principles of Forecasting handbook, and founder of forecastingprinciples.com. He is the world’s most highly cited author on forecasting methods. He has been doing research on forecasting methods for over half a century. Fox News was the first to cover the bet that he proposed to Mr. Albert Gore in 2007 (see theclimatebet.com for the latest results).
The Los Angeles Times is giving the cold shoulder to global warming skeptics.
Paul Thornton, editor of the paper’s letters section, recently wrote a letter of his own, stating flatly that he won’t publish some letters from those skeptical of man’s role in our planet’s warming climate. In Thornton’s eyes, those people are often wrong and he doesn’t print obviously wrong statements.
“Simply put, I do my best to keep errors of fact off the letters page; when one does run, a correction is published,” Thornton wrote. “Saying ‘there’s no sign humans have caused climate change’ is not stating an opinion, it’s asserting a factual inaccuracy.”
What amounts to a ban on discourse about climate change stirred outrage among scientists who have written exactly that sort of letter.
“In a word, the LA Times should be ashamed of itself,” William Happer, a physics professor at Princeton, told FoxNews.com.
“There was an effective embargo on alternative opinions, so making it official really does not change things,” said Jan Breslow, head of the Laboratory of Biochemical Genetics and Metabolism at The Rockefeller University in New York.
“The free press in the U.S. is trying to move the likelihood of presenting evidence on this issue from very low to impossible,” J. Scott Armstrong, co-founder of the Journal of Forecasting and a professor of marketing at the University of Pennsylvania’s Wharton School, told FoxNews.com.
Happer, Breslow and Armstrong are among 38 climate scientists that wrote a widely discussed letter titled “No Need to Panic About Global Warming,” which was published in The Wall Street Journal in Jan. 2012.
The letter argued that there was no compelling scientific argument for drastic action to “decarbonize” the world’s economy. It generated such extensive public debate about man’s role in global warming that the Journal published a second letter from the group a few weeks later.
Reached at his home on Friday, Thornton told FoxNews.com his policy was being misinterpreted.
“This is not a blanket ban on ‘skeptics.’ What it does ban is factual inaccuracy,” Thornton said. “I’ll put it this way: It’s fine to say that the Lakers are a terrible basketball team, but it would be factually inaccurate to say they’re bad because they put four guys out on the court every night instead of five. The latter ‘perspective’ also happens to be objectively false, so a letter containing it wouldn’t be considered for publication.”
“To say that no evidence exists when scientists have produced evidence is asserting a factual inaccuracy, and we try to keep errors of fact out of the paper,” he told FoxNews.com.
Thornton said he has already rejected letters that have argued that there is no evidence that human activity is driving climate change.
Other papers took up the LA Times cause, arguing that climate skeptics are too often kooks best kept off the pages and out of sight.
Citing a letter printed in an Australian newspaper, blogger Graham Readfearn of the Guardian suggested that he supported the ban.
“Wrongheaded and simplistic views like this are a regular feature on...no doubt hundreds of other newspapers around the world where readers respond to stories about climate change,” Readfearn wrote. “Thornton’s decision could well leave a few editors wondering if they should follow suit.”
Some climate skeptics said the move was an intentional effort to eliminate debate.
“My research on persuasion shows that persuasiveness of messages is higher when both sides of an issue are presented, but only when one has good arguments to defeat the other side,’ Armstrong told FoxNews.com. “If not, it is best to try to prevent the other side from being heard.”
The Poynter Institute, a journalism school in St. Petersburg, Fla., took quizzical note of the policy in a post on its website on Wednesday. But editors for the school’s website did not acknowledge FoxNews.com’s questions about the ethics of such a policy, and Thornton himself did not respond to FoxNews.com in time for this article.
The writers of the Journal letter left no doubt about their feelings.
“The religion of Catastrophic Anthropogenic Global Warming (CAGW) does not tolerate non-believers,” Breslow told FoxNews.com.
Note the region north of New Guinea, where the trend is 9 mm/year and the error is also 9 mm/year. This region severely skews the global numbers upward. In other words, the 3 mm/year trend claim is scientifically meaningless.
The online Bild newspaper here quotes German meteorologist Dominik Jung: “For the coming winter we give a probability of 70:30 that it’s going to be a colder-than-normal winter.”
Photo: SnowKing1, licensed under Creative Commons Attribution-Share Alike 3.0 Unported.
Jung is not the only meteorologist forecasting a colder-than-normal winter. Bild also reports that Michael Klein of Donnerwetter.de expects better-than-even chances of bitterly cold months: “The first trend points more to a colder-than-normal winter. At least it is not going to be mild.”
Southern Germany’s online Wasserburger Stimme reports here that the chances of an ‘extreme winter’ are as high as ever this year. The site also quotes Jung, who says that winters in Central Europe generally began to become milder after the 1960s, thanks to mild winds coming in off the Atlantic, with the high point reached in the 1990s. But: “In the last 5 years the winters have been normal or slightly colder than normal. The series of extremely mild winters appears to be over.”
A meteorologist who believes anything?
Jung feels that this winter could be a really harsh one, the sort not seen in 50 years, but cautions that such a forecast cannot be made with high certainty. Jung adds:
“But the probability of a new XXL winter has increased considerably. New theories claim that extended ice melt at the North Pole could lead to significantly colder winters in Europe. It is not that the climate over the whole world will warm up. There are regions that will indeed get warmer, and there are also regions that will get cooler.”
Here Jung is subscribing to the after-the-fact, nonsensical theory that the Potsdam Institute for Climate Impact Research (PIK) had to cook up in desperation after being caught off guard by cold winters that took Europe by surprise. It’s a dubious theory based solely on IPCC-quality models, with no data to support it.
Jung adds that the US NCEP models are also forecasting a slightly colder-than-usual winter (relative to the 1961-1990 mean).
UK meteorologist predicts possibly “the worst for decades”.
Forecasts of a harsh winter are also circulating in Britain. The online Express here bears the title: “Worst winter for decades: Record-breaking snow predicted for November.” It writes:
Long-range forecasters blamed the position of a fast-flowing band of air known as the jet stream near Britain and high pressure for the extreme conditions. Jonathan Powell, forecaster for Vantage Weather Services, said: ‘We are looking at a potentially paralysing winter, the worst for decades, which could at times grind the nation to a halt.
It’s hard to tell how much of all this is headline-grabbing; some meteorologists may be getting carried away. My WAG is a more or less normal winter.
Greenpeace Arctic Activists Update
Complaints that the cell is too cold
The leftist UK Independent writes here that some of the 30 arrested Greenpeace activists are complaining of harsh conditions and cells that are too cold. The Independent quotes the father of a detained activist:
“‘There’s no regular access to such simple things as clean water, regular meals and a warm enough air temperature,’ Mr Golubok said.”
What do the activists expect from a Russian prison? Five-star resort accommodations, massages, and a sauna?
Captain says he needs to go home because of heart problems
Meanwhile, the English-language RIA Novosti reports that Peter Willcox, captain of the Arctic Sunrise, now regrets going to the Arctic to protest:
“RIA Novosti’s legal reporting agency RAPSI quoted Willcox as saying he had never faced such severe charges in 40 years of activism, and that he would have stayed in New York if he could choose to go back. [...] Denial of Willcox’s appeal came despite his lawyer expressing concern over his health complications stemming from a heart condition.”
Oh, now suddenly he has heart problems. Strange how just a month ago his heart had been in good enough shape to allow him to venture off into the harsh Arctic conditions for a little protest and boat ramming.
Lavrov warns external parties “not to interfere”
Yesterday Ria Novosti here wrote that Russian Foreign Minister Sergei Lavrov reiterated that the environmental activists “clearly violated the UN Convention on the Law of the Sea” and that “the issue should be left in the hands of the courts”. He also warned external parties “not to interfere”.
Ria Novosti writes that “All 30 have appealed their detention, seeking release on bail or home arrest. The Murmansk Regional Court has rejected all 11 appeals it has heard so far, ordering the activists to remain behind bars until a hearing on November 24.”
Looks to me like none of them are going to be released on bail. There’s just too much material to sort through, and no one wants to decide prematurely. Besides, the Russians have lots of time and are in no hurry.
Oct 11, 2013
Misguided energy policies have put Europe on a path to economic decline - democrats follow script
DESPITE rising atmospheric carbon dioxide levels, global climate temperatures have remained flat for the past 15 years, if not a good deal longer. And the UN’s Intergovernmental Panel on Climate Change (IPCC) was last month forced to admit that its own climate models have grossly overestimated climate sensitivity to atmospheric CO2.
So why are some European countries, Germany in particular, but also Britain and Denmark, pursuing green policies that are pushing up the cost of energy, and which could prove seriously damaging to their long-term economic health?
Existing policy in Germany already forces households to fork out for the second highest power costs in Europe, often as much as 30 per cent above the levels seen in other European countries. Only the Danes pay more, and residential electricity costs in both countries are roughly 300 per cent higher than in the US. Circumstances in Germany are only likely to worsen following the re-election of Angela Merkel’s conservative Christian Democratic Union. She will continue with policies designed to wean the country off fossil fuels and nuclear power.
But even former proponents are beginning to see the damage being caused. Dr Fritz Vahrenholt, a father of Germany’s environmental movement and former head of the renewable energy division at utility company RWE, has joined the ranks of those now challenging this trend. In his new book, The Neglected Sun: Why the Sun Precludes Climate Catastrophe, he concludes that “renewable energies do have a big future, but not like this. It’s been a runaway train and too expensive. We are putting [our] industry in jeopardy.”
Approximately 7.8 per cent of Germany’s electricity comes from wind, 4.5 per cent from solar, 7 per cent from biomass, and 4 per cent from hydro. The government plans to increase the proportion from renewables to 35 per cent by 2020, and 80 per cent by 2050. Since hydro and biomass won’t grow, most of that expansion must come from wind and solar.
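The arithmetic behind "most of that expansion must come from wind and solar" is straightforward to check, using the generation shares from the paragraph above and the stated assumption that hydro and biomass stay flat:

```python
# German electricity shares from renewables, in per cent of generation.
shares = {"wind": 7.8, "solar": 4.5, "biomass": 7.0, "hydro": 4.0}
current_renewables = sum(shares.values())    # 23.3% today

target_2020 = 35.0                           # government target
fixed = shares["biomass"] + shares["hydro"]  # assumed not to grow
wind_solar_needed = target_2020 - fixed      # share wind+solar must reach

print(f"current renewables: {current_renewables:.1f}%")
print(f"wind+solar: {shares['wind'] + shares['solar']:.1f}% now, "
      f"{wind_solar_needed:.1f}% needed by 2020")
```

In other words, wind and solar together would have to roughly double their share, from 12.3 to 24 percentage points, in under a decade.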
Denmark, meanwhile, which produces between 20 and 30 per cent of its electricity from wind and solar, hopes to produce half from those sources by 2020. As Denmark can’t use all the electricity it produces at night, it exports about half of its extra supply to Norway and Sweden. But even with those export sales, government wind subsidies have led Danish consumers to pay the highest electricity rates in Europe.
And what about Britain? In 2011, UK wind turbines produced energy at a meagre 21 per cent of installed capacity (not demand capacity) during good conditions. As in Germany, unreliability in meeting power demands has necessitated importation of nuclear power from France. Also similar to Germany, the government is closing some of its older coal-fired plants, any one of which can produce nearly twice the electricity of Britain’s 3,000 wind turbines combined.
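The 21 per cent figure is a capacity factor: the fraction of nameplate output actually delivered over a year. A minimal sketch of what that means in practice (the installed-capacity figure here is a round hypothetical, not the actual UK fleet size):

```python
# What a 21% capacity factor means for annual wind output.
installed_mw = 6_000      # hypothetical installed wind capacity (assumption)
capacity_factor = 0.21    # fraction of nameplate output delivered (from above)
hours_per_year = 8_760

actual_mwh = installed_mw * capacity_factor * hours_per_year
nameplate_mwh = installed_mw * hours_per_year
print(f"delivered: {actual_mwh:,.0f} MWh of a nameplate {nameplate_mwh:,.0f} MWh")
```

Roughly four-fifths of the nameplate capacity goes unrealized, which is why backup generation has to be kept on line to meet demand.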
Renewables are unreliable, and power interruptions are adding to buyers’ remorse. This is less of a problem when there are reliable backup sources like hydropower, coal and nuclear plants to meet demand. But most of Europe lacks the former and, to its detriment, is intentionally cutting back on both of the latter.
Signs of constructive change are far more apparent in Australia. In September, the right-of-centre Liberal Party’s defeat of the Green Party-backed Labor Party was recognised as a referendum victory for dismantling and consolidating the myriad anti-carbon schemes spawned under the previous government. Reining in that sprawling climate machine and eliminating an established energy carbon tax is expected to save the economy more than AU$100m (£57.6m) per week. Australia is seeing sense; there are lessons Europe can learn from its shining example.
And thanks to natural gas, coal and nuclear, the US has excess power-generating capacity, and generally adequate transmission and distribution systems unlike Europe. Today, just over 42 per cent of US electrical power comes from coal, 25 per cent from natural gas, and 19 per cent from nuclear. Only about 3.4 per cent comes from wind, and about 0.11 per cent from solar.
Whether renewable energy will be able to offer substantial cost-competitive alternatives rather than limited niches for US and international energy remains to be seen. But regardless, we can only hope that America learns from the ruinous green energy policies in Germany and other EU nations before such misguided policies wreak further social and economic damage.
Larry Bell is a professor at the University of Houston where he directs the Sasakawa International Center for Space Architecture.
An Uncritical View Of EPA: Why I Agree With Obama
By Larry Bell, Forbes
Reprinted with permission from author
Okay, I admit that I’ve been pretty tough on the EPA in the past. Unlike most other Conservatives, I’ve now decided that I really do want clean air, land and water, and somebody’s got to do the dirty work of keeping capitalism from screwing it all up. Let there be no doubt that the agency takes every imaginable action to accomplish that.
Sure, maybe they might be faulted for going a little overboard occasionally in protecting us from ravages of excessive prosperity. Like, for example their regulatory efforts to put an end to many millions of years of climate change.
Still, we can all appreciate their good intentions. After all, think of all those critters that froze their butts off during previous Ice Ages lasting a hundred thousand years or so, then had to adapt to warmer interglacial periods of 12,000 to 15,000 years, like our current one, then repeated the process all over again. Think about all those sweaty folks during the Medieval Warm Period a thousand years ago enduring temperatures hotter than now, followed later by a “Little Ice Age” between about 1400 and 1859 AD that caused Washington’s troops to suffer bitter cold at Valley Forge in 1777-78, as Napoleon’s did during their frigid retreat from Russia in 1812.
Heck, even during the past century alone the planet has endured two distinct climate changes: warming between 1900 and 1945, followed by a slight cool-down until 1975, when temperatures rose again at a quite constant rate until 1998.
So what if we humans really can prevent that climate change nuisance after all? What if we could pick a time when temperatures are really great and there aren’t all those frequent severe weather events that past decades have witnessed?
Good news! It appears that we actually can. According to the UN’s last IPCC report, their scientists now claim they are virtually certain that we humans are primarily responsible for the past 17 years of flat global temperatures thanks to our record high atmospheric CO2 emissions.
Not only that, it seems that we have influenced the lack of increase in the strength or frequency of landfall hurricanes in the world’s five main hurricane basins over the past 50 to 70 years, and the lack of increase in the strength or frequency of tropical Atlantic hurricane development during the past 370 years. We’re also responsible for the longest U.S. period ever recorded without an intense Category 3 to 5 hurricane landfall, and for no trend since 1950 of any increased frequency of strong (F3-F5) U.S. tornadoes.
Obviously we’re doing something right.
Then there’s the matter of those rising oceans. Remember when presidential candidate Obama declared during his June 8, 2008 victory speech as Democratic Party nominee that his presidency will be “the moment when the rise of the oceans began to slow and our planet began to heal”?
If you think his red line in Syria was a big deal, you’ve certainly got to give him credit for drawing an ambitious line in the sand on that one. And even without Putin’s help, he’s held to that promise. Just as he said, the sea level rise hasn’t accelerated at all since the time he assumed control. Nope, it’s still increasing at the rate of about seven inches per century, just as it has ever since that Little Ice Age.
Granted, all this climate stuff is really complicated, and appearances of events can be deceiving. Imagine, for example, the view from Al Gore’s new $9 million Montecito, California oceanfront villa, or from John Kerry’s yacht. It’s probably very difficult to discern whether the sky is falling or the ocean is rising.
Whichever the case, after subtracting subsidence caused by human water depletion from coastal water level changes, that sinking, added to a sea rise of seven-plus inches per century, clearly does present a serious problem in some locations. The question, however, is how much help the EPA can be expected to provide in solving it.
Here’s a thought. What if the federal government got out of the business of using our taxes to subsidize cheap flood insurance in vulnerable low-lying areas which encourages irresponsible coastal development in the first place?
Yeah, I know. That’s a pretty radical idea.
Regardless, undaunted by reality, the EPA is determined to protect us, come hell and/or high water. It’s termed the “precautionary principle”, where regulatory economic costs and impacts aren’t their concern. Don’t believe me?
In 2011, the American Council for Capital Formation estimated that the new EPA regulations will result in 476,000 to 1,400,000 lost jobs by the end of 2014. Management Information Services, Inc. predicted that up to 2.5 million jobs will be sacrificed, annual household income could decrease by $1,200, and gasoline and residential electricity prices may increase 50% by 2030. The Heritage Foundation projects that the greenhouse gas regulations will cost nearly $7 trillion (2008 dollars) in economic output by 2029.
Yet EPA representatives have maintained that considerations regarding such regulatory economic and employment impacts fall outside the agency’s purview. Responding in a letter to a question raised by Rep. Vicky Hartzler (R-Mo.), then-EPA Assistant Administrator Gina McCarthy (who now heads the agency) was very clear on this point, stating “Under the Clean Air Act, decisions regarding the National Ambient Air Quality Standards (NAAQS) must be based solely on evaluation of the scientific evidence as it pertains to health and environmental effects. Thus, the agency is prohibited from considering costs in setting the NAAQS.”
The U.S. Government Accountability Office has stated that it can’t figure out what benefits taxpayers are getting from the many billions of dollars spent each year on policies that are purportedly aimed at addressing climate change. Another 2011 GAO report noted that while annual federal funding for such activities has been increasing substantially, there is a lack of shared understanding of strategic priorities among the various responsible agency officials. This assessment agrees with the conclusions of a 2008 Congressional Research Service analysis which found no “overarching policy goal for climate change that guides the programs funded or the priorities among programs.”
As noted by H. Sterling Burnett, a senior fellow at the National Center for Policy Analysis (NCPA), “The EPA is in the process of codifying a whole slate of new air quality rules, the sheer number and economic impact of which have not been seen at any time in the EPA’s history.”
EPA’s relentless regulatory war is centrally targeted against fossil energy, applying permitting authority claimed through an “Endangerment Finding” under the auspices of the Clean Air Act. That finding concluded that current and projected atmospheric concentrations of six greenhouse gases (including CO2) “threaten the public health and welfare of current and future generations.”
And how was that finding determined? Perhaps you correctly guessed that it was based upon global warming crisis projections put forth by the UN’s IPCC.
Yet that IPCC alarmism which Administrator Lisa Jackson admitted the agency’s finding was based upon was refuted at the time by EPA’s own “Internal Study on Climate” report conclusions. That report, authored by my friend Alan Carlin, a senior research analyst at EPA’s National Center for Environmental Economics, stated: “...given the downward trend in temperatures since 1998 (which some think will continue until at least 2030), there is no particular reason to rush into decisions based upon a scientific hypothesis that does not appear to explain most of the available data.”
The EPA’s latest climate battle plan is to prohibit construction of new coal-fired power plants that can’t achieve 1,100 pound per megawatt hour carbon emission limits. To accomplish this will require plant operators to capture and store ("sequester") excess CO2, something that cannot be accomplished through affordable means, if at all. The Institute for Energy Research has estimated that this “regulatory assault” will eliminate 35 gigawatts of electrical generating capacity...10% of all U.S. power. As the Competitive Enterprise Institute observes, “If the carbon dioxide emissions standard for power plants proposed by the EPA today is enacted, the United States will have built its final coal-fired power plant.”
On September 18, Lisa Jackson’s replacement Gina McCarthy was invited to explain President Obama’s “Climate Plan” war on CO2 to members of the House Energy and Commerce Committee. I recently wrote about a notable exchange that took place between McCarthy and Rep. Mike Pompeo (R-Kan.) who asked her if EPA greenhouse gas regulations would be expected to impact any of 26 outcome indicators defined on their website.
At one point Pompeo asked: “Do you think it would be reasonable to take the regulations you promulgated and link them to those 26 indicators that you have on your website? That this is how they impacted us?”
McCarthy responded: “It is unlikely that any specific one step is going to be seen as having a visible impact on any of those impacts, a visible change in any of those impacts. What I’m suggesting is that climate change [policy] has to be a broader array of actions that the U.S. and other folks in the international community take that make significant effort towards reducing greenhouse gases and mitigating the impacts of climate change.”
In other words, the plan is to lead other nations off the same economic cliff America is rapidly approaching, with EPA heading the charge.
Oh, I nearly forgot. You’re probably wondering at this point where it is that I agree with President Obama regarding an uncritical view of EPA. Well, didn’t his administration determine with regard to the federal government shutdown that 93% of EPA employees weren’t critical...classified as non-essential?
So when the Big Fed starts up again, as I fear it will, let’s apply that precautionary principle to avoid further economic damage and just retain the 7% of the EPA that we apparently really need to keep our air, land, and water clean, and sequester the rest. With a huge national debt, isn’t that something we owe to ourselves?
Climate scientists are obsessed with carbon dioxide. The newly released Fifth Assessment Report of the Intergovernmental Panel on Climate Change (IPCC) claims that “radiative forcing” from human-emitted CO2 is the leading driver of climate change. Carbon dioxide is blamed for everything from causing more droughts, floods, and hurricanes, to endangering polar bears and acidifying the oceans. But Earth’s climate is dominated by water, not carbon dioxide.
Earth’s water cycle encompasses the salt water of the oceans, the fresh water of rivers and lakes, and frozen icecaps and glaciers. It includes water flows within and between the oceans, atmosphere, and land, in the form of evaporation, precipitation, storms and weather. The water cycle contains enormous energy flows that shape Earth’s climate, temperature trends, and surface features. Water effects are orders of magnitude larger than the feared effects of carbon dioxide.
Sunlight falls directly on the Tropics, where much energy is absorbed, and indirectly on the Polar Regions, where less energy is absorbed. All weather on Earth is driven by a redistribution of heat from the Tropics to the Polar Regions. Evaporation creates massive tropical storm systems, which move heat energy north to cooler latitudes. Upper level winds, along with the storm fronts, cyclones, and ocean currents of Earth’s water cycle, redistribute heat energy from the Tropics to the Polar Regions.
The Pacific Ocean is Earth’s largest surface feature, covering one-third of the globe and large enough to contain all of Earth’s land masses with area remaining. Oceans have 250 times the mass of the atmosphere and can hold over 1,000 times the heat energy. Oceans have a powerful, yet little understood effect on Earth’s climate.
Even the greenhouse effect itself is dominated by water. Between 75 percent and 90 percent of Earth’s greenhouse effect is caused by water vapor and clouds.
Yet the IPCC and today’s climate modelers propose that the flea wags the dog. The flea, of course, is carbon dioxide, and the dog is the water cycle. The theory of man-made warming assumes a positive feedback from water vapor, forced by human emissions of greenhouse gases.
The argument is that, since warmer air can hold more moisture, atmospheric water vapor will increase as Earth warms. Since water vapor is a greenhouse gas, additional water vapor is presumed to add additional warming to that caused by CO2. In effect, the theory assumes that the carbon cycle is controlling the more powerful water cycle.
But for the last 15 years, Earth’s surface temperatures have failed to rise, despite rising atmospheric carbon dioxide. All climate models predicted a rapid rise in global temperatures, in conflict with actual measured data. Today’s models are often unable to predict weather conditions for a single season, let alone long-term climate trends.
An example is Atlantic hurricane prediction. In May, the National Oceanic and Atmospheric Administration (NOAA) issued its 2013 hurricane forecast, calling for an “active or extremely active” hurricane season. At that time, NOAA predicted 7 to 11 Atlantic hurricanes (storms with sustained wind speeds of 74 mph or higher). In August NOAA revised their forecast down to 6 to 9 hurricanes. We entered October with a count of only two hurricane-strength storms. Computer models are unable to accurately forecast one season of Earth’s water cycle in just one region.
The IPCC and proponents of the theory of man-made warming are stumped by the 15-year halt in global surface temperature rise. Dr. Kevin Trenberth hypothesizes that the heat energy from greenhouse gas forcing has gone into the deep oceans. If so, score one for the power of the oceans on climate change.
Others have noted the prevalence of La Nina conditions in the Pacific Ocean since 1998. During 1975 to 1998, when global temperatures were rising, the Pacific experienced more frequent warm El Nino events than the cooler La Ninas. But the Pacific Decadal Oscillation (PDO), a powerful temperature cycle in the North Pacific Ocean, moved into a cool phase about ten years ago. With the PDO in a cool phase, we now see more La Nina conditions. Maybe more La Ninas are the reason for the recent flat global temperatures. But if so, isn’t this evidence that ocean and water cycle effects are stronger than the effects of CO2?
ICECAP NOTE: If the cool phase of the PDO caused the current warming pause and the cooling after 2002, doesn’t that also raise the question of whether the warming from 1979 to 1998 was due to the warm phase of the PDO and more El Ninos? BTW, the period from 1947 to 1978 was marked by cooling...and a cool PDO!
Geologic evidence from past ice ages shows that atmospheric carbon dioxide increases follow, rather than precede, global temperature increases. As the oceans warm, they release CO2 into the atmosphere. Climate change is dominated by changes in the water cycle, driven by solar and gravitational forces, and carbon dioxide appears to play only a minor role.
Steve Goreham is Executive Director of the Climate Science Coalition of America and author of the new book The Mad, Mad, Mad World of Climatism: Mankind and Climate Change Mania.
Clive James has made another of his intermittent forays into the climate debate. In the course of a review of Brian Cox’s Science Britannica programme he had this to say:
Fronting Science Britannica on BBC Two, Professor Cox visited the Royal Society and Bletchley Park in his quest for examples of the scientific method. Finally he dropped in on the Royal Institution, where he and the editor of Nature puzzled together, but not very hard, over how there has come to be an “overwhelming scientific consensus” favouring the concept of dangerous man-made global warming.
Neither of them asked what kind of scientific consensus it was if, say, Freeman Dyson of the Institute for Advanced Study in Princeton declined to join it. Isn’t the overwhelming scientific consensus really just a consensus between climate scientists, and therefore no more impressive than the undoubted fact that one hundred percent of gymnasium attendants believe that regular exercise is vital to longevity?
I think James is mistaken actually. The overwhelming scientific consensus is, as shown by Cook et al, nothing more noteworthy than the everyday observations that carbon dioxide is a greenhouse gas and that increasing concentrations will make the planet warmer; the “dangerous” bit is unwarranted extrapolation. And as readers at BH are aware, the Royal Society heard a vigorous debate last week over the strength of aerosols’ influence on the climate, something that is critical to determining to what extent global warming is “dangerous”.
Nevertheless James’ remarks seem to have provoked the ire of the usual suspects:
Simon Singh: Sad to see Clive James buying into climate contrarians’ propaganda
Jim Al-Khalili: Shame his clever prose wasted on drivel
Tamsin (who I would not classify as a suspect, usual or otherwise) meanwhile seems to have done a bit of a reanalysis of the article and concluded that James has decided that climate scientists have ulterior motives. This looks as though it’s going to result in a letter of protest direct to James and possibly an open letter too.
It’s all a bit absurd if you ask me. James has observed, not unreasonably, that there are eminent people who think that the global warming thing is overdone. In similarly uncontroversial terms he has drawn attention to the fact that people, including even scientists, respond to economic incentives. That scientists have an economic incentive to find evidence in favour of global warming being a problem is undeniable. Every single man jack of the climatological community is engaged in that field because they have weighed the financial and non-financial benefits against alternative employments and have decided that climate science is what they want to do. While Tamsin says that climate scientists could get better-paid employment elsewhere, we know in fact that every climate scientist thinks the non-financial benefits of their field outweigh the financial disadvantages.
This doesn’t mean that global warming is a scam or that climatologists are all crooks; just that they do have an incentive. This is why Clive James is right to apply at least some kind of a discount to their opinions and to take heed, at least to some extent, of the “contrarian voices”; the ones at which the Simon Singhs of this world hurl their invective and which others strive so hard to silence.
Two Santa Barbara brothers accused of violating federal laws related to a no-fishing zone off San Miguel Island beat the charges in late August when a federal judge determined that the government presented insufficient evidence to prove the crime. The decision highlighted deficiencies in the vessel monitoring system (VMS) used by the National Oceanic and Atmospheric Administration (NOAA) to watch fishing boats and enforce the rules surrounding marine protected areas (MPAs), prompting an ongoing review of the system with changes likely on the way.
For longtime fishermen Jason and Shane Robinson, the decision saved them from paying more than $17,000 in fines, which is a relatively low amount compared to other penalties, in part because they were only charged with idling in an MPA too long, not for fishing there, which can bring fines of up to $140,000. But the case also revealed what they believe is an unfair culture of guilty until proved innocent when it comes to commercial fishing laws. “They threaten you based on the fact that it costs more to fight these than to accept a settlement,” said Jason. “That’s what they told me, and that’s how they did it. In my mind, this is their ATM machine.... It feels like extortion.”
The brothers were only able to fight the charges, which date back to a fishing trip they took on May 17, 2010, because attorney Rusty Brace of the Santa Barbara firm Hollister & Brace took on the case pro bono. Had he been tallying his time on this complicated matter, which he says the feds fought tooth and nail despite no hard evidence, the bill would have far exceeded the fines, costing perhaps as much as $80,000 when all was said and done.
Brace said that this was the first time he could find where the feds based their arguments solely on the VMS, which sends one signal per hour from every boat working the commercial ground fisheries of the West Coast to NOAA. Usually, said Brace, NOAA presents a witness or other evidence to bolster its charges. In this case, the Robinsons weren’t alerted to the fact that NOAA was going to charge them until 10 months after the alleged crime; at that point, they could not recall what they had been doing, but now believe they were probably crossing the MPA back and forth over the four to five hour period in question as they waited for their fishing gear to soak.
“It’s impossible to say what they did,” said Brace, which is basically what the judge determined, as well. Brace appreciates how difficult it must be for the government to watch the entire West Coast and the roughly 1,000 boats working the ground fisheries (the area close to the bottom where many fish swim), and he understands how VMS makes the monitoring somewhat possible, even if he finds the constant tracking somewhat oppressive. “It’s a great way to identify suspect behavior, but it’s not a viable way to prove a case,” said Brace. “You have to have other corroborating evidence.”
The government, meanwhile, is standing by its case. “NOAA believes it presented sufficient evidence to find by a preponderance of the evidence that the respondent committed the violations charged, but accepts the Administrative Law Judge’s finding to the contrary,” explained John Thibodeau, a communications specialist in NOAA’s Office of Law Enforcement. “Neither NMFS [National Marine Fisheries Service] nor its enforcement partners have the resources to effectively monitor the various restricted fishing areas off the California coast, so we must rely on the data provided by VMS to determine the activity of fishing vessels at sea.”
Due to the judgment, adjustments may be on the way. “NMFS is currently reviewing what changes, if any, will be required in light of the Initial Decision,” Thibodeau explained in an email. One idea, which will be discussed at the next Pacific Fishery Management Council meeting in November in Costa Mesa, is to increase the frequency of the VMS signaling to every 15 minutes rather than once an hour. The increased costs would likely be passed down to fishermen, though, who already pay about $50 a month for the VMS and could see that cost multiplied with the additional signals. “That could go up to $150 or $200 a month,” said Jason, who plans to speak at the upcoming hearing. “That’s pretty significant for us.”
Though he’s happy to have prevailed, Jason described the whole situation as “disheartening” and believes he would have been “steamrolled by the government” if it weren’t for the unpaid efforts of Brace, who has known the brothers’ father for years. “My brother and I spend a great deal of time researching the ever-changing regulations and have no intention of violating any regulations made,” said Jason. “There is no one out there who wants a healthy viable resource more than the people who depend on it to feed their family.”
Figure 1.4 of the Second Order Draft clearly showed the discrepancy between models and observations, though IPCC’s covering text reported otherwise. I discussed this in a post leading up to the IPCC Report, citing Ross McKitrick’s article in National Post and Reiner Grundmann’s post at Klimazwiebel. Needless to say, this diagram did not survive. Instead, IPCC replaced the damning (but accurate) diagram with a new diagram in which the inconsistency has been disappeared.
Here is Figure 1.4 of the Second Order Draft, showing post-AR4 observations outside the envelope of projections from the earlier IPCC assessment reports (see previous discussion here).
Figure 1. Second Order Draft Figure 1.4. Yellow arrows show digitization of cited Figure 10.26 of AR4.
Now here is the replacement graphic in the Approved Draft: this time, observed values are no longer outside the projection envelopes from the earlier reports. IPCC described it as follows:
Even though the projections from the models were never intended to be predictions over such a short time scale, the observations through 2012 generally fall within the projections made in all past assessments.
So how’d the observations move from outside the envelope to inside the envelope? It will take a little time to reconstruct the movements of the pea.
In the next figure, I’ve shown a blow-up of the new Figure 1.4 on a timescale (1990-2015) comparable to the Second Draft version. The scale of the Second Draft showed the discrepancy between models and observations much more clearly. I do not believe that IPCC’s decision to use a more obscure scale was accidental.
Figure 3. Detail of Figure 1.4 with annotation. Yellow dots: HadCRUT4 annual (including YTD 2013).
First and most obviously, the envelope of AR4 projections is completely different in the new graphic. The Second Draft had described the source of the envelopes as follows:
The coloured shading shows the projected range of global annual mean near surface temperature change from 1990 to 2015 for models used in FAR (Scenario D and business-as-usual), SAR (IS92c/1.5 and IS92e/4.5), TAR (full range of TAR Figure 9.13(b) based on the GFDL_R15_a and DOE PCM parameter settings), and AR4 (A1B and A1T). ...
The [AR4] data used was obtained from Figure 10.26 in Chapter 10 of AR4 (provided by Malte Meinshausen). Annual means are used. The upper bound is given by the A1T scenario, the lower bound by the A1B scenario.
The envelope in the Second Draft figure can indeed be derived from AR4 Figure 10.26. In the next figure, I’ve shown the original panel of Figure 10.26 with observations overplotted, clearly showing the discrepancy. I’ve also shown the 2005, 2010 and 2015 envelope with red arrows (which I’ve transposed to other diagrams for reference). That observations fall outside the projection envelope of the AR4 figure is obvious.
The new IPCC graphic no longer cites an AR4 figure. Instead of the envelope presented in AR4, they now show a spaghetti graph of CMIP3 runs, of which they state:
For the AR4 results are presented as single model runs of the CMIP3 ensemble for the historical period from 1950 to 2000 (light grey lines) and for three scenarios (A2, A1B and B1) from 2001 to 2035. The bars at the right hand side of the graph show the full range given for 2035 for each assessment report. For the three SRES scenarios the bars show the CMIP3 ensemble mean and the likely range given by -40% to +60% of the mean as assessed in Meehl et al. (2007). The publication years of the assessment reports are shown. See Appendix 1. A for details on the data and calculations used to create this figure.
The temperature projections of the AR4 are presented for three SRES scenarios: B1, A1B and A2.
Annual mean anomalies relative to 1961 to 1990 of the individual CMIP3 ensemble simulations (as used in AR4 SPM Figure SPM5) are shown. One outlier has been eliminated based on the advice of the model developers because of the model drift that leads to an unrealistic temperature evolution. As assessed by Meehl et al. (2007), the likely-range for the temperature change is given by the ensemble mean temperature change +60% and -40% of the ensemble mean temperature change. Note that in the AR4 the uncertainty range was explicitly estimated for the end of the 21st century results. Here, it is shown for 2035. The time dependence of this range has been assessed in Knutti et al. (2008). The relative uncertainty is approximately constant over time in all estimates from different sources, except for the very early decades when natural variability is being considered (see Figure 3 in Knutti et al., 2008).
For the envelopes from the first three assessments, although they cite the same sources as the predecessor Second Draft Figure 1.4, the earlier projections have been shifted downwards relative to observations, so that the observations are now within the earlier projection envelopes. You can see this relatively clearly with the Second Assessment Report envelope: compare the two versions. At present, I have no idea how they purport to justify this.
None of this portion of the IPCC assessment is drawn from peer-reviewed material. Nor is it consistent with the documents sent to external reviewers.
The first, and arguably most important, part of the latest IPCC Assessment Report (AR5) will be released next week in Stockholm (IPCC). This is the report from Working Group 1, charged with evaluating the current state of knowledge on the physical science. The reports from WG2 (Impacts, Adaptation and Vulnerability) and WG3 (Mitigation) will follow next Spring, with the final Synthesis Report being launched in October in Copenhagen (host city to the ill-fated 2009 COP15 conference).
Despite attempts to control the spread of information, the blogosphere has inevitably been full of leaks and previews of what the reports will say. The WG1 report is the most important because the conclusions it comes to shape the entire exercise. If the authors were to conclude that Mankind’s contribution to the present phase of climate change was less important than previously thought, then the world would pay much less attention to the need to mitigate.
But it is clear that the opposite is true; the crucial statement from the current version of the Summary for Policymakers (the SPM, the only bit which most people actually read) is “It is extremely likely that human influence on climate caused more than half of the increase in global average surface temperature from 1951-2010.” The important thing to note here is that this is an increase in confidence since AR4, despite the trend in warming having fallen to the bottom end of the range predicted earlier (and below this by some estimations).
Rather naively, the IPCC leaked the SPM to ‘friendly’ journalists in an attempt to manage the news of the report’s launch. However, inevitably it ended up in less friendly hands and has come in for some fierce criticism. There is no room here to do justice to the ongoing debate which is now rising to a crescendo, but readers who want to get a flavour of it could do worse than read these blog postings: Leaked IPCC report discussed in the MSM and Excerpts from the leaked AR5 Summary for Policy Makers. The first is from Judith Curry, a thoughtful scientist who supports the enhanced greenhouse effect hypothesis but is often critical of the IPCC and some of its more enthusiastic supporters. The second is by Anthony Watts, a retired meteorologist and author of a sceptical but reasoned blog.
Judith Curry makes the point that, in light of the accrual of evidence over the five years since the publication of AR4, the increased confidence is unjustified (remember that the confidence levels are a matter of subjective judgement; there is no objective metric used). In her words “An increase in confidence in the attribution statement, in view of the recent pause and the lower confidence level in some of the supporting findings, is incomprehensible to me. Further, the projections of 21st century changes remain overconfident.”
This topic is vitally important for the future of all of us. If the IPCC’s confidence is justified then effective mitigation measures should be given high priority. The debate then moves from what the problem is to how best to solve it, whether by making the most cost-effective reductions in carbon dioxide emissions now (almost certainly with a large element of nuclear power), concentrating on adaptation or simply waiting until future generations have new, economic energy generating technologies which do not use fossil fuels. The IPCC seems to be trying to move the debate on and once again persuade leaders that ‘the science is settled’.
If, on the other hand, the WG1 authors’ confidence is misplaced, then the case for rapid and radical decarbonisation is undermined. The fact that there is no end in sight to China and India’s escalating use of coal might already be seen as making current policies futile, but it has not stopped the EU and a few other enthusiastic countries from imposing high costs on their own economies to reduce their own use of fossil fuels. If the IPCC’s apparent certainty can be shown to be unjustified, then pragmatic politicians are going to have to start questioning their emissions reduction policies.
To add to this mix, there is a recent report of further work by the Danish team led by Henrik Svensmark on the influence of the Sun’s magnetic field on cloud formation initiated by cosmic rays: Physicists claim further evidence of link between cosmic rays and cloud formation. What the latest experiments have shown is that the very fine aerosol particles produced by ionising radiation can aggregate in the presence of sulphuric acid (produced under the influence of ultraviolet light) to produce nuclei large enough to initiate cloud formation.
By itself, this is of course not enough to demonstrate that cosmic rays, mediated by the Sun’s changing magnetic field, have a significant effect on global temperatures relative to the forcing effect of atmospheric carbon dioxide. However, the evidence which is building cannot be ignored. This does not stop this hypothesis being effectively dismissed by the current climate change Establishment, who have consistently said that such an effect could only be minor. In this report for example, Gavin Schmidt, a leading spokesman, said “The researchers have a really long way to go before they can convince anyone that this is fundamental to climate change.”
This is undoubtedly true, but we should not forget that the entire edifice of climate change policy currently rests on the output of computer models based on the hypothesis that there are positive feedback mechanisms which increase the modest warming impact of higher CO2 levels. There is currently no empirical support for this and the temperature trends for the early part of the 21st Century are now incompatible with the projections from the models. It would be foolhardy to ignore hard evidence of other effects, even if IPCC scientists are dubious. Our future prosperity may depend on it.
"Rick Perry leaves a trail of death.” So reads the headline in a fake weather report, part of a new campaign to name hurricanes after noted climate change skeptics. The group, 350.org, hopes that associating politicians with destructive storms will make them more willing to enact restrictions on carbon emissions as a means of fighting global warming.
The campaign is tasteless, but it helps to highlight an otherwise largely overlooked fact: Hurricanes have been largely absent this year.
For the first time in 11 years, August came and went without a single hurricane forming in the Atlantic. The last intense hurricane (Category 3 or above) to hit the United States was Hurricane Wilma, in 2005. According to Phil Klotzbach, head of Colorado State University’s seasonal hurricane forecast, accumulated cyclone energy is 70 percent below normal this year.
Hurricanes have become a major part of the public relations campaign for radical action on climate change. After Hurricane Sandy hit the Eastern Seaboard last fall, the left quickly dubbed it a “Frankenstorm,” and nearly fell over itself attempting to claim that the intensity of the storm was a result of greenhouse gas emissions.
That’s not so surprising. Despite decades of effort, the environmental movement has largely failed to persuade the American public to accept the draconian restrictions that stopping climate change would entail, and linking hurricanes to climate change may be their best chance to change all that.
A look at the science, however, tells a somewhat different story. While the overall number of recorded hurricanes has increased since 1878 (when existing records begin), this is at least partly due to an improved ability to observe storms rather than an increase in the number of storms.
As Thomas Knutson of the National Oceanic and Atmospheric Administration noted recently, “the rising trend in Atlantic tropical storm counts is almost entirely due to increases in short-duration (less than 2-day) storms alone [which were] particularly likely to have been overlooked in the earlier parts of the record, as they would have had less opportunity for chance encounters with ship traffic.” As such, “the historical Atlantic hurricane record does not provide compelling evidence for a substantial greenhouse warming induced long-term increase.”
Similarly, the increase in damages from storms over time has less to do with their increased frequency or intensity than with the fact that we have gotten richer. Had Hurricane Sandy swept through New Jersey 100 years ago, it would have done far less damage simply because, back then, there was less of value to destroy. These days Americans are not only wealthier, but we are more inclined to build closer to the water, due to subsidized flood insurance. When University of Colorado professor Roger Pielke looked at the numbers, he found that correcting for these factors completely eliminated the supposed increase in hurricane damage.
Unsurprisingly, then, a leaked draft of the Fifth Assessment Report of the United Nations’ Intergovernmental Panel on Climate Change (due to be released later this month) downgraded the likelihood of a connection between past temperature rises and extreme weather events. According to the report, there is “low confidence” in any association between climate change and hurricane frequency or intensity.
The U.N. panel could, of course, be wrong. Congress recently held hearings examining the science behind climate change claims, and should continue to do so. In this case, however, the attempt to slander climate change skeptics by linking them to today’s storms is scientifically flawed, to say the least.
Whenever a climate change conference is greeted by a record snowfall or cold snap, environmentalists are quick to point out that weather is not the same as climate. Yet when it comes to storms, many have been willing to fall into exactly the same trap.
Neeley is a policy analyst with the Texas Public Policy Foundation, an Austin-based nonprofit, free-market research institute.
See a full detailed scientific critique of the IPCC AR5 here.
The IPCC has retreated from at least 11 alarmist claims promulgated in its previous reports or by scientists prominently associated with the IPCC. The SPM also contains at least 13 misleading or untrue statements, and 11 further statements that are phrased in such a way that they mislead readers or misrepresent important aspects of the science.
Environmentalists hoped the latest study from the United Nations’ Intergovernmental Panel on Climate Change (IPCC) would finally end the increasingly acrimonious debate over the causes and consequences of climate change. It has had the opposite effect.
MIT physicist Richard Lindzen called the IPCC report “hilarious incoherence.” British historian Rupert Darwall labeled it “nonsense” and “the manipulation of science for political ends.” Judith Curry of the Georgia Institute of Technology says the IPCC suffers from “paradigm paralysis” and should be “put down.”
The most precise criticism of the IPCC’s report came from the editors of Nature, one of the world’s most distinguished science journals: “Scientists cannot say with any certainty what rate of warming might be expected, or what effects humanity might want to prepare for, hedge against or avoid at all costs.”
Despite decades of research funded by taxpayers to the tune of billions of dollars, we are no more certain about the impact of man-made greenhouse gases than we were in 1990, or even in 1979 when the National Academy of Sciences estimated the effect of a doubling of carbon dioxide to be “near 3 degrees C with a probable error of plus or minus 1.5 degrees C.”
The lower end of that range, which is where the best research on the likely sensitivity of climate to carbon dioxide lands, is well within the bounds of natural variability.
Significantly, the IPCC has backed down from its previous forecasts of increases in droughts and hurricanes. And it admits, but does not explain, why no warming has occurred for the past 15 years.
Due to its charter and sheer bureaucratic momentum, the IPCC is compelled to claim it is more confident than ever in its alarmist predictions, even as real-world evidence falsifies them at every turn. Policymakers and the public have no reason to believe this discredited oracle.
It’s time to start listening to other voices in the debate, such as the 50-some scientists who make up the Nongovernmental International Panel on Climate Change (NIPCC).
According to its latest report, “the IPCC has exaggerated the amount of warming likely to occur if the concentration of atmospheric carbon dioxide were to double, and such warming as occurs is likely to be modest and cause no net harm to the global environment or to human well-being.”
Joseph L. Bast is president of The Heartland Institute, a Chicago-based research organization that advocates free-market policies.
"This report and the build-up to it is a carefully choreographed self-referencing political game by Climate Change parasites which contains nothing of substance and is constructed to conceal the facts:
ALL THE DIRE PREDICTIONS of the CO2 warmists since 2000 have failed.
THE “ADMISSION” of ‘a pause in warming’ over the last 15 years is itself a cover-up for the fact that ONLY THEIR FRAUDULENT DATA shows any ‘warming’ at all in the period.
THEIR CLAIM that this pause was “something we (CO2 warmists) expected” is a brazen lie. They expected ‘runaway warming.’
THEIR STATEMENT that the world has warmed over the last 30 years or so is merely an expression of the natural solar-lunar 60yr cycle of temperatures (and Pacific circulation) explained by WeatherAction in 2008 and nothing to do with CO2.
THEIR CLAIM that alleged CO2 warming due to a small rise in the atmospheric concentration (0.04%) of the trace gas, CO2, is somehow hidden in the deep ocean is scientific cretinism beyond reason, fact or observation.
THE CO2 “theory” has no predictive power in weather or climate; while all its dire warnings have failed and its supposed scientific basis has been shown to be lacking, the prognoses of the EVIDENCE-BASED Solar-Lunar science of WeatherAction and others over the last 7 years have been vindicated.
“IN THE NAME OF SCIENCE THE UN IPCC and all it stands for must be destroyed.”
ADD: This quote from the great H.L. Mencken captures perfectly the religious nature of those in the climate cult:
“The essence of science is that it is always willing to abandon a given idea, however fundamental it may seem to be, for a better one; the essence of theology is that it holds its truths to be eternal and immutable.”
The Arctic Ocean is warming up, icebergs are growing scarcer and in some places the seals are finding the water too hot, according to a report to the Commerce Department yesterday from Consul Ifft, at Bergen, Norway.

Reports from fishermen, seal hunters, and explorers all point to a radical change in climate conditions and hitherto unheard-of temperatures in the Arctic zone. Exploration expeditions report that scarcely any ice has been met as far north as 81 degrees 29 minutes. Soundings to a depth of 3,100 meters showed the Gulf Stream still very warm. Great masses of ice have been replaced by moraines of earth and stones, the report continued, while at many points well known glaciers have entirely disappeared.

Very few seals and no white fish are found in the eastern Arctic, while vast shoals of herring and smelts, which have never before ventured so far north, are being encountered in the old seal fishing grounds. Within a few years it is predicted that due to the ice melt the sea will rise and make most coastal cities uninhabitable.
November 2, 1922, as reported by the AP and published in The Washington Post - 90+ years ago.
Warmist Kevin Drum on selling the global warming hoax: “...anecdotal evidence (mild winters, big hurricanes, wildfires, etc.) is probably our best bet. We should milk it for everything it’s worth” H/T Tom Nelson.
See Dr. Doug Hoyt’s Greenhouse Scorecard on Warwick Hughes site here.
From Jack Black’s Climate Change Dictionary
PEER REVIEW: The act of banding together a group of like-minded academics with a funding conflict of interest, for the purpose of squeezing out any research voices that threaten the multi-million dollar government grant gravy train.
SETTLED SCIENCE: Betrayal of the scientific method for politics or money or both.
DENIER: Anyone who suspects the truth.
CLIMATE CHANGE: What has been happening for billions of years, but should now be flogged to produce ‘panic for profit.’
NOBEL PEACE PRIZE: Leftist Nutcase Prize, unrelated to “Peace” in any meaningful way.
DATA, EVIDENCE: Unnecessary details. If anyone asks for this, see “DENIER,” above.
CLIMATE SCIENTIST: A person skilled in spouting obscure, scientific-sounding jargon that has the effect of deflecting requests for “DATA” by “DENIERS.” Also skilled at affecting an aura of “Smartest Person in the Room” to buffalo gullible legislators and journalists.
JUNK SCIENCE: The use of invalid scientific evidence resulting in findings of causation which simply cannot be justified or understood from the standpoint of the current state of credible scientific or medical knowledge.
Speaking of junk science, see Lubos Motl’s excellent point by point counter to the John Cook 104 talking points document attacking the skeptical science here.
NOTE: Heartland has the presentations and PowerPoints posted for the Heartland ICCC IV. If you could not go, there is plenty to see there. Please remember the goldmine of videos and PPTs at the Heartland ICCC proceedings sites for 2008 NYC here, 2009 NYC here and 2009 DC here. Here is a PPT I gave at the Heartland Institute ICCC Meeting in 2008 and here is the follow-up in 2009. Here is an abbreviated PPT in two parts I presented at a UK conference last month: Part 1, Part 2.
See C3 Headlines excellent collection of graphs and charts that show AGW is nonsense here.
See Climate Theater with a collection of the best climate skeptic films and documentaries here. See additional scientific youtubes here.
“The above papers support skepticism of “man-made” global warming or the environmental or economic effects thereof. Addendums, comments, corrections, erratum, replies, responses and submitted papers are not included in the peer-reviewed paper count. These are included as references in defense of various papers. There are many more listings than just the 900-1000 papers. Ordering of the papers is alphabetical by title except for the Hockey Stick, Cosmic Rays and Solar sections, which are chronological. This list will be updated and corrected as necessary.”
The less intelligent alarmists have written a paper allegedly connecting the scientists to Exxon Mobil. Here is the detailed response from some of the featured scientists. Note that though this continues to be a knee-jerk reaction by some of the followers, there is no funding of skeptic causes by big oil, BUT Exxon has funded Stanford warmists to the tune of $100 million and BP has funded UC Berkeley to the tune of $500 million. Climategate emails showed CRU/Hadley soliciting oil dollars and receiving $23 million in funding.
Many more papers are catalogued at Pete’s Place here.
The science and economics of global warming are not too complicated for the average person to consider and make up his or her own mind. We urge you to do that. Go here and view some of the articles linked under “What’s New” or “A Primer on Global Warming.” Or go here and read about the new report from the Nongovernmental International Panel on Climate Change (NIPCC), which comprehensively rebuts the claims of the United Nation’s Intergovernmental Panel on Climate Change (IPCC). Go here for the sources for the factual statements in the ads.
See the ICECAP Amazon Book store. Icecap benefits with small commission for your purchases via this link.
Also available now some items that will gore your alarmist friends (part of the proceeds go to support Icecap):
The Weather Wiz here. See how they have added THE WIZ SCHOOL (UPPER LEFT) to their website. An excellent educational tool for teachers at all class levels. “Education is the kindling of a flame, not the filling of a vessel” - Socrates (470-399 BC)