The most significant uncertainties that must be dealt with to properly analyze temperature trends are detailed below. We also touch on alarmist claims that are examined in greater detail in the sections of this document summarized in the January 21st blog post under Joe's Blog.
-------------------
NOAA and NASA can be counted on virtually every month or year's end to religiously and confidently proclaim that the latest global average surface temperature (GAST) is among the warmest on record. Back in the 1970s, when an assessment of global temperature was first attempted, scientists recognized that even land-only surface temperature data posed a significant challenge, given that most of the reliable data were limited to populated areas of the U.S., Europe and eastern China, with only spotty, often intermittent data from vast land areas elsewhere.
Temperatures over the oceans, which cover 71% of the globe, were measured erratically and with varying methods, mainly along shipping routes in the Northern Hemisphere. Despite these shortcomings, and the fact that no credible grid-level temperature data existed from 1880 to 2000 over the Southern Hemisphere's oceans (which cover 80.9% of the Southern Hemisphere), NOAA and NASA began estimating and publishing global average surface temperature data in the early 1990s.
In this era of ever-improving technology and data systems, one would assume that measurements would be constantly improving. This is not the case with the global observing network. The world’s surface observing network had reached its golden era in the 1960s to 1980s, with more than 6,000 stations providing valuable climate information.
DATA DROPOUT
The number of weather stations providing data to GHCN plunged in 1990 and again in 2005 (as stations in the oversampled lower 48 states were thinned out). The sample size has fallen by over 75% from its peak in the early 1970s, and is now smaller than at any time since 1919. The collapse in sample size has increased the relative fraction of data coming from airports to 49 percent (up from about 30 percent in the 1970s). It has also reduced the average latitude of source data and removed relatively more high-altitude monitoring sites (McKitrick 2010).
We could show many regional or country examples, but here is one: Canada. NOAA's GHCN used only 35 of the 600 Canadian stations in 2009. Verity Jones plotted the stations from the full network (rural, semi-rural and urban) for Canada and the northern United States, both in 1975 and again in 2009, marking with diamonds the stations used in each year. Notice the good coverage in 1975 and the very poor coverage, virtually all in the south, in 2009. Notice also the lack of station coverage in the higher-latitude Canadian regions and the Arctic in 2009.
Canadian stations used in annual analyses in 1975 and 2009 (source: Verity Jones from GHCN).
Just one thermometer remains in the database for Canada for everything north of the 65th parallel. That station is Eureka, which has been described as “The Garden Spot of the Arctic” thanks to the flora and fauna abundant around the Eureka area, more so than anywhere else in the High Arctic. Winters are frigid but summers are slightly warmer than at other places in the Canadian Arctic.
Environment Canada reported in the National Post that there are 1,400 stations in Canada, with 100 north of the Arctic Circle, where GHCN includes just one.
MISSING MONTHLY DATA
After the 1980s, the network suffered not only from a loss of stations but an increase in missing monthly data. To fill in these large holes, data were extrapolated from greater distances away.
Forty percent of GHCN v2 stations have at least one missing month; the figure reached 90% in Africa and South America.
According to the World Meteorological Organization’s own criteria, followed by the NOAA’s National Weather Service, temperature sensors should be located on the instrument tower at 1.5 m (5 feet) above the surface of the ground. The tower should be on flat, horizontal ground surrounded by a clear surface, over grass or low vegetation kept less than 4 inches high. The tower should be at least 100 m (110 yards) from tall trees, or artificial heating or reflecting surfaces, such as buildings, concrete surfaces, and parking lots.
Very few stations meet these criteria. The modernization of weather stations in the United States replaced many human observers with instruments that initially had warm biases (HO-83) and later cold biases (MMTS), or that were designed for aviation and were not suitable for precise climate trend detection [the Automated Surface Observing System (ASOS) and the Automated Weather Observing System (AWOS)]. Note that the specifications required an RMSE of 0.8°F and a maximum error of 1.9°F. ASOS was designed to supply key information for aviation such as ceiling, visibility, wind, and indications of thunder and icing. It was not designed for assessing climate.
Also, the new instrumentation was increasingly installed on unsuitable sites that did not meet the WMO’s criteria. During recent decades there has been a migration away from old instruments read by trained observers. These instruments were generally in shelters that were properly located over grassy surfaces and away from obstacles to ventilation and heat sources.
Today we have many more automated sensors (the MMTS) located on poles cabled to an electronic display in the observer's home or office, or at airports near the runway where the primary mission is aviation safety.
The installers of the MMTS instruments were often equipped with nothing more than a shovel, working on a tight schedule and with little budget. They often encountered paved driveways or roads between the old sites and the buildings, and in many cases were forced to settle for installing the instruments close to the buildings, violating the government specifications in this or other ways.
Pielke and Davey (2005) found that a majority of stations, including climate stations in eastern Colorado, did not meet WMO requirements for proper siting. They extensively documented poor siting and land-use change issues in numerous peer-reviewed papers, many summarized in the landmark paper "Unresolved issues with the assessment of multi-decadal global land surface temperature trends" (2007).
In a volunteer survey project, Anthony Watts and his more than 650 volunteers at www.surfacestations.org found that over 900 of the first 1,067 stations surveyed in the 1,221-station U.S. climate network did not come close to meeting the Climate Reference Network (CRN) criteria. Ninety percent were sited in ways that result in errors exceeding 1°C according to the CRN handbook.
Only about 3% met the ideal specification for siting. They found stations located next to the exhaust fans of air conditioning units, surrounded by asphalt parking lots and roads, on blistering-hot rooftops, and near sidewalks and buildings that absorb and radiate heat. They found 68 stations located at wastewater treatment plants, where the process of waste digestion causes temperatures to be higher than in surrounding areas. In fact, they found that 90% of the stations fail to meet the National Weather Service’s own siting requirements that stations must be 30 m (about 100 feet) or more away from an artificial heating or reflecting source.
The average warm bias for inappropriately sited stations exceeded 1°C using the National Weather Service's own criteria, with which the vast majority of stations did not comply.
In 2008, Joe D’Aleo asked NOAA’s Tom Karl about the problems with siting and about the plans for a higher-quality Climate Reference Network (CRN, at that time called NERON). Karl said he had presented the case for a more complete CRN network to NOAA, but NOAA said it was unnecessary because it had invested in the more accurate satellite monitoring. The Climate Reference Network was capped at 114 stations and did not provide meaningful trend assessment for about 10 years. Here is the latest monthly time series, now 15 years long.
By the way, in the monthly press releases no satellite measurements are ever mentioned, although NOAA claimed they were the future of observations.
URBANIZATION/LAND USE CHANGES
The biggest obstacle to accurate measurement, though, is urbanization. Bad siting usually enhances the warming effect. Weather data from cities, as collected by meteorological stations, are indisputably contaminated by urban heat-island bias and land-use changes. This contamination has to be removed or adjusted for in order to accurately identify true background climatic changes or trends.
In cities, vertical walls, steel and concrete absorb the sun’s heat and are slow to cool at night. In surrounding suburban areas (often where airports are located), commercialization and increased population density raise nighttime temperatures relative to the surrounding rural areas. More and more of the world is urbanized (population increased from 1.5 billion in 1900 to over 7.1 billion today).
The EPA depicts the typical temperature distribution from city center to rural areas, similar to the observed minimum temperature analysis surrounding London in mid-May (about a 10°F difference is shown).
Oke (1973) found that a village with a population of 10 has a warm bias of 0.73°C, a village of 100 a warm bias of 1.46°C, a town of 1,000 people a warm bias of 2.2°C, and a large city of a million people a warm bias of 4.4°C.
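The figures quoted are consistent with a logarithmic dependence of warm bias on population, roughly 0.73°C per factor of ten in population. A minimal sketch, assuming that log-linear fit (my reading of the numbers above, not Oke's published regression verbatim):

```python
import math

def uhi_warm_bias_c(population: float) -> float:
    """Approximate urban heat-island warm bias (deg C) from population.

    Assumes the log-linear relation implied by the figures quoted above
    (about 0.73 C per factor-of-ten increase in population). Oke's published
    regressions differ by region, so treat this as illustrative only.
    """
    return 0.73 * math.log10(population)

for pop in (10, 100, 1_000, 1_000_000):
    print(f"population {pop:>9,}: warm bias ~{uhi_warm_bias_c(pop):.2f} C")
# population        10: warm bias ~0.73 C
# population       100: warm bias ~1.46 C
# population     1,000: warm bias ~2.19 C
# population 1,000,000: warm bias ~4.38 C
```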
Zhou et al. (2005) have shown that global databases (for China) are not properly adjusted for urbanization. Block (2004) showed the same problem exists in central Europe. Hinkel et al. (2003) showed that even the village of Barrow, Alaska, with a population of 4,600, has warmed 3.4°F in winter relative to surrounding rural areas. These are but a handful of the dozens of studies documenting UHI contamination.
Most confirm the warming is predominantly at night. During the day, when the atmosphere is well mixed, the urban and rural areas are much the same. This analysis in Critchfield (1983) for urban Vienna and suburban Hohe Warte shows the temperature traces for February and July.
Tom Karl, whose 1988 paper defined the UHI adjustment for the first version of USHCN (an adjustment removed in version 2), wrote with Kukla and Gavin in a 1986 paper on urban warming:
“Secular trends of surface air temperature computed predominantly from urban station data are likely to have a serious warm bias… The average difference between trends (urban siting vs. rural) amounts to an annual warming rate of 0.34°C/decade… The reason why the warming rate is considerably higher [may be] that the rate may have increased after the 1950s, commensurate with the large recent growth in and around airports…. Our results and those of others show that the urban growth inhomogeneity is serious and must be taken into account when assessing the reliability of temperature records.”
Inexplicably, the UHI adjustment Karl argued for was removed in USHCNv2.
Doug Hoyt, once chief scientist at Raytheon, wrote: “It is not out of the realm of possibility that most of the twentieth century warming was urban heat islands.”
The contamination continues to show up in the data. Nighttime temperatures over the last 17 years (NASA AIRS) have warmed in the United States, while daytime changes, the best measure of any warming, have been very small.
As an example of before and after, the average annual temperatures for the state of Maine downloaded in 2010 before the change (-0.01°F/decade) and after the change in 2012 (+0.23°F/decade) say it all. We could provide literally hundreds of other examples; bloggers in many other countries have shown startling examples of fraud.
Infilling is needed when a station is missing data for a month or months. It is accomplished using anomalies. For areas with adequate nearby stations, the assumption that, despite local temperature differences, most sites will have a similar anomaly (departure from normal) is a reasonable one. But for infilling, data can be drawn from stations as far as 1,200 km (750 miles) away. At longer ranges this becomes problematic; take, for example, northern Canada or the Arctic, where they must extrapolate over vast distances.
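A minimal sketch of the anomaly-based infilling idea: a missing monthly anomaly is estimated from the distance-weighted anomalies of neighboring stations, with the search radius capped at 1,200 km. The inverse-distance weighting scheme and the example coordinates are assumptions for illustration, not the exact GHCN procedure.

```python
import math

def distance_km(lat1, lon1, lat2, lon2):
    """Great-circle distance (haversine), in km."""
    r = 6371.0
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dp = math.radians(lat2 - lat1)
    dl = math.radians(lon2 - lon1)
    a = math.sin(dp / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2
    return 2 * r * math.asin(math.sqrt(a))

def infill_anomaly(target, neighbors, max_km=1200.0):
    """Estimate the target station's missing monthly anomaly (deg C) as an
    inverse-distance-weighted mean of neighbor anomalies.
    Returns None if no neighbor with data lies within max_km."""
    weights, total = 0.0, 0.0
    for n in neighbors:
        d = distance_km(target["lat"], target["lon"], n["lat"], n["lon"])
        if d <= max_km and n.get("anomaly") is not None:
            w = 1.0 / max(d, 1.0)   # crude inverse-distance weight
            weights += w
            total += w * n["anomaly"]
    return total / weights if weights else None

# Hypothetical Arctic example: the farther the neighbors, the shakier the estimate.
target = {"lat": 68.0, "lon": -110.0}
neighbors = [
    {"lat": 64.0, "lon": -111.0, "anomaly": 1.2},
    {"lat": 70.0, "lon": -95.0,  "anomaly": 0.4},
]
print(infill_anomaly(target, neighbors))
```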
HOMOGENIZATION
This adjustment, which blends data from all stations, was designed to detect previously undisclosed inhomogeneities (station moves or siting changes) and to adjust for urbanization. It may help detect siting discontinuities, but it is not an adequate substitute for a UHI adjustment. Properly sited rural stations and the Climate Reference Network stations should be the reference used to adjust the urban stations.
Instead, through homogenization, the rural stations are contaminated by the urban stations. Dr. Edward Long, formerly of NASA, examined a set of rural and urban stations in the lower 48 states, both raw and adjusted. After adjustment, the rural warming rates increased five-fold while the urban warming rates were only slightly reduced. This augmented, rather than eliminated, the UHI contamination.
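Below is a minimal sketch of the alternative argued for here: anchoring an urban record to the mean of nearby rural (or CRN) reference series by removing the urban series' relative drift, rather than blending all stations together. The series, the simple linear-drift model, and the function name are illustrative assumptions, not NOAA's pairwise homogenization algorithm.

```python
def adjust_urban_to_rural(urban, rural_refs):
    """Remove the urban record's drift relative to the mean of rural reference
    anomalies (all series are same-length annual anomalies, deg C).
    Purely illustrative; real homogenization is far more involved."""
    n = len(urban)
    rural_mean = [sum(r[i] for r in rural_refs) / len(rural_refs) for i in range(n)]
    # Drift = linear trend of (urban - rural mean); subtract it from the urban series.
    diff = [u - r for u, r in zip(urban, rural_mean)]
    t = list(range(n))
    t_bar, d_bar = sum(t) / n, sum(diff) / n
    slope = (sum((ti - t_bar) * (di - d_bar) for ti, di in zip(t, diff))
             / sum((ti - t_bar) ** 2 for ti in t))
    return [u - slope * (ti - t_bar) for u, ti in zip(urban, t)]

# Hypothetical 5-year example: the urban site warms 0.1 C/yr faster than its rural neighbors;
# after adjustment the urban series carries the rural trend instead of its own.
urban = [0.0, 0.2, 0.4, 0.6, 0.8]
rural = [[0.0, 0.1, 0.2, 0.3, 0.4], [0.1, 0.2, 0.3, 0.4, 0.5]]
print(adjust_urban_to_rural(urban, rural))
```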
The other data set that presents a challenge for a precise assessment of global average surface temperature (GAST) is the world’s oceans, which cover 71% of the globe.
Major questions persist about how much and when to adjust for changing coverage and measurement techniques from buckets to ship intake, to moored and drifting buoys, satellite skin temperature measurements and now ARGO diving buoys.
The ARGO network of 3,341 diving buoys and floats, introduced in 2003 (now about 4,000), was designed to improve the assessment going forward.
But despite the fact that this technology was designed specifically for the purpose, the early ARGO buoys disappointed by showing no confirmation of an upward trend, so the data from the buoys were “adjusted.” John Bates, data quality officer with NOAA, admitted: “They had good data from buoys... and ‘corrected’ it by using the bad data from ships. You never change good data to agree with bad, but that’s what they did - so as to make it look as if the sea was warmer.”
That was just the latest example of data manipulation. Initially, this global data had a cyclical pattern similar to previously reported Northern Hemisphere data (high in the 1930s and 40s, low in the 70s). Then, as time progressed, the previous officially reported GAST data history was modified, removing the cycle and creating a more and more strongly upward sloping linear trend in each freshly reported historical data set. Peer reviewed, published and readily reproducible research has shown that: “The conclusive findings were that the three GAST data sets are not a valid representation of reality.”
In fact, the magnitude of their historical data adjustments, which removed the cyclical temperature patterns, is completely inconsistent with published and credible U.S. and other temperature data. Thus, despite current assertions of record-setting warming, it is impossible to conclude from the NOAA and NASA data sets that recent years have been the “warmest ever.”
Note that 2012 ranked as the 5th coldest in Fairbanks and the coldest in McGrath. There was very cold water off the coast then, and modern-day record Bering Sea ice resulted.
Early departure of the Bering Sea ice in 2019, due to strong North Pacific storms, led to early ocean warming and record early-summer high temperatures.
When NCEI has the monthly data, we will repost with a graph of Januarys statewide.
Often a cold Alaska retards cold in the lower 48.
See the earlier story, when heat followed early Bering Sea ice loss in last year’s warmer winter, here.
Meanwhile in Saudi Arabia, record cold and snow.
And in the Kazakh capital, snow brought deep cold. The cold developed over the deep snow cover; the +NAO drove Atlantic flow into Europe and Russia, deflecting the cold air south into Kazakhstan and the Middle East.
With the cold continuing in the arctic and Alaska, ice has increased for the second year in a row.
Puget Sound Islanders Look for Hope, Climate Solutions
Islanders Look for Hope, Climate Solutions, in the January 22 Vashon-Maury Island Beachcomber link
RESPONSES:
Washington’s Vashon Island residents may be concerned when they hear reports that the world isn’t doing enough on climate change, but they would be relieved to learn that NOAA’s National Climatic Data Center Climate at a Glance website reports official Washington climate data indicating that:
Meteorological winter (December - February) temperatures in Washington’s Puget Sound Lowlands Climate Division have officially trended downward at a rate of 0.2 degrees F per decade during the last 30 winters from 1990 to 2019, even as Washington’s atmospheric CO2 concentrations have continued to increase.
Meteorological winter (December - February) temperatures in Washington’s Puget Sound Lowlands Climate Division have officially trended downward at a rate of 0.6 degrees F per decade during the last 20 winters from 2000 to 2019, even as Washington’s atmospheric CO2 concentrations have continued to increase.
Meteorological winter (December - February) temperatures in Washington’s Puget Sound Lowlands Climate Division have officially trended downward at a rate of 1.5 degrees F per decade during the last 10 winters from 2010 to 2019, even as Washington’s atmospheric CO2 concentrations have continued to increase.
Meteorological winter (December - February) temperatures in Washington’s Puget Sound Lowlands Climate Division have officially trended downward at a rate of 13.2 degrees F per decade during the last 5 winters from 2015 to 2019, even as Washington’s atmospheric CO2 concentrations have continued to increase!
Meteorological winter (December - February) temperatures in Washington’s Puget Sound Lowlands Climate Division have officially trended downward at ever-increasing downward rates during the last 30 winters from 1990 to 2019, even as Washington’s atmospheric CO2 concentrations have continued to increase.
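The decadal rates quoted above are straightforward to reproduce: fit a least-squares line to the December-February mean temperatures over the chosen window and multiply the slope by ten. A minimal sketch with made-up winter means (the real values come from NOAA's Climate at a Glance divisional data); note how a short window lets one or two cold winters dominate the slope:

```python
def trend_per_decade(years, temps_f):
    """Ordinary least-squares slope of temperature vs. year, in deg F per decade."""
    n = len(years)
    y_bar = sum(years) / n
    t_bar = sum(temps_f) / n
    num = sum((y - y_bar) * (t - t_bar) for y, t in zip(years, temps_f))
    den = sum((y - y_bar) ** 2 for y in years)
    return 10.0 * num / den

# Hypothetical Dec-Feb mean temperatures (deg F) for the last 5 winters;
# the shorter the window, the more a single cold or warm winter dominates the slope.
years = [2015, 2016, 2017, 2018, 2019]
winters = [42.1, 41.0, 38.9, 40.2, 37.5]
print(f"{trend_per_decade(years, winters):+.1f} F/decade")
```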
Dr. Gordon Fulks replied to the newspaper:
The best way to deal with climate hysteria is to learn something about our climate. As soon as you do, you realize that it is far more complex than those in the media maintain. It is certainly not a one parameter (carbon dioxide fits all) situation.
In fact carbon dioxide appears to have its greatest negative effect when the level drops below 200 ppmv, and plant life starts to die. That turns high deserts, like the Gobi in China, into wastelands with vast dust storms that sweep the planet for tens of thousands of years. That turns continental glaciers that cover large portions of the Northern Hemisphere darker, such that they begin to melt. During a Milankovitch Cycle ‘Great Summer’ lasting 5,000 years, the Earth is then able to break free of a 100,000 year Ice Age and bring us into an Interglacial Period like the present Holocene Climate Optimum. Those generally last about 10,000 years, before another ‘Great Winter’ drags us into the next Ice Age. When that happens, Vashon Island will be again covered by an ice sheet thousands of feet thick!
But recent legitimate science (not the Global Warming Bad Science) suggests that we may have 50,000 years before the next Ice Age, due to both decreasing Obliquity (decreasing tilt of the Earth’s axis) and decreasing orbital Eccentricity. The giant planet Jupiter, more than two and a half times as massive as all of the other planets in the solar system combined, causes these Milankovitch Cycles.
As to Greenland melting and other climate hysteria, let me point out that Greenland has always lost ice into the North Atlantic as RMS Titanic discovered a hundred years ago. And Greenland just set an all-time record low of -87 F at the National Science Foundation site of Summit Camp.
If you want more authoritative information on our climate, you should visit such sites as wuwt, icecap.us, and co2coalition. I am one of the unpaid Directors of the CO2 Coalition, along with Princeton Professor of Physics Will Happer, Greenpeace co-founder Dr. Patrick Moore, and Dr. Harrison Schmitt, the only scientist to have walked on the moon.
MUST READ BACKGROUND ON THE PLANS AFOOT here written by Zuzana Janosova Den Boer, who experienced Communist rule in Czechoslovakia before coming to Canada.
And this post from Greenpeace co-founder, Dr. Patrick Moore.
--------------------------
The US Chamber of Commerce projects $130-a-barrel oil prices from a fracking ban and implementation of any of the Democrat plans they claim will put an end to heat waves, floods, droughts, hurricanes, tornadoes, wildfires and sea level rise.
The U.S. Chamber of Commerce warned about the implications of banning fracking ahead of Thursday night’s Democratic debate.
If Bernie Sanders or Elizabeth Warren fulfill their pledges to ban fracking upon becoming president in 2021, it would cause natural gas prices to rise by 324%, causing household energy bills to “more than quadruple,” the Chamber projected in a new report.
By 2025, drivers would pay twice as much at the pump for gasoline as oil prices spike to $130 per barrel.
A fracking ban would eliminate 19 million jobs and reduce GDP by $7.1 trillion by 2025. Most of the job losses would occur in Texas, home to the oil-and-gas rich Permian Basin, where more than 3 million jobs would be affected.
Oil and gas production is also a significant contributor to federal, state and local revenue, the Chamber notes. Tax revenue at the local, state, and federal levels would decline by nearly a combined $1.9 trillion if a Democrat bans fracking.
Not great for emissions either: Banning fracking would have a questionable impact on emissions.
In the near term, coal use might increase to offset the loss of electricity from natural gas plants. That could increase emissions overall, even if fracking limits lowered emissions of methane and raised the price of oil.
It’s also challenging to replace gas use from buildings and in the manufacturing sector immediately, so that would likely require importing more fossil fuels.
Every country that has moved down an extreme green path has seen skyrocketing energy costs, roughly three times our levels.
Renewables are unreliable: the wind doesn’t always blow, nor does the sun always shine. And don’t believe the claims that millions of green jobs would result. In Spain, every green job created cost $774,000 in subsidies and resulted in a loss of 2.2 real jobs, and only 1 in 10 green jobs was permanent. Industry left, and unemployment in Spain rose to 27.5%.
Tom Steyer is a hypocrite, having made his billions from trading coal. He pushes the globalist agenda while presenting himself as a simple family man who, with his wife, started a fund to help people and has come to believe climate change is the greatest threat the world faces. He says his green plan would create millions of great jobs; that has not worked out anywhere it has been tried, and the poor have suffered the most.
Many households in the countries that have gone green are said to be in “energy poverty” (25% in the UK, 15% in Germany). The elderly in winter are said to be forced to “choose between heating and eating.” Extreme cold already kills 20 times more people than heat, according to a study of 74 million deaths in 13 countries.
Politicians in the northeast states are bragging that they stopped the natural gas pipeline, shut down nuclear and coal plants, and blocked the Northern Pass, which would have delivered low-cost hydropower from Canada. In Concord, they are now scurrying to explain why electricity prices here are 50 to 60% higher than the national average, and speculating that they have not moved fast enough with wind and solar. Several states have even established zero-carbon-emissions mandates. This will lead to soaring energy prices and life-threatening blackouts. For a family of four in a modest house with three cars, energy costs could increase by over $10,000 per year. And, as in Europe where this plan was enacted, many will lose their jobs.
Prosperity always delivers a better environment than poverty.
“If you don’t know where you are going, you might end up somewhere else” Yogi Berra
------------
Dems seek to squash suburban, single-family house zoning, calling it racist, bad for environment
Virginia House Del. Ibraheem Samirah introduced a bill that would override local zoning officials to permit multi-family housing in every neighborhood, changing the character of quiet suburbs. Oregon passed a similar bill, following moves by cities such as Minneapolis; Austin, Texas; and Seattle. Proponents say urban lifestyles are better for the environment and that suburbs are bastions of racial segregation.
Democrats in Virginia may override local zoning to bring high-density housing, including public housing, to every neighborhood statewide - whether residents want it or not.
The measure could quickly transform the suburban lifestyle enjoyed by millions, permitting duplexes to be built on suburban lots in neighborhoods previously consisting of quiet streets and open green spaces. Proponents of “Upzoning” say the changes are necessary because suburbs are bastions of segregation and elitism, as well as bad for the environment.
The move, which aims to provide “affordable housing,” might be fiercely opposed by local officials throughout the state, who have deliberately created and preserved neighborhoods with particular character - some dense and walkable, others semi-rural and private - to accommodate people’s various preferences.
But Democrats tout a state-level law’s ability to replace “not in my backyard” with “yes, in your backyard.”
House Delegate Ibraheem Samirah, a Democrat, introduced six housing measures Dec. 19, coinciding with Democrats’ takeover of the state legislature in November.
“Single-family housing zones would become two-zoned,” Samirah told the Daily Caller News Foundation. “Areas that would be impacted most would be the suburbs that have not done their part in helping out.”
“The real issues are the areas in between very dense areas which are single-family zoned. Those are the areas that the state is having significant trouble dealing with. They’re living in a bubble,” he said.
He said suburbs were “mostly white and wealthy” and that their local officials - who have historically been in charge of zoning - were ignoring the desires of poor people, who did not have time to lobby them to increase suburban density.
In response to a question about whether people who bought homes in spacious suburbs have valid reasons, not based on discrimination, for preferring to live that way - including a love of nature and a desire to preserve woods and streams - he said: “Caring about nature is very important, but the more dense a neighborhood is, the more energy efficient it is.”
He said that if local officials seek to change requirements like setbacks to make it impossible to build dense housing in areas zoned to preserve a natural feel, “if they make setbacks to block duplexes, there’d have to be a lawsuit to resolve whether those zoning provisions were necessary.”
He wrote on Facebook, “Because middle housing is what’s most affordable for low-income people and people of color, banning that housing in well-off neighborhoods chalks up to modern-day redlining, locking folks out of areas with better access to schools, jobs, transit, and other services and amenities.”
“I will certainly get pushback for this. Some will call it ‘state overreach.’ Some will express anxiety about neighborhood change. Some may even say that the supply issue doesn’t exist. But the research is clear: zoning is a barrier to more housing and integrated communities,” he continued.
He tweeted Sunday that that would include public housing. “Important Q about new social/public housing programs: where are we going to put the units? Under current zoning, new low-income housing is relegated to underinvested neighborhoods, concentrating poverty more. Ending exclusionary zoning has to be part of broader housing reform,” he said.
Tim Hannigan, chairman of the Fairfax County Republican Committee - in one of the areas Samirah represents - said that urban Democrats were waging war on the suburbs.
“… residential life, because of the urbanization that would develop,” he told the DCNF. “So much of the American dream is built upon this idea of finding a nice quiet place to raise your family, and that is under assault.”
“This is a power-grab to take away the ability of local communities to establish their own zoning practices… literally trying to change the character of our communities,” he said.
He said suburbs were not equipped to handle the increased traffic, and “inevitably it will just push people to places where they feel they’ll get away from that, they may move to West Virginia to get their little plot of land.”
Minneapolis became the first city to eliminate single-family zoning, in December 2018, after a push by progressive advocacy groups promoting “equity.” Austin, Texas, and Seattle soon followed suit.
But those cities were amending zoning codes that have always been the domain of local governments. Oregon passed state legislation blocking local governments’ single-family zoning in July, CityLab reported.
It quoted Alex Baca, a Washington, D.C., urbanist with the site Greater Greater Washington, saying that single-family zoning is a tool for wealthy whites to maintain segregated neighborhoods and that the abolition of low-density neighborhoods is necessary for equity.
CityLab acknowledged that “residents might reasonably desire to keep the neighborhoods they love the way they are,” but said that implementing the law at the state level makes sure that those concerns can be more easily ignored.
“By preempting the ability of local governments to set their own restrictive zoning policies, the state policy would circumnavigate the complaints of local NIMBY homeowners who want to block denser housing,” it wrote.
While he implied that suburbs are prejudiced, Samirah himself has a history of anti-Semitic comments, including saying that sending money to Israel is worse than funding the Ku Klux Klan.
“I am so sorry that my ill-chosen words added to the pain of the Jewish community, and I seek your understanding and compassion as I prove to you our common humanity,” he said in February.
He interrupted a speech in July by President Donald Trump in Jamestown, Virginia, and said, “You can’t send us back! Virginia is our home.”
His father is Jordanian refugee Sabri Samirah, who authorities banned from the U.S. for a decade after the Sept. 11, 2001 attacks, in part because of his membership in the Muslim Brotherhood, the Chicago Tribune reported in 2014.
-----------
This is just part of a master plan slowly being put in place across the world. Remember this recent story about Ireland.
The Global Wind Energy Council recently released its latest report, excitedly boasting that ‘the proliferation of wind energy into the global power market continues at a furious pace, after it was revealed that more than 54 gigawatts of clean renewable wind power was installed across the global market last year’.
You may have got the impression from announcements like that, and from the obligatory pictures of wind turbines in any BBC story or airport advert about energy, that wind power is making a big contribution to world energy today. You would be wrong. Its contribution is still, after decades - nay centuries - of development, trivial to the point of irrelevance.
Here’s a quiz; no conferring. To the nearest whole number, what percentage of the world’s energy consumption was supplied by wind power in 2014, the last year for which there are reliable figures? Was it 20 per cent, 10 per cent or 5 per cent? None of the above: it was 0 per cent. That is to say, to the nearest whole number, there is still no wind power on Earth.
Even put together, wind and photovoltaic solar are supplying less than 1 per cent of global energy demand. From the International Energy Agency’s 2016 Key Renewables Trends, we can see that wind provided 0.46 per cent of global energy consumption in 2014, and solar and tide combined provided 0.35 per cent. Remember this is total energy, not just electricity, which is less than a fifth of all final energy, the rest being the solid, gaseous, and liquid fuels that do the heavy lifting for heat, transport and industry.
Such numbers are not hard to find, but they don’t figure prominently in reports on energy derived from the unreliables lobby (solar and wind). Their trick is to hide behind the statement that close to 14 per cent of the world’s energy is renewable, with the implication that this is wind and solar. In fact the vast majority - three quarters - is biomass (mainly wood), and a very large part of that is ‘traditional biomass’ sticks and logs and dung burned by the poor in their homes to cook with. Those people need that energy, but they pay a big price in health problems caused by smoke inhalation.
Even in rich countries playing with subsidised wind and solar, a huge slug of their renewable energy comes from wood and hydro, the reliable renewables. Meanwhile, world energy demand has been growing at about 2 per cent a year for nearly 40 years. Between 2013 and 2014, again using International Energy Agency data, it grew by just under 2,000 terawatt-hours.
If wind turbines were to supply all of that growth, but no more, how many would need to be built each year? The answer is nearly 350,000, since a two-megawatt turbine can produce about 0.005 terawatt-hours per annum. That’s one-and-a-half times as many as have been built in the world since governments started pouring consumer funds into this so-called industry in the early 2000s.
At a density of, very roughly, 50 acres per megawatt, typical for wind farms, that many turbines would require a land area greater than the British Isles, including Ireland. Every year. If we kept this up for 50 years, we would have covered every square mile of a land area the size of Russia with wind farms. Remember, this would be just to fulfil the new demand for energy, not to displace the vast existing supply of energy from fossil fuels, which currently supply 80 per cent of global energy needs.
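A quick check of that arithmetic: the per-turbine output depends on the assumed capacity factor, so the required count lands in the few-hundred-thousand range rather than at one exact number (the 30% capacity factor below is an assumption, not a figure from the text):

```python
# Rough check of "how many 2 MW turbines to cover annual demand growth".
growth_twh_per_year = 2000.0   # approximate annual growth in world energy demand (TWh), per the text
nameplate_mw = 2.0             # typical onshore turbine cited in the text
capacity_factor = 0.30         # assumption; real-world fleet averages vary roughly 0.25-0.35
hours_per_year = 8760

output_twh = nameplate_mw * capacity_factor * hours_per_year / 1_000_000  # MWh -> TWh
turbines_needed = growth_twh_per_year / output_twh
print(f"~{output_twh:.4f} TWh per turbine per year -> ~{turbines_needed:,.0f} turbines per year")
# ~0.0053 TWh per turbine per year -> roughly 380,000 turbines per year,
# the same order of magnitude as the "nearly 350,000" in the text.
```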
Do not take refuge in the idea that wind turbines could become more efficient. There is a limit to how much energy you can extract from a moving fluid, the Betz limit, and wind turbines are already close to it. Their effectiveness (the load factor, to use the engineering term) is determined by the wind that is available, and that varies at its own sweet will from second to second, day to day, year to year.
As machines, wind turbines are pretty good already; the problem is the wind resource itself, and we cannot change that. It’s a fluctuating stream of low-density energy. Mankind stopped using it for mission-critical transport and mechanical power long ago, for sound reasons. It’s just not very good.
As for resource consumption and environmental impacts, the direct effects of wind turbines - killing birds and bats, sinking concrete foundations deep into wild lands - are bad enough. But out of sight and out of mind is the dirty pollution generated in Inner Mongolia by the mining of rare-earth metals for the magnets in the turbines. This generates toxic and radioactive waste on an epic scale, which is why the phrase ‘clean energy’ is such a sick joke and ministers should be ashamed every time it passes their lips.
It gets worse. Wind turbines, apart from the fibreglass blades, are made mostly of steel, with concrete bases. They need about 200 times as much material per unit of capacity as a modern combined cycle gas turbine. Steel is made with coal, not just to provide the heat for smelting ore, but to supply the carbon in the alloy. Cement is also often made using coal. The machinery of ‘clean’ renewables is the output of the fossil fuel economy, and largely the coal economy.
A two-megawatt wind turbine weighs about 250 tonnes, including the tower, nacelle, rotor and blades. Globally, it takes about half a tonne of coal to make a tonne of steel. Add another 25 tonnes of coal for making the cement and you’re talking 150 tonnes of coal per turbine. Now if we are to build 350,000 wind turbines a year (or a smaller number of bigger ones), just to keep up with increasing energy demand, that will require 50 million tonnes of coal a year. That’s about half the EU’s hard coal-mining output.
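The coal total follows from the same back-of-the-envelope numbers; only the annual turbine count is carried over from the earlier estimate, and, as in the text, the full 250-tonne mass is treated as steel:

```python
# Back-of-the-envelope coal requirement per turbine, following the text's figures.
turbine_mass_t = 250.0        # tonnes (tower, nacelle, rotor, blades), treated as steel here
coal_per_tonne_steel = 0.5    # tonnes of coal per tonne of steel
coal_for_cement_t = 25.0      # additional coal for the concrete base
turbines_per_year = 350_000   # from the demand-growth estimate above

coal_per_turbine = turbine_mass_t * coal_per_tonne_steel + coal_for_cement_t
total_coal_mt = coal_per_turbine * turbines_per_year / 1e6
print(f"{coal_per_turbine:.0f} t coal per turbine -> {total_coal_mt:.1f} Mt coal per year")
# 150 t coal per turbine -> 52.5 Mt coal per year, i.e. the text's ~50 million tonnes.
```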
Forgive me if you have heard this before, but I have a commercial interest in coal. Now it appears that the black stuff also gives me a commercial interest in ‘clean’, green wind power.
The point of running through these numbers is to demonstrate that it is utterly futile, on a priori grounds, even to think that wind power can make any significant contribution to world energy supply, let alone to emissions reductions, without ruining the planet. As the late David MacKay pointed out years back, the arithmetic is against such unreliable renewables.
The truth is, if you want to power civilisation with fewer greenhouse gas emissions, then you should focus on shifting power generation, heat and transport to natural gas, the economically recoverable reserves of which - thanks to horizontal drilling and hydraulic fracturing - are much more abundant than we dreamed they ever could be. It is also the lowest-emitting of the fossil fuels, so the emissions intensity of our wealth creation can actually fall while our wealth continues to increase. Good.
And let’s put some of that burgeoning wealth in nuclear, fission and fusion, so that it can take over from gas in the second half of this century. That is an engineerable, clean future. Everything else is a political displacement activity, one that is actually counterproductive as a climate policy and, worst of all, shamefully robs the poor to make the rich even richer.
Spectator.co.uk/podcast
Matt Ridley discusses wind power