What's New and Cool
Feb 20, 2020
A Climate Modeller Spills the Beans

Tony Thomas

Update: See also how a German professor says NASA Has Fiddled Climate Data On ‘Unbelievable’ Scale here.

-------------


There’s a top-level oceanographer and meteorologist who is prepared to cry “Nonsense!” on the “global warming crisis” evident to climate modellers but not in the real world. He’s as well qualified as, or better than, the modellers he criticises - the ones whose Year 2100 forebodings of 4 degC warming have set the world to spending $US1.5 trillion a year to combat CO2 emissions.

The iconoclast is Dr. Mototaka Nakamura. In June he put out a small book in Japanese on “the sorry state of climate science”. It’s titled Confessions of a climate scientist: the global warming hypothesis is an unproven hypothesis, and he is very much qualified to take a stand. From 1990 to 2014 he worked on cloud dynamics and forces mixing atmospheric and ocean flows on medium to planetary scales. His bases were MIT (for a Doctor of Science in meteorology), Georgia Institute of Technology, Goddard Space Flight Centre, Jet Propulsion Laboratory, Duke and Hawaii Universities and the Japan Agency for Marine-Earth Science and Technology. He’s published about 20 climate papers on fluid dynamics.

Today’s vast panoply of “global warming science” is like an upside down pyramid built on the work of a few score of serious climate modellers. They claim to have demonstrated human-derived CO2 emissions as the cause of recent global warming and project that warming forward. Every orthodox climate researcher takes such output from the modellers’ black boxes as a given.

A fine example is from the Australian Academy of Science’s explanatory booklet of 2015. It claims, absurdly, that the models’ outputs are “compelling evidence” for human-caused warming.[ii] Specifically, it refers to model runs with and without human emissions and finds the “with” variety better matches the 150-year temperature record (which itself is a highly dubious construct). Thus satisfied, the Academy then propagates to the public and politicians the models’ forecasts for disastrous warming this century.

Now for Dr Nakamura’s expert demolition of the modelling. There was no English edition of his book in June, and only a few bits were translated and circulated. But last week Dr Nakamura offered his own English version via a free Kindle edition. It’s not a translation but a fresh essay leading back to his original conclusions.

The temperature forecasting models trying to deal with the intractable complexities of the climate are no better than “toys” or “Mickey Mouse mockeries” of the real world, he says. This is not actually a radical idea. The IPCC in its third report (2001) conceded (emphasis added),

In climate research and modelling, we should recognize that we are dealing with a coupled non-linear chaotic system, and therefore that the long-term prediction of future climate states is not possible. (Chapter 14, Section 14.2.2.2)

Somehow that official warning was deep-sixed by the alarmists. Now Nakamura has found it again, further accusing the orthodox scientists of “data falsification” by adjusting previous temperature data to increase apparent warming: “The global surface mean temperature-change data no longer have any scientific value and are nothing except a propaganda tool to the public,” he writes.

The climate models are useful tools for academic studies, he says. However, “the models just become useless pieces of junk or worse (worse in a sense that they can produce gravely misleading output) when they are used for climate forecasting.” The reason:

These models completely lack some critically important climate processes and feedbacks, and represent some other critically important climate processes and feedbacks in grossly distorted manners to the extent that makes these models totally useless for any meaningful climate prediction.

I myself used to use climate simulation models for scientific studies, not for predictions, and learned about their problems and limitations in the process.

Nakamura and colleagues even tried to patch up some of the models’ crudeness:

...so I know the workings of these models very well… For better or worse I have more or less lost interest in the climate science and am not thrilled to spend so much of my time and energy in this kind of writing beyond the point that satisfies my own sense of obligation to the US and Japanese taxpayers who financially supported my higher education and spontaneous and free research activity. So please expect this to be the only writing of this sort coming from me.

I am confident that some honest and courageous, true climate scientists will continue to publicly point out the fraudulent claims made by the mainstream climate science community in English. I regret to say this but I am also confident that docile and/or incompetent Japanese climate researchers will remain silent until the ‘mainstream climate science community’ changes its tone, if ever.

He projects warming from CO2 doubling, “according to the true experts”, to be only 0.5 degC. He says he doesn’t dispute the possibility of either catastrophic warming or severe glaciation, since the climate system’s myriad non-linear processes swamp “the toys” used for climate predictions. Climate forecasting is simply impossible, if only because future changes in solar energy output are unknowable. As to the impacts of human-caused CO2, they can’t be judged “with the knowledge and technology we currently possess.”

Other gross model simplifications include:

# Ignorance about large- and small-scale ocean dynamics

# A complete lack of meaningful representations of aerosol changes that generate clouds

# Lack of understanding of drivers of ice-albedo (reflectivity) feedbacks: “Without a reasonably accurate representation, it is impossible to make any meaningful predictions of climate variations and changes in the middle and high latitudes and thus the entire planet.”

# Inability to deal with water vapor elements

# Arbitrary “tunings” (fudges) of key parameters that are not understood

Concerning CO2 changes he says,

I want to point out a simple fact that it is impossible to correctly predict even the sense or direction of a change of a system when the prediction tool lacks and/or grossly distorts important non-linear processes, feedbacks in particular, that are present in the actual system…

...The real or realistically-simulated climate system is far more complex than an absurdly simple system simulated by the toys that have been used for climate predictions to date, and will be insurmountably difficult for those naive climate researchers who have zero or very limited understanding of geophysical fluid dynamics. I understand geophysical fluid dynamics just a little, but enough to realize that the dynamics of the atmosphere and oceans are absolutely critical facets of the climate system if one hopes to ever make any meaningful prediction of climate variation.

Solar input, absurdly, is modelled as a “never changing quantity”. He says, “It has only been several decades since we acquired an ability to accurately monitor the incoming solar energy. In these several decades only, it has varied by one to two watts per square metre. Is it reasonable to assume that it will not vary any more than that in the next hundred years or longer for forecasting purposes? I would say, No.”
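As rough context for the size of that variation (my numbers, not Nakamura’s): a change in measured total solar irradiance at the top of the atmosphere translates into a globally averaged radiative forcing once it is spread over the sphere and reduced by reflected sunlight. A minimal sketch, assuming a planetary albedo of about 0.3:

```python
# Convert a change in total solar irradiance (TSI) into a globally
# averaged radiative forcing. Assumed value: planetary albedo ~0.3.

def tsi_to_forcing(delta_tsi_wm2, albedo=0.3):
    """Spread the TSI change over the sphere (factor 1/4) and
    remove the fraction reflected back to space (1 - albedo)."""
    return delta_tsi_wm2 * (1.0 - albedo) / 4.0

# The 1-2 W/m^2 swings observed in recent decades:
print(tsi_to_forcing(1.0))  # → 0.175 W/m^2
print(tsi_to_forcing(2.0))  # → 0.35 W/m^2
```

So even the variation already observed corresponds to a non-trivial forcing, which is why holding solar input constant is a real simplification.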

Good modelling of oceans is crucial, as the slow ocean currents transport vast amounts of heat around the globe, making the minor atmospheric heat-storage changes almost irrelevant. For example, the Gulf Stream has kept western Eurasia warm for centuries. On time scales of more than a few years, the ocean plays a far more important role in climate than atmospheric changes. “It is absolutely vital for any meaningful climate prediction to be made with a reasonably accurate representation of the state and actions of the oceans.” In real oceans, rather than modelled ones, just as in the atmosphere, the smaller-scale flows often tend to counteract the effects of the larger-scale flows. Nakamura spent hundreds of hours vainly trying to remedy the flaws he observed, concluding that the models “result in a grotesque distortion of the mixing and transport of momentum, heat and salt, thereby making the behaviour of the climate simulation models utterly unrealistic...”

Proper ocean modelling would require a tenfold improvement in spatial resolution and a vast increase in computing power, probably requiring quantum computers. If or when quantum computers can reproduce the small-scale interactions, the researchers will remain out of their depth because of their traditional simplifying of conditions.

Key model elements are replete with “tunings”, i.e. fudges. Nakamura explains how the trick works:

The models are ‘tuned’ by tinkering around with values of various parameters until the best compromise is obtained. I used to do it myself. It is a necessary and unavoidable procedure and not a problem so long as the user is aware of its ramifications and is honest about it. But it is a serious and fatal flaw if it is used for climate forecasting/prediction purposes.

One set of fudges involves clouds.

Ad hoc representation of clouds may be the greatest source of uncertainty in climate prediction. A profound fact is that only a very small change, so small that it cannot be measured accurately...in the global cloud characteristics can completely offset the warming effect of the doubled atmospheric CO2.

Two such characteristics are an increase in cloud area and a decrease in the average size of cloud particles.

Accurate simulation of clouds is simply impossible in climate models since it requires calculations of processes at scales smaller than 1mm. Instead, the modellers put in their own cloud parameters. Anyone studying real cloud formation and then the treatment in climate models would be “flabbergasted” by the perfunctory treatment of clouds in the models.

Nakamura describes as “moronic” the claims that “tuned” ocean models are good enough for climate predictions. That’s because, in tuning some parameters, other aspects of the model have to become extremely distorted. He says a large part of the forecast global warming is attributed to water vapor changes, not CO2 changes. “But the fact is this: all climate simulation models perform poorly in reproducing the atmospheric water vapor and its radiative forcing observed in the current climate… They have only a few parameters that can be used to ‘tune’ the performance of the models and (are) utterly unrealistic.” Positive water vapor feedbacks from CO2 increases are artificially enforced by the modellers. They neglect other, reverse feedbacks in the real world, and hence exaggerate forecast warming.

The supposed measuring of global average temperatures from 1890 has been based on thermometer readouts barely covering 5 percent of the globe until the satellite era began 40-50 years ago. “We do not know how global climate has changed in the past century, all we know is some limited regional climate changes, such as in Europe, North America and parts of Asia.” This makes meaningless the Paris targets of 1.5degC or 2degC above pre-industrial levels.


He is contemptuous of claims about models being “validated”, saying the modellers are merely “trying to construct narratives that justify the use of these models for climate predictions.” And he concludes,

The take-home message is (that) all climate simulation models, even those with the best parametric representation scheme for convective motions and clouds, suffer from a very large degree of arbitrariness in the representation of processes that determine the atmospheric water vapor and cloud fields. Since the climate models are tuned arbitrarily ...there is no reason to trust their predictions/forecasts.

With values of parameters that are supposed to represent many complex processes being held constant, many nonlinear processes in the real climate system are absent or grossly distorted in the models. It is a delusion to believe that simulation models that lack important nonlinear processes in the real climate system can predict (even) the sense or direction of the climate change correctly.

I was distracted from his message because the mix of Japanese and English scripts in the book kept crashing my Kindle software. Still, I persevered. I recommend you do too. There’s at least $US30 trillion ($US30,000,000,000,000) hanging on this bunfight.


Tony Thomas’s new book, The West: An insider’s tale - A romping reporter in Perth’s innocent ‘60s is available from Boffins Books, Perth, the Royal WA Historical Society (Nedlands) and online here

His roughly 20 published papers include (to give you the flavor):

# “Destabilisation of thermohaline circulation by atmospheric eddy transports”

# “Effects of the ice-albedo [reflectivity] and runoff feedbacks on the thermohaline circulation”

# “Diagnoses of an eddy-resolving Atlantic Ocean model simulation in the vicinity of the Gulf Stream”

# “A simulation study of the 2003 heat wave in Europe”

# “Impacts of SST [sea surface temperature] anomalies in the Agulhas Current System on the climate variations in the southern Africa and its vicinity.”

# “Greenland sea surface temperature changes and accompanying changes in the north hemispheric climate.”

[ii] “Climate models allow us to understand the causes of past climate changes, and to project climate change into the future. Together with physical principles and knowledge of past variations, models provide compelling evidence that recent changes are due to increased greenhouse gas concentrations in the atmosphere ... Using climate models, it is possible to separate the effects of the natural and human-induced influences on climate. Models can successfully reproduce the observed warming over the last 150 years when both natural and human influences are included, but not when natural influences act alone.” A footnote directs to a study by 15 modellers cited in the 2015 IPCC report.

Feb 17, 2020
Global Average Surface Temperature Measurement Uncertainties make NOAA/NASA Claims Absurd

Joseph D’Aleo, CCM


The most significant uncertainties that must be dealt with to properly analyze temperature trends are detailed below. We also touch on alarmist claims that are covered in greater detail in the sections of this document summarized in the January 21st blog post under Joe’s Blog.

-------------------

NOAA and NASA can be counted on, virtually every month or year’s end, to religiously and confidently proclaim that the latest global average surface temperature (GAST) is among the warmest on record. Back in the 1970s, when an assessment of a global temperature was first attempted, scientists recognized that even land-only surface temperature data posed a significant challenge, given that most of the reliable data was limited to populated areas of the U.S., Europe and eastern China, with spotty, often intermittent data from vast land areas elsewhere.

Temperatures over the oceans, which cover 71% of the globe, were measured erratically and with varying methods, mainly along Northern Hemisphere shipping routes. Despite these shortcomings, and the fact that absolutely no credible grid-level temperature data existed over the period from 1880 to 2000 for the Southern Hemisphere’s oceans (covering 80.9% of the Southern Hemisphere), NOAA and NASA began estimating and publishing global average surface temperature data in the early 1990s.

In this era of ever-improving technology and data systems, one would assume that measurements would be constantly improving. This is not the case with the global observing network. The world’s surface observing network had reached its golden era in the 1960s to 1980s, with more than 6,000 stations providing valuable climate information. 

DATA DROPOUT

The number of weather stations providing data to GHCN plunged in 1990 and again in 2005 (as stations in the oversampled lower 48 states were thinned out). The sample size has fallen by over 75% from its peak in the early 1970s, and is now smaller than at any time since 1919. The collapse in sample size has increased the relative fraction of data coming from airports to 49 percent (up from about 30 percent in the 1970s). It has also reduced the average latitude of source data and removed relatively more high-altitude monitoring sites (McKitrick 2010).
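The arithmetic behind that “over 75% from its peak” figure is simple. A sketch, using round numbers consistent with the text (roughly 6,000 stations at the golden-era peak; the current count here is a hypothetical placeholder, not an official inventory):

```python
# Illustrative arithmetic for the station-count decline.
peak_stations = 6000     # golden-era network, 1960s-1980s (from the text)
current_stations = 1400  # hypothetical post-thinning count

decline_pct = 100.0 * (1 - current_stations / peak_stations)
print(round(decline_pct, 1))  # → 76.7 (i.e. "over 75%")
```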


We could show many regional or country examples, but here is one: Canada. NOAA GHCN used only 35 of the 600 Canadian stations in 2009. Verity Jones plotted the full network of rural, semi-rural and urban stations for Canada and the northern United States, both in 1975 and again in 2009, marking with diamonds the stations used in the given year. Notice the good coverage in 1975 and the very poor coverage in 2009, with the stations used lying virtually all in the south and almost none in the higher-latitude Canadian regions and the Arctic.

Canadian stations used in annual analyses in 1975 and 2009 (source: Verity Jones from GHCN).

Just one thermometer remains in the database for Canada for everything north of the 65th parallel. That station is Eureka, which has been described as “The Garden Spot of the Arctic” thanks to flora and fauna more abundant around Eureka than anywhere else in the High Arctic. Winters are frigid but summers are slightly warmer than at other places in the Canadian Arctic.

Environment Canada reported in the National Post that there are 1,400 stations in Canada, with 100 north of the Arctic Circle, where GHCN includes just one.

MISSING MONTHLY DATA

After the 1980s, the network suffered not only a loss of stations but also an increase in missing monthly data. To fill in these large holes, data were extrapolated from stations progressively farther away.

Forty percent of GHCN v2 stations have at least one missing month; the figure reaches 90% in Africa and South America.

Analysis and graph: Verity Jones

BAD SITING

According to the World Meteorological Organization’s own criteria, followed by the NOAA’s National Weather Service, temperature sensors should be located on the instrument tower at 1.5 m (5 feet) above the surface of the ground. The tower should be on flat, horizontal ground surrounded by a clear surface, over grass or low vegetation kept less than 4 inches high. The tower should be at least 100 m (110 yards) from tall trees, or artificial heating or reflecting surfaces, such as buildings, concrete surfaces, and parking lots.
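Those criteria are mechanical enough to check in code. A toy compliance check, with thresholds taken from the paragraph above and station values invented for illustration:

```python
# Toy check against the WMO/NWS siting criteria quoted above:
# sensor at ~1.5 m, low vegetation under 4 inches, and at least
# 100 m from trees, buildings, or other heating/reflecting surfaces.

def meets_siting_criteria(sensor_height_m, vegetation_height_in,
                          dist_to_obstacle_m):
    return (abs(sensor_height_m - 1.5) < 0.1    # sensor at ~1.5 m
            and vegetation_height_in < 4         # grass kept under 4 in
            and dist_to_obstacle_m >= 100)       # clear of heat sources

print(meets_siting_criteria(1.5, 3, 120))  # compliant site → True
print(meets_siting_criteria(1.5, 3, 20))   # building 20 m away → False
```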

Very few stations meet these criteria. The modernization of weather stations in the United States replaced many human observers with instruments that initially had warm biases (HO-83), later had cold biases (MMTS), or were designed for aviation and not suitable for precise climate trend detection [the Automated Surface Observing System (ASOS) and the Automated Weather Observing System (AWOS)]. Note the specifications required an RMSE of 0.8F and a maximum error of 1.9F. ASOS was designed to supply key information for aviation such as ceiling, visibility, wind, and indications of thunder and icing. It was not designed for assessing climate.


Also, the new instrumentation was increasingly installed on unsuitable sites that did not meet the WMO’s criteria. During recent decades there has been a migration away from old instruments read by trained observers. These instruments were generally in shelters that were properly located over grassy surfaces and away from obstacles to ventilation and heat sources.

Today we have many more automated sensors (the MMTS) located on poles and cabled to an electronic display in the observer’s home or office, or at airports near the runway, where the primary mission is aviation safety.

The installers of the MMTS instruments were often equipped with nothing more than a shovel. They were on a tight schedule and had little budget. They often encountered paved driveways or roads between the old sites and the buildings, and were in many cases forced to settle for installing the instruments close to the buildings, violating the government specifications in this or other ways.

Pielke and Davey (2005) found that a majority of stations, including climate stations in eastern Colorado, did not meet WMO requirements for proper siting. They extensively documented poor siting and land-use change issues in numerous peer-reviewed papers, many summarized in the landmark paper “Unresolved issues with the assessment of multi-decadal global land surface temperature trends” (2007).

In a volunteer survey project, Anthony Watts and his more than 650 volunteers at www.surfacestations.org found that over 900 of the first 1,067 stations surveyed in the 1,221-station U.S. climate network did not come close to meeting the Climate Reference Network (CRN) criteria. 90% were sited in ways that result in errors exceeding 1C, according to the CRN handbook.

Only about 3% met the ideal specification for siting. They found stations located next to the exhaust fans of air conditioning units, surrounded by asphalt parking lots and roads, on blistering-hot rooftops, and near sidewalks and buildings that absorb and radiate heat. They found 68 stations located at wastewater treatment plants, where the process of waste digestion causes temperatures to be higher than in surrounding areas. In fact, they found that 90% of the stations fail to meet the National Weather Service’s own siting requirements that stations must be 30 m (about 100 feet) or more away from an artificial heating or reflecting source.

The average warm bias for inappropriately-sited stations exceeded 1C using the National Weather Service’s own criteria, with which the vast majority of stations did not comply.


In 2008, Joe D’Aleo asked NOAA’s Tom Karl about the problems with siting and about the plans for a higher-quality Climate Reference Network (CRN, at that time called NERON). Karl said he had presented a case for a more complete CRN network to NOAA, but NOAA said it was unnecessary because it had invested in the more accurate satellite monitoring. The Climate Reference Network was capped at 114 stations and did not provide meaningful trend assessment for about 10 years. Here is the latest monthly time series - now 15 years.


By the way, the monthly press releases never mention satellite measurements, although NOAA claimed that was the future of observations.

URBANIZATION/LAND USE CHANGES

The biggest obstacle to accurate measurement, though, is urbanization. Bad siting usually enhances the warming effect. Weather data from cities as collected by meteorological stations are indisputably contaminated by urban heat-island bias and land-use changes. This contamination has to be removed or adjusted for in order to accurately identify true background climatic changes or trends.

In cities, vertical walls, steel and concrete absorb the sun’s heat and are slow to cool at night. In surrounding suburban areas (often where airports are located), commercialization and increased population densities raise the temperatures at night relative to the surrounding rural areas. More and more of the world is urbanized (population increased from 1.5 billion in 1900 to over 7.1 billion today).


The EPA depicts the typical temperature distribution from city center to rural areas, similar to the observed minimum temperature analysis surrounding London in mid-May (about a 10F difference is shown).


Oke (1973) found that a village with a population of 10 has a warm bias of 0.73C, a village with 100 has a warm bias of 1.46C, a town with a population of 1,000 has a warm bias of 2.2C, and a large city with a million people has a warm bias of 4.4C.
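Oke’s numbers follow a simple logarithmic pattern: each factor of ten in population adds roughly 0.73C of warm bias. A sketch (the 0.73 coefficient here is inferred from the figures quoted above, not taken from the paper itself):

```python
import math

# Approximate urban heat-island warm bias as a function of population,
# fitted to the Oke (1973) figures quoted in the text.
def uhi_bias_c(population):
    """Warm bias in deg C: ~0.73C per factor of ten in population."""
    return 0.73 * math.log10(population)

for pop in (10, 100, 1000, 1_000_000):
    print(pop, round(uhi_bias_c(pop), 2))
# 10 → 0.73, 100 → 1.46, 1000 → 2.19, 1,000,000 → 4.38
```

The computed values match the quoted 0.73C, 1.46C, 2.2C and 4.4C to rounding.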

Zhou et al (2005) showed that global databases (for China) are not properly adjusted for urbanization. Block (2004) showed the same problem exists in central Europe. Hinkel et al (2003) showed that even the village of Barrow, Alaska, with a population of 4,600, has warmed 3.4F in winter relative to surrounding rural areas. These are but a handful of the dozens of studies documenting UHI contamination.

Most confirm the warming is predominantly at night. During the day, when the atmosphere is well mixed, the urban and rural areas are much the same. This analysis in Critchfield (1983) for urban Vienna and suburban Hohe Warte shows the temperature traces for February and July.


Tom Karl, whose 1988 paper defined the UHI adjustment for the first version of USHCN (an adjustment removed in version 2), wrote with Kukla and Gavin in a 1986 paper on urban warming:

“Secular trends of surface air temperature computed predominantly from urban station data are likely to have a serious warm bias… The average difference between trends (urban siting vs. rural) amounts to an annual warming rate of 0.34C/decade… The reason why the warming rate is considerably higher [may be] that the rate may have increased after the 1950s, commensurate with the large recent growth in and around airports.... Our results and those of others show that the urban growth inhomogeneity is serious and must be taken into account when assessing the reliability of temperature records.”

Inexplicably, the UHI adjustment Karl argued for was removed in USHCNv2.

This concerned some observers.


Doug Hoyt, once chief scientist at Raytheon, wrote: “It is not out of the realm of possibility that most of the twentieth century warming was urban heat islands.”

It continues to show up in the data. Nighttime temperatures over the last 17 years (NASA AIRS) have warmed in the United States, while daytime temperatures, the best measure of any warming, have changed very little.


As an example of before and after, the average annual temperatures for the state of Maine, downloaded in 2010 before the change (-0.01F/decade) and after the change in 2012 (+0.23F/decade), say it all. We could provide literally hundreds of other examples. Bloggers in many other countries have shown startling examples of fraud.


ADJUSTMENTS MADE

INFILLING

This is needed when a station is missing data for a month or months. It is accomplished using anomalies. For areas with adequate close-by surrounding stations, the assumption that, despite local temperature differences, most sites will have a similar anomaly (departure from normal) is a reasonable one. But for infilling they can go as far as 1,200 km (750 miles) away to find data. At longer ranges this becomes problematic - take, for example, northern Canada or the Arctic, where they must extrapolate over vast distances.
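The anomaly method itself is straightforward; the problem is the distance over which it is stretched. A minimal sketch with invented station values:

```python
# Anomaly-based infilling sketch: a missing monthly value at a target
# station is estimated by averaging nearby stations' anomalies and
# adding the target's own climatology. All numbers here are invented.

neighbor_normals = {"A": 10.0, "B": 12.5, "C": 8.0}  # climatology (deg C)
neighbor_obs     = {"A": 11.2, "B": 13.4, "C": 9.1}  # this month's readings

# Step 1: each neighbor's departure from its own normal
anomalies = [neighbor_obs[s] - neighbor_normals[s] for s in neighbor_normals]

# Step 2: assume the target shares the regional mean anomaly -
# reasonable for close neighbors, dubious at 1,200 km
mean_anomaly = sum(anomalies) / len(anomalies)

# Step 3: reconstruct the missing value from the target's normal
target_normal = 9.5
infilled = target_normal + mean_anomaly
print(round(mean_anomaly, 2), round(infilled, 2))  # → 1.07 10.57
```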

HOMOGENIZATION

This adjustment, which blends data for all stations, was designed to detect previously undisclosed inhomogeneities (station moves or siting changes) and to adjust for urbanization. It may help detect siting discontinuities, but it is not an adequate substitute for a UHI adjustment. Properly sited rural stations and the Climate Reference Network should be the reference used to adjust the urban stations.

Instead, through homogenization, the rural stations are contaminated by urban stations. Dr. Edward Long of NASA examined a set of rural and urban stations in the lower 48 states, both raw and adjusted. After adjustment, the rural warming rate increased fivefold while the urban warming rate was only slightly reduced. This augmented, not eliminated, the UHI contamination.


OCEAN DATA

The other data set that presents a challenge for a precise assessment of global average surface temperature (GAST) is the world’s oceans, which cover 71% of the globe.


Major questions persist about how much and when to adjust for changing coverage and measurement techniques from buckets to ship intake, to moored and drifting buoys, satellite skin temperature measurements and now ARGO diving buoys.

The ARGO network of 3,341 diving buoys and floats, introduced in 2003 (now about 4,000), was designed to improve the assessment going forward.


But despite the fact that this technology was designed specifically for the purpose, the early ARGO buoys disappointed by showing no confirmation of an upward trend. So the data from the buoys were “adjusted.” John Bates, data quality officer with NOAA, admitted: “They had good data from buoys... and ‘corrected’ it by using the bad data from ships. You never change good data to agree with bad, but that’s what they did - so as to make it look as if the sea was warmer.”


That was just the latest example of data manipulation. Initially, this global data had a cyclical pattern similar to previously reported Northern Hemisphere data (high in the 1930s and 40s, low in the 70s). Then, as time progressed, the previously reported official GAST data history was modified, removing the cycle and creating a more and more strongly upward-sloping linear trend in each freshly reported historical data set. Peer-reviewed, published and readily reproducible research has shown that: “The conclusive findings were that the three GAST data sets are not a valid representation of reality.”

In fact, the magnitude of their historical data adjustments, which removed the cyclical temperature patterns, is completely inconsistent with published and credible U.S. and other temperature data. Thus, despite current assertions of record-setting warming, it is impossible to conclude from the NOAA and NASA data sets that recent years have been the “warmest ever.”

For more see here and here.

See detailed Research Report here.

All our efforts are volunteer (pro-bono). Help us with a donation if you can (left column).

Jan 31, 2020
Alaska had a brutally cold January - in Fairbanks, it ranked as 15th coldest (records began in 1904)

Joseph D’Aleo, CCM

Alaska had a brutally cold January.


In Fairbanks, it ranked as the 15th coldest January (records began in 1904), with an average of -21.4F (13.4F below normal).


In McGrath, it was the 4th coldest; the coldest was January 2012.


The lowest temperature in January was -51F.

That was despite warm water in the Gulf of Alaska, which had cooled from its fall levels.


Note that 2012 ranked as 5th coldest in Fairbanks and coldest in McGrath. There was very cold water off the coast then, and modern-day record Bering Sea ice resulted.



Early departure of Bering Sea ice in 2019, due to strong North Pacific storms, led to early ocean warming and record early-summer high temperatures.

When NCEI has the monthly data, we will repost with a graph of statewide Januarys.

A cold Alaska often retards cold in the lower 48.

See the earlier story here, on how heat followed early Bering Sea ice loss in last year’s warmer winter.

Meanwhile in Saudi Arabia, record cold and snow.

And in the capital of Kazakhstan, snow brought deep cold. The cold developed over the deep snow cover; the +NAO directed Atlantic flow into Europe and Russia, deflecting the cold air south into Kazakhstan and the Middle East.


With the cold continuing in the arctic and Alaska, ice has increased for the second year in a row.


See how warm the Arctic was from the 1920s to the 1950s.

The IARC and UAF showed this relates to Atlantic Ocean temperatures (the AMO).


Jan 24, 2020
Puget Sound Islanders Look for Hope, Climate Solutions


“Islanders Look for Hope, Climate Solutions”, in the January 22 Vashon-Maury Island Beachcomber (link)

RESPONSES:

Washington’s Vashon Island residents may be concerned when they hear reports that the world isn’t doing enough on climate change, but they would be relieved to learn that the NOAA National Climatic Data Center Climate at a Glance website reports official Washington climate data indicating that:

Meteorological winter (December - February) temperatures in Washington’s Puget Sound Lowlands Climate Division have officially trended downward at a rate of 0.2 degrees F per decade during the last 30 winters from 1990 to 2019, even as Washington’s atmospheric CO2 concentrations have continued to increase.

Meteorological winter (December - February) temperatures in Washington’s Puget Sound Lowlands Climate Division have officially trended downward at a rate of 0.6 degrees F per decade during the last 20 winters from 2000 to 2019, even as Washington’s atmospheric CO2 concentrations have continued to increase.

Meteorological winter (December - February) temperatures in Washington’s Puget Sound Lowlands Climate Division have officially trended downward at a rate of 1.5 degrees F per decade during the last 10 winters from 2010 to 2019, even as Washington’s atmospheric CO2 concentrations have continued to increase.

Meteorological winter (December - February) temperatures in Washington’s Puget Sound Lowlands Climate Division have officially trended downward at a rate of 13.2 degrees F per decade during the last 5 winters from 2015 to 2019, even as Washington’s atmospheric CO2 concentrations have continued to increase!

Meteorological winter (December - February) temperatures in Washington’s Puget Sound Lowlands Climate Division have officially trended downward at ever-increasing downward rates during the last 30 winters from 1990 to 2019, even as Washington’s atmospheric CO2 concentrations have continued to increase.
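Trends like those quoted above are ordinary least-squares slopes fitted to the winter mean temperatures, scaled to degrees F per decade. As a minimal sketch of that arithmetic, the snippet below fits a line to (year, winter mean) pairs; the temperature values are hypothetical placeholders for illustration, not the actual Puget Sound Lowlands data from Climate at a Glance.

```python
def decadal_trend(years, temps):
    """Ordinary least-squares slope of temps vs. years,
    in degrees F per year, times 10 to give degrees F per decade."""
    n = len(years)
    mean_x = sum(years) / n
    mean_y = sum(temps) / n
    num = sum((x - mean_x) * (y - mean_y) for x, y in zip(years, temps))
    den = sum((x - mean_x) ** 2 for x in years)
    return 10.0 * num / den  # deg F per decade

# Hypothetical winter (Dec-Feb) means, for illustration only:
years = [2015, 2016, 2017, 2018, 2019]
temps = [42.1, 41.0, 38.9, 39.5, 37.2]
print(round(decadal_trend(years, temps), 1))  # negative value = cooling trend
```

Note that short windows (5 or 10 winters) produce large, unstable slopes, which is why the 5-winter figure above is so much bigger than the 30-winter one.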

Dr. Gordon Fulks replied to the newspaper:

The best way to deal with climate hysteria is to learn something about our climate.  As soon as you do, you realize that it is far more complex than those in the media maintain.  It is certainly not a one-parameter (carbon-dioxide-fits-all) situation.

In fact carbon dioxide appears to have its greatest negative effect when the level drops below 200 ppmv, and plant life starts to die.  That turns high deserts, like the Gobi in China, into wastelands with vast dust storms that sweep the planet for tens of thousands of years.  That turns continental glaciers that cover large portions of the Northern Hemisphere darker, such that they begin to melt.  During a Milankovitch Cycle ‘Great Summer’ lasting 5,000 years, the Earth is then able to break free of a 100,000 year Ice Age and bring us into an Interglacial Period like the present Holocene Climate Optimum.  Those generally last about 10,000 years, before another ‘Great Winter’ drags us into the next Ice Age.  When that happens, Vashon Island will be again covered by an ice sheet thousands of feet thick!

But recent legitimate science (not the Global Warming Bad Science) suggests that we may have 50,000 years before the next Ice Age, due to both decreasing Obliquity (decreasing tilt of the Earth’s axis) and decreasing orbital Eccentricity.  The giant planet Jupiter, more than two and a half times as massive as all of the other planets in the solar system combined, causes these Milankovitch Cycles.

As to Greenland melting and other climate hysteria, let me point out that Greenland has always lost ice into the North Atlantic as RMS Titanic discovered a hundred years ago.  And Greenland just set an all-time record low of -87 F at the National Science Foundation site of Summit Camp.

If you want more authoritative information on our climate, you should visit such sites as wuwt, icecap.us, and co2coalition.  I am one of the unpaid Directors of the CO2 Coalition, along with Princeton Professor of Physics Will Happer, Greenpeace co-founder Dr. Patrick Moore, and Dr. Harrison Schmitt, the only scientist to have walked on the moon.

Gordon J. Fulks, PhD (Physics)

Corbett, Oregon USA

------------

Benefits of CO2 - Freeman Dyson

And Patrick Moore

Dec 20, 2019
Democrat energy ‘plans’ will cause household energy bills to “more than quadruple”. More bad ideas!

Joseph D’Aleo, CCM

MUST READ BACKGROUND ON THE PLANS AFOOT here written by Zuzana Janosova Den Boer, who experienced Communist rule in Czechoslovakia before coming to Canada.

And this post from Greenpeace co-founder, Dr. Patrick Moore.

--------------------------

US Chamber of Commerce projects $130-a-barrel oil prices from a fracking ban and implementation of any of the Democrat plans they claim will put an end to heat waves, floods, droughts, hurricanes, tornadoes, wildfires, and sea level rise.

The U.S. Chamber of Commerce warned about the implications of banning fracking ahead of Thursday night’s Democratic debate.

If Bernie Sanders or Elizabeth Warren fulfills their pledge to ban fracking upon becoming president in 2021, it would cause natural gas prices to rise by 324%, causing household energy bills to “more than quadruple,” the Chamber projected in a new report.

By 2025, drivers would pay twice as much at the pump for gasoline as oil prices spike to $130 per barrel.

A fracking ban would eliminate 19 million jobs and reduce GDP by $7.1 trillion by 2025. Most of the job losses would occur in Texas, home to the oil-and-gas rich Permian Basin, where more than 3 million jobs would be affected.

Oil and gas production is also a significant contributor to federal, state and local revenue, the Chamber notes. Tax revenue at the local, state, and federal levels would decline by nearly a combined $1.9 trillion if a Democrat bans fracking.

Not great for emissions either: Banning fracking would have a questionable impact on emissions.

In the near term, coal use might increase to offset the loss of electricity from natural gas plants. That could increase emissions overall, even if fracking limits lowered emissions of methane and raised the price of oil.

It’s also challenging to replace gas use from buildings and in the manufacturing sector immediately, so that would likely require importing more fossil fuels.

The economies of the countries that have moved down an extreme green path have seen skyrocketing energy costs - three times our levels.

Renewables are unreliable, as the wind doesn’t always blow nor does the sun always shine. And don’t believe the claims that millions of green jobs would result. In Spain, every green job created cost $774,000 in subsidies and resulted in a loss of 2.2 real jobs; only 1 in 10 green jobs was permanent.  Industry left, and unemployment in Spain rose to 27.5%.

Tom Steyer is a hypocrite, having made his billions trading coal. He is pushing the globalist agenda while pretending he is just a simple family man who, with his wife, started a fund to help people and who has come to believe climate change is the greatest threat the world faces. He says his green plan would create millions of great jobs. That has not worked out anywhere it has been tried, and the poor have suffered the most.

image

Many households in the countries that have gone green are said to be in “energy poverty” (25% in the UK, 15% in Germany). The elderly are said to be forced in winter to “choose between heating and eating.” Extreme cold already kills 20 times more people than heat, according to a study of 74 million deaths in 13 countries.

Politicians in the northeast states are bragging that they stopped the natural gas pipeline, shut down nuclear and coal plants, and blocked the Northern Pass, which would have delivered low-cost hydropower from Canada. In Concord, they are now scurrying to explain why electricity prices are 50 to 60% higher than the national average here, and are speculating that they have not moved fast enough with wind and solar.  Several states have even established zero-carbon-emissions mandates. This will lead to soaring energy prices and life-threatening blackouts. For a family of four in a modest house with three cars, energy costs could increase over $10,000 per year. And, as in Europe where this plan was enacted, many will lose their jobs.

Prosperity always delivers a better environment than poverty.

“If you don’t know where you are going, you might end up somewhere else” Yogi Berra

------------

Dems seek to squash suburban, single-family house zoning, calling it racist, bad for environment

Virginia House Del. Ibraheem Samirah introduced a bill that would override local zoning officials to permit multi-family housing in every neighborhood, changing the character of quiet suburbs. Oregon passed a similar bill, following moves by cities such as Minneapolis; Austin, Texas; and Seattle. Proponents say urban lifestyles are better for the environment and that suburbs are bastions of racial segregation.

Democrats in Virginia may override local zoning to bring high-density housing, including public housing, to every neighborhood statewide - whether residents want it or not.

The measure could quickly transform the suburban lifestyle enjoyed by millions, permitting duplexes to be built on suburban lots in neighborhoods previously consisting of quiet streets and open green spaces. Proponents of “Upzoning” say the changes are necessary because suburbs are bastions of segregation and elitism, as well as bad for the environment.

The move, which aims to provide “affordable housing,” might be fiercely opposed by local officials throughout the state, who have deliberately created and preserved neighborhoods with particular character - some dense and walkable, others semi-rural and private - to accommodate people’s various preferences.

But Democrats tout a state-level law’s ability to replace “not in my backyard” with “yes, in your backyard.”

House Delegate Ibraheem Samirah, a Democrat, introduced six housing measures Dec. 19, coinciding with Democrats’ takeover of the state legislature in November.

“Single-family housing zones would become two-zoned,” Samirah told the Daily Caller News Foundation. “Areas that would be impacted most would be the suburbs that have not done their part in helping out.”

“The real issues are the areas in between very dense areas which are single-family zoned. Those are the areas that the state is having significant trouble dealing with. They’re living in a bubble,” he said.

He said suburbs were “mostly white and wealthy” and that their local officials - who have historically been in charge of zoning - were ignoring the desires of poor people, who did not have time to lobby them to increase suburban density.

In response to a question about whether people who bought homes in spacious suburbs have valid reasons, not based on discrimination, for preferring to live that way - including a love for nature and a desire to preserve woods and streams - he said: “Caring about nature is very important, but the more dense a neighborhood is, the more energy efficient it is.”

He said that if local officials seek to change requirements like setbacks to make dense housing impossible in areas zoned to preserve a natural feel - “if they make setbacks to block duplexes” - then “there’d have to be a lawsuit to resolve whether those zoning provisions were necessary.”

He wrote on Facebook, “Because middle housing is what’s most affordable for low-income people and people of color, banning that housing in well-off neighborhoods chalks up to modern-day redlining, locking folks out of areas with better access to schools, jobs, transit, and other services and amenities.”

“I will certainly get pushback for this. Some will call it ‘state overreach.’ Some will express anxiety about neighborhood change. Some may even say that the supply issue doesn’t exist. But the research is clear: zoning is a barrier to more housing and integrated communities,” he continued.

He tweeted Sunday that that would include public housing. “Important Q about new social/public housing programs: where are we going to put the units? Under current zoning, new low-income housing is relegated to underinvested neighborhoods, concentrating poverty more. Ending exclusionary zoning has to be part of broader housing reform,” he said.

Tim Hannigan, chairman of the Fairfax County Republican Committee - in one of the areas Samirah represents - said that urban Democrats were waging war on the suburbs.

“... residential life, because of the urbanization that would develop,” he told the DCNF. “So much of the American dream is built upon this idea of finding a nice quiet place to raise your family, and that is under assault.”

“This is a power-grab to take away the ability of local communities to establish their own zoning practices… literally trying to change the character of our communities,” he said.

He said suburbs were not equipped to handle the increased traffic, and “inevitably it will just push people to places where they feel they’ll get away from that, they may move to West Virginia to get their little plot of land.”

Minneapolis became the first city to eliminate single-family zoning in December 2018, after a push by progressive advocacy groups promoting “equity.” Austin, Texas, and Seattle soon followed suit.

But those cities were amending zoning codes that have always been the domain of local governments. Oregon passed state legislation blocking local governments’ single-family zoning in July, CityLab reported.

It quoted Alex Baca, a Washington, D.C., urbanist with the site Greater Greater Washington, saying that single-family zoning is a tool for wealthy whites to maintain segregated neighborhoods and that the abolition of low-density neighborhoods is necessary for equity.

CityLab acknowledged that “residents might reasonably desire to keep the neighborhoods they love the way they are,” but said that implementing the law at the state level makes sure that those concerns can be more easily ignored.

“By preempting the ability of local governments to set their own restrictive zoning policies, the state policy would circumnavigate the complaints of local NIMBY homeowners who want to block denser housing,” it wrote.

While he implied that suburbs are prejudiced, Samirah himself has a history of anti-Semitic comments, including saying that sending money to Israel is worse than funding the Ku Klux Klan.

“I am so sorry that my ill-chosen words added to the pain of the Jewish community, and I seek your understanding and compassion as I prove to you our common humanity,” he said in February.

He interrupted a speech in July by President Donald Trump in Jamestown, Virginia, and said, “You can’t send us back! Virginia is our home.”

His father is Jordanian refugee Sabri Samirah, who authorities banned from the U.S. for a decade after the Sept. 11, 2001 attacks, in part because of his membership in the Muslim Brotherhood, the Chicago Tribune reported in 2014.

-----------

This is just part of a master plan slowly being put in place across the world. Remember this recent story about Ireland.
