Climate Data, Part II
by CRAIG JAMES
If you didn’t see my article last week, it featured a chart from NASA’s Goddard Institute for Space Studies that supposedly shows about 0.8 degrees C (1.4 degrees F) of warming across the globe since 1880. I think this chart is pure fantasy. Here is more on why.
Last week I wrote about the United States Historical Climatology Network (USHCN) and the problems caused by poorly sited thermometers: on or near tarmac, next to buildings, on paved driveways and roads, in waste treatment plants, on rooftops, near air conditioner exhausts and more. There are also problems with the adjustments made for the urban heat island effect, with changes in thermometer location and with thermometer calibration. This article looks at the Global Historical Climatology Network (GHCN).
Back in the 1970s, there were about 6,000 climate-reporting stations in the GHCN, but that number had dropped to around 1,500 by 1990 and to a little over 1,000 today. That is the total number of land-based surface stations now used in calculating the global temperature. Temperature readings are still taken at most of the original stations, but for some reason they have been deleted from the database.
A computer expert named E. Michael Smith has done an exhaustive analysis of which stations have disappeared from the record and how the remaining data have been manipulated. You can read the details on his website at chiefio.wordpress.com/2009/07/30/agw-basics-of-whats-wrong/.
To summarize, it appears that stations placed in historically cooler, rural areas of higher latitude and elevation were deleted from the data series in favor of more urban locales at lower latitudes and elevations, which are of course warmer. Consequently, readings after 1990 have been biased to the warm side not only by selective geographic location, but also by the influence of the urban heat island effect.
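The arithmetic behind this claimed bias is easy to see with a toy sketch. The station names and temperatures below are invented for illustration, not real GHCN data; the point is simply that dropping cold stations from a simple average raises the average even when no station has actually warmed.

```python
# Toy illustration (made-up numbers, not real GHCN data): dropping cold,
# rural stations shifts a simple average upward with no actual warming.

# Hypothetical annual mean temperatures (deg C) for a mix of stations.
stations = {
    "rural_high_elevation_1": 4.0,
    "rural_high_elevation_2": 6.0,
    "rural_high_latitude": 2.0,
    "urban_coastal_1": 15.0,
    "urban_coastal_2": 16.0,
    "urban_coastal_3": 17.0,
}

full_mean = sum(stations.values()) / len(stations)

# Delete the cold, rural stations -- the pattern described above.
kept = {name: t for name, t in stations.items() if name.startswith("urban")}
kept_mean = sum(kept.values()) / len(kept)

print(f"mean with all stations: {full_mean:.1f} C")  # 10.0 C
print(f"mean after deletions:   {kept_mean:.1f} C")  # 16.0 C
```

With these invented numbers, the average jumps 6 degrees purely from the change in which stations are counted.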
For example, guess how many climate stations are now in the GHCN database for California? Just four! They are San Francisco, Santa Maria, Los Angeles, and San Diego. These are all coastal, urban and low-elevation stations. All of the high-elevation, rural and cold thermometers have been eliminated.
In Canada, the number of reporting stations dropped from 496 in 1989 to just 44 in 1991 with only one—that’s right, just one—station north of 65 degrees latitude. Russia has all of three stations left in the database north of 65 degrees. Who needs real data from the Arctic when you can just estimate it?
In both Canada and the United States, almost all cold, high-elevation stations have been dropped. In Japan, there is only one reporting station above 1,000 feet. Bolivia has no reporting stations at all; the entire country has had no temperature data included in the GHCN database since 1990. The examples are too numerous to list in full.
The systematic removal of cold reporting stations from the recent record, while those same stations remain in the older records, makes the present look warmer than it is and the past look colder. If you want to show warming, that’s certainly a way to do it, although it is in reality nothing less than fraudulent.
Finally, as E. Michael Smith has written, “Once the data are collected, they are subject to strange and wondrous changes and manipulations. The exact methods are more or less secret. The changes are conducted by people who often have their entire self-worth and career vested in ‘global warming.’ The results often seem disjointed from observed reality. Where there are details on the adjustment available, they can often be shown to be bogus.”
Compare the two charts shown here that depict the adjustment made to the raw temperature data at Santa Rosa, Calif. You can clearly see how the lowering of temperatures in the first part of the temperature record has resulted in what appears to be a warming trend, when the raw data showed an actual cooling trend.
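The mechanism described above, lowering the early part of a record, can flip the sign of a fitted trend. Here is a toy sketch with invented numbers (not the actual Santa Rosa data): the raw series cools slightly, then a hypothetical adjustment subtracts progressively more from the older readings, and a least-squares fit of the adjusted series shows warming instead.

```python
# Toy sketch (invented numbers, not the actual Santa Rosa record): lowering
# early-period values turns a cooling raw trend into an apparent warming one.

years = list(range(1900, 2000, 10))
raw = [15.0, 14.9, 14.8, 14.7, 14.6, 14.5, 14.4, 14.3, 14.2, 14.1]  # cooling

# Hypothetical adjustment: subtract 0.2 C more for each decade further back.
adjusted = [t - 0.2 * (len(raw) - 1 - i) for i, t in enumerate(raw)]

def slope(xs, ys):
    """Ordinary least-squares slope of ys against xs (deg C per year)."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    num = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    den = sum((x - mx) ** 2 for x in xs)
    return num / den

print(f"raw trend:      {slope(years, raw) * 100:+.2f} C/century")       # -1.00
print(f"adjusted trend: {slope(years, adjusted) * 100:+.2f} C/century")  # +1.00
```

Nothing about the underlying measurements changed; only the adjustment applied to the early years did, and the fitted trend reversed sign.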
There are literally hundreds of similar adjustments in both the U.S. and global temperature databases. The data has been so massaged, modified, fudged, factored, tweaked and transmogrified as to no longer represent anything that might be logically referred to as the “instrumental” record. And this is the data used to initialize the computer models. It certainly seems to be a case of “garbage in, garbage out.”
Craig James has been retired since July 1, 2008, after 40 years of broadcasting television weather. He was chief meteorologist at WZZM-TV for 12 years and chief meteorologist at WOOD-TV for 24 years. He is a graduate of Penn State University, where he received a Centennial Fellowship Award. He was also honored as a Fellow of the American Meteorological Society.