Another Wrong Forecast
Over the past couple of months I’ve written several articles explaining why I do not believe the theory that an increase of CO2 in the atmosphere will cause catastrophic warming. Up until now, at least, there has been no measurable evidence to support such a theory. The theory comes from computer models. But those computer models have been fine-tuned to bad data and programmed with false assumptions, and as a result they have produced incorrect forecasts.
In an article two weeks ago I wrote that, “One of the main tenets of global warming theory is that if greenhouse gases are warming the planet, that warming will happen first in the layer of air 20,000-40,000 ft above the tropics. All 20-odd climate models predict warming there first—it’s the fingerprint of greenhouse gas warming, as opposed to warming by some other cause. The hotspot is not incidental to IPCC climate theory—it lies at its heart…” The evidence shows that this hotspot is missing, indicating that the global warming theory is wrong.
There is another way to look at the computer model forecasts as shown by the following graph. This graph may look a little intimidating at first but let me walk you through it.
The light grey line represents the actual temperatures as measured by satellite. The data is from Remote Sensing Systems of Santa Rosa, California and is very similar to the satellite temperature data from the University of Alabama in Huntsville. You can see the cooling that occurred in 1992 from the eruption of Mt. Pinatubo and the large spike up in temperatures in 1998 from the very strong El Niño.
The bold black line represents the computer model forecasts of the temperatures out beyond the year 2020. Two lighter black lines represent the outer limits of the uncertainty in the forecast. In other words, the forecast temperature could fall anywhere within the outer two lines but is most likely near the bold black line. The solid red line is the actual trend in the temperatures and the dashed red lines indicate the limits of the uncertainty in the measured temperatures.
The graph is anchored at 1996. When the models are run backwards from that point, they do a fairly good job of reproducing the temperatures back to 1980, even showing the volcanic cooling, because the eruption was programmed into the models. But when the models are run into the future from 1996, they produce too much warming. The actual measured temperature trend, shown by the solid red line, is well below the forecast trend shown by the solid black line. In fact, even the entire uncertainty range of the measured trend lies below the uncertainty range of the forecast. The models are wrong. How many more years of cooler-than-forecast temperatures will be needed before the alarmists acknowledge this fact?
The computer models also forecast sea surface temperatures as shown in this graphic.
The observed sea surface temperatures are plotted in blue, with the computer simulation in red. You can see that the trend line of the model forecast, in red, is warmer than the actual measured sea surface temperatures. The models are once again forecasting too much warming.
A new paper just published in the Journal of Forecasting concludes, “The scientific community of global climate modelers has surely taken unnecessary risks in raising the stakes so high when depending on forecasts and models that have many weaknesses.” In addition, “the primary emphasis on controlling global CO2 emissions is misguided.”
Misguided indeed. We are spending billions of dollars we don’t have each year on a problem that only exists in flawed computer models.
Craig James has been retired since July 1, 2008, after 40 years of broadcasting television weather. He was chief meteorologist at WZZM-TV for 12 years and chief meteorologist at WOOD-TV for 24 years. He is a graduate of Penn State University, where he received a Centennial Fellowship Award. He was also honored as a Fellow of the American Meteorological Society.