Providing Insight Into Climate Change
Climate Change Science Essay page 7

The IPCC Hockey Stick

In its Third Assessment Report, the IPCC published the "Hockey Stick" graph from Mann, Bradley and Hughes (MBH 1998), which shows little change in temperatures for hundreds of years and then a sharp increase over the last hundred years. This temperature history was given bold prominence in the IPCC reports, was distributed to all Canadian households, and was used to support major policy decisions involving the expenditure of billions of dollars. The IPCC argues that there was little natural climate change over the last 1000 years, so the temperature change over the last 100 years is unusual and likely caused by human activities. A senior IPCC researcher said in an email, "We have to get rid of the Medieval Warm Period." Christopher Monckton says, "They did this by giving one technique, measurement of tree-rings from bristlecone pines, 390 times more weighting than other techniques, but didn't disclose this. Tree-rings are wider in warmer years, but pine tree rings are also wider when there's more carbon dioxide in the air: it's plant food. This carbon dioxide fertilization distorts the calculations. They said they had included 24 data sets going back to 1400. Without saying so, they left out the set showing the medieval warm period, tucking it into a folder marked 'Censored Data'. They used a computer model to draw the graph from the data, but two Canadians [Ross McKitrick and Stephen McIntyre] later found that the model almost always drew hockey-sticks even if they fed in random, electronic 'red noise', because it used a faulty algorithm." The MBH 1998 report was never properly peer reviewed before the IPCC used it in its publications.
See here for comments from Christopher Monckton.

McKitrick and McIntyre say in their paper, "the dataset used to make this construction contained collation errors, unjustified truncation or extrapolation of source data, obsolete data, incorrect principal component calculations, geographical mislocations and other serious defects. These errors and defects substantially affect the temperature index. The major finding is that the values in the early 15th century exceed any values in the 20th century. The particular hockey stick shape derived in the MBH98 proxy construction (a temperature index that decreases slightly between the early 15th century and early 20th century and then increases dramatically up to 1980) is primarily an artefact of poor data handling, obsolete data and incorrect calculation of principal components." See here for their paper.

The IPCC hockey stick is shown below, along with the corrected version. The error ranges are not shown here.

Corrected hockey stick

The dispute over the hockey stick prompted the United States Congress to investigate the matter. The US National Research Council (NRC) held public hearings and prepared a report in 2006 for the US House of Representatives Committee on Science. The NRC Report made no criticism of the McKitrick and McIntyre papers. The report concludes that "strip-bark samples should be avoided in temperature reconstructions." These strip-bark Bristlecone/Foxtail samples are responsible for the sharp increase in the graph in the twentieth century, but the growth spurt is not related to temperatures. The report also confirmed that Mann's algorithm, which used non-centered principal component analysis, mines hockey stick shapes from random red noise data, as previously shown by McKitrick and McIntyre, and it notes that "uncertainties of the published reconstructions have been underestimated."
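The decentering defect is easy to reproduce. The sketch below is a minimal, hypothetical illustration in Python (not Mann's actual code, and the sizes are made up): it generates trendless AR(1) "red noise" proxies, centers them on only a short late "calibration" window instead of the full record, and extracts the first principal component. Short centering preferentially weights any series whose late values happen to stray from their long-term mean, so the leading component tends to come out hockey-stick shaped even though the input contains no climate signal.

```python
# Hypothetical sketch: short-centered PCA applied to trendless red noise.
import numpy as np

rng = np.random.default_rng(0)
n_years, n_proxies, n_calib = 581, 70, 79  # illustrative sizes; late 79-year "calibration" window

# AR(1) red noise: x[t] = phi * x[t-1] + white noise (no climate signal)
phi = 0.9
shocks = rng.standard_normal((n_years, n_proxies))
proxies = np.zeros((n_years, n_proxies))
for t in range(1, n_years):
    proxies[t] = phi * proxies[t - 1] + shocks[t]

# Short centering: subtract the mean of only the final calibration window,
# rather than the full-series mean used in conventional PCA.
centered = proxies - proxies[-n_calib:].mean(axis=0)

# Leading principal component (time series of PC1 scores) via SVD
u, s, vt = np.linalg.svd(centered, full_matrices=False)
pc1 = u[:, 0] * s[0]

shaft, blade = pc1[:-n_calib], pc1[-n_calib:]
if shaft.mean() > blade.mean():  # orient so the blade points up
    pc1, shaft, blade = -pc1, -shaft, -blade

# A hockey stick appears as a large step from the shaft to the blade
print(f"step from shaft to blade: {(blade.mean() - shaft.mean()) / shaft.std():.1f} shaft-sd units")
```

Replacing the short-window mean with the full-series mean (conventional centering) makes the artificial hockey-stick shape disappear, which is the correction McKitrick and McIntyre demonstrated.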

Meanwhile, the US House of Representatives Committee on Energy and Commerce had independently commissioned a study from Edward Wegman, chairman of the NAS Committee on Applied and Theoretical Statistics and a Fellow of the Royal Statistical Society. The Wegman Report states, "Overall, our committee believes that Mann's assessments that the decade of the 1990s was the hottest decade of the millennium and that 1998 was the hottest year of the millennium cannot be supported by his analysis." It also states, "In general, we find the criticisms by [the McKitrick and McIntyre papers] to be valid and their arguments to be compelling. We were able to reproduce their results and offer both theoretical explanations (Appendix A) and simulations to verify that their observations were correct." The report also examined the social network of the group of scientists who publish temperature reconstructions, finding that they collaborate with each other and share proxy data and methodologies, so that the "independent" studies are not independent at all. See the Wegman Report here.

Both of these reports were made public six months before the IPCC began releasing the Fourth Assessment Report; however, AR4 makes no mention of the Wegman Report, gives only one citation of the NRC Report, and ignores the findings and recommendations of both reports.

David Holland wrote a comprehensive history and discussion of the hockey stick affair. See Holland's paper, "Bias and Concealment in the IPCC Process: The 'Hockey Stick' Affair and its Implications", published in Energy & Environment, October 2007, here.

David Holland says "it is scandalous that the WGI Chapter 6 authors ignored most of its [the NRC Report's] substantive findings. Despite the clear analysis in Wegman et al. showing the lack of independence between the various temperature reconstructions, the authors of AR4 WGI Chapter 6 persisted with their reliance on a spaghetti diagram of reconstructions in Figure 6.10(b) to continue to justify the claim that average Northern Hemisphere temperatures during the second half of the 20th century were likely the highest in at least the past 1,300 years."

Urban Heat Island Effects

The urban heat island effect is the effect that human activity has on local surface temperatures, making temperatures in or near urban centres warmer than in rural areas. It is caused by the heat-retaining properties of concrete and asphalt in urban areas, the turbulent mixing of the near-surface air layer by buildings, and the siting of temperature sensors near artificial heat sources.

Surface Temperature Trends in 47 California Counties

This graph shows the size of the effect on surface temperatures and the difficulty of obtaining an unbiased sample. The surface temperature trends determined from ground stations for the period 1940 to 1996 were averaged for each county. The trends were grouped by county population and plotted as closed circles, along with the standard errors of their means. The straight line is a least-squares fit to the closed circles. The points marked "X" are the six unadjusted station records selected by NASA GISS for use in their estimate of global temperatures. Note that 5 of the 6 selected stations are in populous counties. Note also that extrapolating the straight line to a county population of 10,000 gives a temperature trend of zero. See here.
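The zero-trend extrapolation follows directly from the least-squares fit. The sketch below uses hypothetical (population, trend) pairs with the general shape of the graph, not the actual county data, to illustrate solving the fitted line for the population at which the warming trend vanishes.

```python
# Illustrative only: hypothetical county data, not the California study's.
import numpy as np

pop = np.array([1e4, 1e5, 1e6, 1e7])        # county populations
trend = np.array([0.00, 0.04, 0.08, 0.12])  # deg C per decade (made up)

# The graph plots trend against population on a log scale, so fit
# trend as a linear function of log10(population).
slope, intercept = np.polyfit(np.log10(pop), trend, 1)

# Population at which the fitted line predicts a zero trend
pop_zero = 10 ** (-intercept / slope)
print(f"fitted zero-trend population: {pop_zero:,.0f}")  # ~10,000 here
```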

Here is an example of a weather station used by the IPCC to record temperature rise.

Temperature Trends of Major City Sites and Rural Sites

Peterson (2003) is an influential study, cited by the IPCC Fourth Assessment Report, purporting to show that the urbanization effect is negligible.

The IPCC relied heavily on this flawed study, in which Peterson states that "no statistically significant impact of urbanization could be found in annual temperatures." However, Steve McIntyre, using Peterson's data, shows that "actual cities have a very substantial trend of over 2 deg C per century relative to the rural network - and this assumes that there are no problems with rural network - something that is obviously not true since there are undoubtedly microsite and other problems." Peterson uses two lists of stations in his study, one labelled Urban and one labelled Rural. However, an analysis of the lists shows that the Urban list includes many rural sites and the Rural list includes many urban sites. These results are discussed in a Climate Audit article here.

Most scientists agree that many temperature station measurements are contaminated by urban heat island effects, but they argue that the major global temperature indexes are adjusted to correct for these effects. There is an "Urbanization Adjustment" to correct for the effects of urbanization, a "Time of Observation Bias Adjustment" to correct for changes to the time of day when measurements are taken, and a "Coverage Adjustment" to account for the loss of measurement stations. These adjustments are intended to produce a record of what the temperatures would be if nobody lived near the measurement stations. If the adjustments were adequate, there should be no statistically significant correlation between the temperature record and socioeconomic indicators.
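That adequacy test can be stated concretely: regress the adjusted warming trends on a socioeconomic indicator and check whether the coefficient differs significantly from zero. The sketch below is a hedged, minimal illustration of the idea using hypothetical numbers; it is not McKitrick and Michaels' actual model, which used many indicators and a fuller battery of tests.

```python
# Hypothetical illustration of the adequacy test, not the published model.
import numpy as np
from scipy import stats

# One adjusted warming trend and one indicator (e.g. GDP growth) per grid cell
trend = np.array([0.31, 0.18, 0.25, 0.40, 0.12, 0.28, 0.35, 0.22])  # deg C/decade
gdp_growth = np.array([3.1, 1.2, 2.0, 4.5, 0.8, 2.6, 3.8, 1.9])     # percent/yr

res = stats.linregress(gdp_growth, trend)
print(f"slope = {res.slope:.3f}, p-value = {res.pvalue:.4f}")
# A p-value below 0.05 indicates residual socioeconomic contamination in the
# adjusted record; a large p-value is consistent with adequate adjustment.
```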

Ross McKitrick and Patrick Michaels published a paper in 2004 in which they analyse the pattern of warming over the Earth's land surface compared to local economic conditions. They found a statistically significant correlation between the adjusted temperature data and economic development, meaning that the adjustments are not adequate to remove the urban heat island effects. They conclude "If the contamination were removed, we estimated the average measured warming rate over land would decline by about half."

Dutch meteorologists Jos de Laat and Ahilleas Maurellis, using a different testing methodology, came to similar conclusions. They showed that there is a statistically significant correlation between the spatial pattern of warming in the adjusted temperature data and the spatial pattern of industrial development, and they concluded that this contamination adds a large upward bias to the measured global warming trend. They also showed that climate model predictions exhibit no such correlation between temperature and industrial development.

The IPCC acknowledges the correlation between the warming trends and socioeconomic development, but dismisses it as a mere coincidence due to unspecified atmospheric circulation changes. This nonsense claim contradicts the IPCC's widely advertised claim that recent warming cannot be attributed to natural causes, and the de Laat and Maurellis research shows it to be false.

McKitrick and Michaels published an updated paper in December 2007 using a larger data set with a more complete set of socioeconomic indicators. They discussed two types of contamination: anthropogenic surface processes, which are changes to the landscape due to urbanization or agriculture, and inhomogeneities, i.e. equipment changes, missing data, poor quality control, etc. They showed that the spatial pattern of warming trends is tightly correlated with indicators of economic activity, and they present a battery of statistical tests to show that the result is not a fluke or a spurious correlation. They conclude, "The average trend at the surface in the post-1980 interval would fall from about 0.30 degrees (C) per decade to about 0.17 degrees." Removing the net warming bias due to urban heat effects in the surface temperature data could thus explain as much as half the recent warming over land.

Bias of IPCC Temperature Data

The graph above is from the McKitrick and Michaels December 2007 paper. Each square is colour-coded to indicate the size of the local bias. Blank areas indicate that there was no data available. See the Background Discussion on the paper here.

An audit by researcher Steve McIntyre reveals that NASA has made urban adjustments of temperature data in its GISS temperature record in the wrong direction. NASA applied a "negative urban adjustment" to 45% of the urban station measurements (where adjustments are made), meaning that the adjustments make the warming trends steeper. The urban adjustment is supposed to remove the effects of urbanization, but the NASA negative adjustments increase the urbanization effects. The result is that the surface temperature trend utilized by the Intergovernmental Panel on Climate Change (IPCC) is exaggerated. See here.

The website www.surfacestations.org was created by Anthony Watts in response to the realization that very little physical site survey data exists for the United States Historical Climatology Network (USHCN) of surface stations. Volunteers do hands-on site surveys to photograph and document all 1221 USHCN climate stations in the USA. As of February 2009, 854 of the 1221 stations in the USHCN network had been examined. Each site is assigned a site quality rating of 1 through 5 based on the Climate Reference Network Rating Guide. Only 11% of the stations are in suitable locations, while 69% are within 10 m of an artificial heat source. Below is a picture of a poorly situated station.

A poorly situated station in Ohio

The website Climate4you has many graphs of the urban heat island effect. The graphs were plotted from temperature traverses made by a vehicle traveling across cities.

Oslo UHI experiment

The graph above shows temperature measurements taken while driving from west to east through the city of Oslo, Norway. The Oslo heat island effect during this experiment was about 8 deg C. See here.

A study by Anthony Watts evaluated the warming trends of NOAA-compliant and non-compliant temperature monitoring stations using the recently WMO-approved Siting Classification System. The analysis demonstrates that the reported 1979-2008 U.S. temperature trends are spuriously doubled. The new assessment, for the years 1979 to 2008, yields a trend of +0.155 deg C per decade from the high quality sites, a +0.248 deg C per decade trend for poorly sited locations, and a trend of +0.309 deg C per decade after NOAA adjusts the data, as shown in the graphic below.

NOAA US adjustment
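For reference, decadal trend figures like the +0.155 and +0.248 deg C per decade quoted above are ordinarily computed as least-squares slopes of the temperature anomaly series. The sketch below shows that arithmetic on a hypothetical series constructed to carry roughly the well-sited trend; it is not the actual station data.

```python
# Hypothetical series, built to carry roughly a +0.155 deg C/decade trend
import numpy as np

years = np.arange(1979, 2009)                # 1979-2008 inclusive
rng = np.random.default_rng(1)
anomaly = 0.0155 * (years - 1979) + rng.normal(0, 0.1, years.size)

slope_per_year = np.polyfit(years, anomaly, 1)[0]   # least-squares slope
print(f"trend: {slope_per_year * 10:+.3f} deg C per decade")
```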

The graph "Surface and Troposphere Temperature Trends" presented in the Heating of the Troposphere section of this essay shows temperature trends of the land, of the land and sea, and of the troposphere in the tropics. The land surface temperature trend has the highest rate of increase because it is contaminated by the heat island effect. The land and sea surface temperature trend is lower than the land trend because the sea temperature data does not have any heat island effect. The troposphere shows the lowest rate of temperature increase. We know that the CO2 theory of climate change requires the troposphere to warm faster than the surface, but the opposite has happened. It is illogical to believe that CO2 is the primary temperature driver and concurrently believe that the surface measurements used to the IPCC are accurate. If the surface temperature data were fully adjusted to remove the effects of urbanization by reducing the warming rate by half, it would closely match the troposphere warming trend.

A study (Murray & Heggie 2016) compared the national energy consumption (which is converted to heat) to average national temperatures for the United Kingdom and Japan. 

UK energy consumption vs temperature

The chart on the left shows that the climate models do a very poor job of predicting temperatures in the U.K. region (r2 = 0.10). The right chart shows that energy consumption explains the measured temperatures very well (r2 = 0.89). The abstract says, "It is clear that the fluctuation in [temperature] are better explained by energy consumption than by present climate models." See the paper here. This provides further evidence that the major temperature indexes used to track climate change are contaminated by the effects of economic development, biasing estimates of climate sensitivity and the social costs of CO2 emissions high.
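The r2 comparison is simple to reproduce in outline. The sketch below uses entirely hypothetical numbers (not Murray and Heggie's data) to show the procedure: regress observed temperature separately on model-predicted temperature and on national energy consumption, then compare the two r-squared values.

```python
# Hypothetical data illustrating the two-regression comparison
import numpy as np
from scipy import stats

observed = np.array([9.1, 9.3, 9.0, 9.6, 9.4, 9.8, 9.5, 9.9])    # deg C
model_pred = np.array([9.4, 9.2, 9.5, 9.3, 9.6, 9.4, 9.7, 9.5])  # weakly related
energy = np.array([210, 225, 205, 240, 230, 255, 238, 260])      # tracks observed

r2_model = stats.linregress(model_pred, observed).rvalue ** 2
r2_energy = stats.linregress(energy, observed).rvalue ** 2
print(f"r^2 models: {r2_model:.2f}   r^2 energy: {r2_energy:.2f}")
```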

Falsified Historical CO2 Measurements

The IPCC uses a CO2 concentration history that shows a low pre-industrial CO2 content which increases during the industrial era. The IPCC may have used corrupted CO2 data in its analysis of climate change. Its conclusions and projections are all founded on the assumption, derived from ice core studies, of low CO2 concentrations in the pre-industrial atmosphere. Unfortunately, ice cores do not form a closed system. In the highly compressed deep ice, CO2 combines with liquid water to form gas hydrates, or clathrates, which are tiny crystals. When the ice core is brought to the surface, the pressure falls, causing the clathrates to decompose to the gas form, exploding in the process as if they were microscopic grenades and forming tiny cracks in the ice. Other cracks are formed by the ice decompression. Gas escapes through these cracks as the ice core is brought to the surface, but since CO2 forms clathrates at lower pressures than other gases, CO2 is preferentially lost, leading to depletion of CO2 in the gas trapped in the ice core. Consequently, the measured CO2 concentration from deep ice cores is less than the CO2 concentration of the originally trapped air.

Atmospheric CO2

The graph on the left shows the IPCC history of CO2 concentration in air.





Data from shallow ice cores, such as those from Siple, Antarctica, show that the CO2 concentrations in pre-industrial ice (from depths too shallow for clathrate formation) are much higher than the concentration measured at Mauna Loa, Hawaii in 1960.

Actual Siple, Antarctica Ice Core and Mauna Loa Data

Note that the measured concentration declines with increasing load pressure and depth.

Shifted Siple, Antarctica Ice Core and Mauna Loa Data

As the actual measurements show 328 ppm for air in ice deposited in 1890 AD, not the 290 ppm required to fit the IPCC hypothesis of human-caused increasing CO2 concentration and global warming, the average age of the air was arbitrarily decreed to be exactly 83 years younger than the ice in which it was trapped.

The "corrected" ice data were then smoothly aligned with the Mauna Loa record and reproduced in countless publications as the famous Siple curve. Only thirteen years later, in 1993, did glaciologists attempt to prove the age assumption experimentally, and they failed.
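The arithmetic of the decreed age shift is simple, as this illustrative sketch shows; the ppm value and shift come from the text above, while the Mauna Loa comparison value is approximate.

```python
# Effect of decreeing the trapped air 83 years younger than the enclosing ice
ice_year = 1890       # year the ice formed
co2_in_ice = 328      # ppm measured in that ice (per the text)
age_shift = 83        # decreed age difference, in years

air_year = ice_year + age_shift
print(f"{co2_in_ice} ppm re-dated from {ice_year} to {air_year}")
# Re-dated to 1973, 328 ppm lands near the Mauna Loa reading for that year
# (roughly 330 ppm), so the shifted ice record joins the modern curve smoothly.
```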



CO2 Measurements between 1800 and 1955

IPCC modellers ignored the direct measurements of CO2 concentration indicating that the 19th century CO2 concentration was 335 ppm.

The encircled values were arbitrarily selected by Callendar to arrive at an estimate of 292 ppm as the average 19th century CO2 concentration.

A study of stomatal frequency in fossil leaves from Holocene lake deposits in Denmark shows that the atmospheric CO2 level was 333 ppm 9400 years ago and 348 ppm 9600 years ago, falsifying the concept of a stable, low CO2 concentration in the air until the advent of the industrial revolution.
See here for more information.

Recently, Ernst-Georg Beck has summarized 90,000 accurate chemical analyses of CO2 in air since 1812. The historic chemical data reveal that changes in CO2 track changes in temperature, and therefore climate, in contrast to the simple, monotonically increasing CO2 trend depicted in the post-1990 literature on climate change. Since 1812, the CO2 concentration in northern hemispheric air has fluctuated, exhibiting three high-level maxima around 1825, 1857 and 1942, the latter showing more than 400 ppm.

Beck 2007 CO2 measurements

Between 1857 and 1958, the Pettenkofer process was the standard analytical method for determining atmospheric carbon dioxide levels, and it usually achieved an accuracy better than 3%. These determinations were made by several scientists of Nobel Prize level distinction. Following Callendar (1938), modern climatologists have generally ignored the historic determinations of CO2, despite the techniques being standard textbook procedures in several different disciplines. The chemical methods were discredited as unreliable, and only the few results that fit the assumption of a climate-CO2 connection were retained.

Ernst-Georg Beck calls the falsification of the CO2 record "The greatest scandal in the modern history of science".

See here for a summary of the Beck paper, or here for the paper.

See here for Beck's Berlin presentation of May 30, 2007.

See here for CO2: The Greatest Scientific Scandal of Our Time, by Zbigniew Jaworowski, Spring/Summer 2007 21st CENTURY Science & Technology.

In January 2009, Japan launched the IBUKI satellite to monitor CO2 and methane spectral bands around the world, to establish exactly where the world's biggest sources and sinks of greenhouse gases are. The results from the Japan Aerospace Exploration Agency (JAXA) show that industrialized nations appear to be absorbing the carbon dioxide emissions from the Third World. The satellite data show that levels of CO2 are typically lower in air over developed countries than in air over developing countries. Areas with higher net emissions (man-made plus natural emissions less natural absorption into sinks) would show higher CO2 concentrations. CO2 levels are lower than average in industrial countries, as indicated by the blue dots. The highest net emissions, at least on this graph, are predominantly in China and central Africa. See here.

CO2 map

No Consensus

Author Michael Crichton warned of the dangers of "consensus science" in a 2003 speech. He says "Consensus is the business of politics. Science, on the contrary, requires only one investigator who happens to be right, which means that he or she has results that are verifiable by reference to the real world. In science consensus is irrelevant. What is relevant is reproducible results. The greatest scientists in history are great precisely because they broke with the consensus."

In an open letter to Prime Minister Stephen Harper, 61 prominent scientists called for an open climate science review. The letter states "Observational evidence does not support today's computer climate models, so there is little reason to trust model predictions of the future. Significant advances have been made since the protocol was created, many of which are taking us away from a concern about increasing greenhouse gases. If, back in the mid-1990s, we knew what we know today about climate, Kyoto would almost certainly not exist, because we would have concluded it was not necessary. Global climate changes all the time due to natural causes and the human impact still remains impossible to distinguish from this natural "noise.""

The Petition Project was organized by the Oregon Institute of Science and Medicine.
The petition states in part:
 "There is no convincing scientific evidence that human release of carbon dioxide, methane, or other greenhouse gasses is causing or will, in the foreseeable future, cause catastrophic heating of the Earth's atmosphere and disruption of the Earth's climate. Moreover, there is substantial scientific evidence that increases in atmospheric carbon dioxide produce many beneficial effects upon the natural plant and animal environments of the Earth."

So far (May 2009) the petition has received 31,478 signatures. Signatories are approved for inclusion in the Petition Project list if they have obtained formal educational degrees at the level of Bachelor of Science or higher in appropriate scientific fields. All of the listed signers have formal educations in fields of specialization that suitably qualify them to evaluate the research data related to the petition statement. Many of the signers currently work in climatological, meteorological, atmospheric, environmental, geophysical, astronomical, and biological fields directly involved in the climate change controversy.  See here.

The Heartland Institute conducted an international survey of 530 climate scientists in 2003. The survey asked respondents to rate the statement that the current state of scientific knowledge is developed well enough to allow for a reasonable assessment of the effects of greenhouse gases. Two-thirds of the scientists surveyed (65.9 percent) disagreed with the statement, with nearly half (45.7 percent) scoring it a 1 or 2, indicating strong disagreement. Only 10.9 percent scored it a 6 or 7, indicating strong agreement. See here for the full survey results.

In an Open Letter to the Secretary-General of the United Nations and the heads of state of many nations, dated December 13, 2007, and titled "UN Climate Conference Taking the World in Entirely the Wrong Direction", more than 100 specialists from around the world, many of them leading scientists, state that "It is not possible to stop climate change, a natural phenomenon that has affected humanity through the ages." The letter states that recent climate changes have been well within the bounds of known natural variability. It further states that climate models cannot predict climate, that there has been no global warming since 1998, that the IPCC has ignored much significant new peer-reviewed research that has cast even more doubt on the hypothesis of dangerous human-caused global warming, and that attempts to cut emissions will slow development and are likely to increase human suffering from future climate change rather than decrease it. See here for the letter as published by the National Post.

A report to the US Senate lists 400 qualified scientists from around the world who dispute the claims by IPCC and others, that "climate science is settled" and that there is a "consensus". See here.

There is no consensus on whether or to what degree human activities are causing the problem, or even whether there is a problem. Global cooling, widely predicted in the 1970s, would have been much more dangerous than warming.
