By: Ken Gregory, BASc.
CliSci # 448 2026-04-14
The Data Heat Island Effect: Quantifying the Impact of AI Data Centers
A new preprint paper investigates the impact of the continuing proliferation of artificial intelligence (AI) data centres on the local environment, focusing on temperature changes. The paper says “Here, we focus our attention on the heat dissipation of AI hyperscalers.” A hyperscale data centre is a distributed computing environment and architecture designed to provide extreme scalability to accommodate workloads of massive scale. The authors used remote sensing platforms to obtain a robust assessment of the land surface temperature (LST) increase recorded over the last two decades (from 2002 to 2024) in the areas surrounding AI data centres globally. LST trends were assessed at over 6,700 data centres located outside of dense urban areas. The abstract says “We estimate that the land surface temperature increases by 2°C on average after the start of operations of an AI data centre, inducing local microclimate zones, which we call the data heat island effect.” Some locations show an increase of up to 9°C, and the minimum increase is 0.3°C. The data heat island effect falls to 30% of its peak intensity within 7 km of the data centres. Data centres are expected to be one of the most power-hungry activities of the next decade. The paper shows the land surface temperature increase induced by the AI data centres relative to the average LST recorded over those regions for the 5 years prior to each data centre's start of operations. Using population maps, the authors estimate that up to 343 million people could be affected by data heat island effects worldwide.
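The reported fall-off (30% of peak intensity about 7 km out) is consistent with a simple exponential-decay picture. The sketch below is a rough illustration, not the paper's actual model; it just solves for the e-folding length implied by that one figure:

```python
import math

# Assume intensity decays as I(d) = I0 * exp(-d / L) with distance d in km.
# Choose L so that intensity falls to 30% of its peak at d = 7 km,
# matching the figure quoted from the paper.
L = 7 / math.log(1 / 0.30)                  # e-folding length, ~5.8 km
intensity_ratio_at_7km = math.exp(-7 / L)   # recovers 0.30 by construction
```

Under this assumed form, roughly half the excess warmth would already be gone about 4 km from the site.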
Anthony Watts wrote this article about the paper. He wrote “The key variable being analyzed is not air temperature in the meteorological sense, but land surface temperature derived from satellite observations. That distinction matters because land surface temperature is extremely sensitive to local surface characteristics. Replace vegetation with buildings, pavement, and industrial equipment, and the measured surface temperature will rise, regardless of whether the underlying atmospheric conditions have changed in any meaningful way.” The classic urban heat island effect (UHIE) typically falls in the range of 4 to 6°C. In that context, the signal around data centres looks like a subset of the classic UHIE.
Social Vulnerability and Mortality Attributable to Non-optimal Temperature in the United States
The objective of this study was to quantify county-level mortality attributable to non-optimal temperature in the United States and examine whether social vulnerability modifies this relationship. The authors analyzed 1,514 US counties representing 91% of the 2010 US adult population of ages 25 to 84. Monthly mortality counts for the period 2000 to 2020 were obtained from the U.S. Centers for Disease Control and Prevention. The minimum mortality temperature (MMT) was determined, and deaths were classified as heat-related or cold-related. Death rates increased at temperatures above and below the MMT. The US pooled MMT was 22.7 °C. Most attributable deaths occurred at temperatures below the MMT. The study reported that nationally, an estimated 72,361 cold-attributable and 6,129 heat-attributable deaths occurred annually, equivalent to 40.1 and 3.4 deaths per 100,000 person-years. This is 11.8 cold-related deaths per heat-related death. The most vulnerable counties had a 40% higher death rate than the least vulnerable counties. These counties also experienced higher MMTs.
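The cold-to-heat ratio follows directly from the study's annual estimates; a quick check:

```python
cold_deaths, heat_deaths = 72_361, 6_129   # annual US estimates from the study
ratio = cold_deaths / heat_deaths          # ~11.8 cold deaths per heat death

# The per-capita rates give the same ratio: 40.1 vs 3.4 per 100,000 person-years
rate_ratio = 40.1 / 3.4                    # ~11.8
```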
The Science News section of the March 2026 Friends of Science Quarterly Newsletter featured a review of a similar study of temperature-related deaths in the U.S., but one reporting only cardiovascular disease (CVD) deaths. Both studies were published in the first quarter of 2026, but in different journals, and they shared the same lead author and six of the contributing authors. The papers used the same data source and analysed deaths at the US county level over 2000 to 2022. The study discussed above of all non-optimal temperature-related deaths reports a ratio of 11.8 cold-related to heat-related deaths, while the CVD study reported a ratio of about 18 cold-related to heat-related CVD deaths in the US.
New Book: The Frontier of Climate Science: Solar Variability, Natural Cycles and Model Uncertainty
Nicola Scafetta, a climate science scholar at Duke University, has published a new book about climate science. He posted an article about his book on Judith Curry’s blog which provides a synopsis of the book and his purposes for writing it. Scafetta has investigated the interplay between climate dynamics and solar variability in the context of complex systems for over 20 years. In the book’s synopsis, he asks 3 questions:
- How well do we truly understand Earth’s climate?
- What natural forces remain beyond our grasp?
- Is Net Zero the only viable path forward?
The Frontier of Climate Science explores climate dynamics through physics, complex systems, and astronomy, synthesizing several decades of peer-reviewed research. The book critically reviews the scientific foundations of modern climate theory, the evolution of IPCC assessments, and the limits of global climate models (GCMs) when confronted with observations. From this evidence emerges a balanced view of climate risk, favoring pragmatic adaptation over narrowly defined policy pathways such as Net Zero.
The Met Office Is Inflating UK Maximum Temperature Records
Chris Morrison wrote for the DailySceptic “Convincing statistical proof has emerged over the last year to show that the UK Met Office is inflating maximum temperature readings to create Net Zero-supporting climate alarm. Over the last 30 years, the Met Office has produced the vast majority of its data from unnaturally heat-ravaged ‘junk’ sites using newly-installed accurate electronic devices able to record one minute heat spikes.” Dr. Eric Huxter examined 340 daily maximum temperature highs recorded across 96 Met Office stations and found that these sites showed short heat spikes averaging around 1.1 °C. He compared the heat spike statistics of these stations with those of a pristine CIMO Class 1 control station. A CIMO Class 1 weather station is the highest standard of meteorological siting, defined by the World Meteorological Organization to ensure minimal environmental interference (such as buildings or trees). Morrison wrote “Most of these spikes occurred around daily ‘records’ in junk CIMO Class 3, 4 and 5 locations. These sites have internationally recognised ‘uncertainties’ or possible errors of 1°C, 2°C and 5°C respectively.” Huxter reviewed a year of one-minute temperature measurements from the control station. Comparing the control station statistics with the 340 heat spikes at the largely junk sites that triggered the record high temperatures, he found the two markedly different: the junk stations were much more likely to show more and larger heat spikes. The heat spikes may come from nearby jet engines, solar farms or buildings. The electronic temperature sensors respond much more quickly than the old liquid-in-glass thermometers, so they catch brief, exaggerated spikes rather than true ambient air temperatures. Class 4 and 5 stations have increased in number over the last few years and now make up 81% of the UK’s network of 400 stations.
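A transient spike of the kind Huxter describes can be flagged in one-minute data by comparing each reading against a rolling baseline. The sketch below uses synthetic data, not Met Office records, and an assumed 0.5 °C flagging threshold:

```python
import numpy as np

rng = np.random.default_rng(3)
minutes = np.arange(1440)  # one day of one-minute readings

# Smooth ambient diurnal cycle plus sensor noise, mimicking a
# fast-response electronic thermometer
ambient = 15 + 5 * np.sin((minutes - 360) / 1440 * 2 * np.pi)
temps = ambient + rng.normal(0, 0.1, 1440)
temps[700] += 1.1  # inject one brief 1.1 °C spike

# Baseline: centred 61-minute rolling mean; flag minutes that exceed
# it by more than 0.5 °C (interior only, avoiding convolution edges)
kernel = np.ones(61) / 61
baseline = np.convolve(temps, kernel, mode="same")
interior = np.arange(30, 1410)
spikes = interior[temps[interior] - baseline[interior] > 0.5]
```

A slow liquid-in-glass thermometer would effectively apply a filter like `baseline` itself, never registering the one-minute excursion as a daily maximum.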
Scotland’s Energy Crisis – Lack of Inertia
On 9th March 2026, Net Zero Watch brought energy system expert Kathryn Porter to Edinburgh to speak to an invited audience ahead of the Scottish elections in May. Her speech discussed the nature of a power grid in general and the problems with the power grid in Scotland. She said “Before a generator connects to the grid it must match the grid’s voltage, frequency and phase – that is the peaks and troughs of the waves line up. This process is known as synchronisation.” She explained that ‘inertia’ is a property whereby a conventional power station resists changes in the alternating current frequency. The large mass of the rotating turbines and generators resists changes in rotation speed and helps to keep the frequency stable. The frequency needs to be stable at 50 Hz in Europe and 60 Hz in the Americas. “Conventional generators also have electromagnetic inertia which means they also support voltage. Voltage can be thought of as the electrical pressure that pushes current through the network. If voltage rises too high or falls too low equipment can be damaged.”
Wind and solar generators produce direct current, which is converted to alternating current using electronic devices known as inverters. These inverters are “grid following”, i.e. they cannot create the current and voltage wave themselves. There are efforts to develop grid-forming inverters that would do this, but their development faces big challenges, and so far no such devices are in operation anywhere in the world actually forming a grid. Porter confirmed that the blackout in Spain in April 2025 was caused by a lack of inertia due to too much wind and solar power. Porter wrote “As inverter-based generation increases, frequency and voltage oscillations are becoming more common, and the grid becomes weaker.” Porter also discussed energy security related to oil and gas.
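The stabilizing role of inertia can be illustrated with the textbook swing-equation approximation for the initial rate of change of frequency (RoCoF) after a sudden loss of generation. The numbers below are illustrative, not taken from Porter's talk:

```python
def rocof(lost_gw, system_gw, inertia_h, f0=50.0):
    """Initial rate of change of frequency (Hz/s) after suddenly losing
    lost_gw of generation on a grid of size system_gw, where inertia_h is
    the system inertia constant H in seconds (swing-equation approximation)."""
    return (lost_gw / system_gw) * f0 / (2 * inertia_h)

# Losing 1 GW on a 20 GW, 50 Hz grid: a high-inertia grid (H = 5 s)
# sags at 0.25 Hz/s, while a low-inertia, inverter-heavy grid (H = 2 s)
# sags two and a half times as fast, leaving less time for protection
# and reserves to act.
high_inertia = rocof(1, 20, 5)   # 0.25 Hz/s
low_inertia = rocof(1, 20, 2)    # 0.625 Hz/s
```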
CliSci # 447 2026-03-07
DoE-DeepDive: Tornadoes, Flooding and Droughts, oh my!
The Climate Discussion Nexus (CDN) published a summary of U.S. tornadoes, flooding and drought from the U.S. Department of Energy report about the impacts of greenhouse gas emissions on the U.S. climate. A weak tornado in a rural area before 1990 would likely not be observed. Now, with radar coverage everywhere, all weak tornadoes are recorded. The summary says “But strong to violent tornadoes were always noticed so the count of those should indicate if there are trends.” A chart of U.S. strong-to-violent tornado counts shows a trend line declining from a count of 60 in 1950 to 27 in 2024, or 4.4 tornadoes per decade. Another graph of U.S. weak tornado counts shows a downward trend from 1990 to 2024, but a strong upward trend from 1950 to 1989. CDN wrote “Once the radar systems were in place the apparent trend disappears, strongly suggesting there were as many tornadoes prior to 1990 but for the reasons listed above, they just weren’t reported.”
For floods in the U.S., some areas show increasing and some show decreasing floods. There is no U.S.-wide trend in floods. The IPCC says there is “low confidence” in any trend on a global scale. Droughts in the U.S. were the most severe in the 1930s in the historical record. The paleoclimate record indicates far worse droughts in previous centuries. “The percentage of the US classified each month as “Very Dry” has declined over 1895 to 2025, according to NOAA.”
Stefani on the Sun vs. CO2 as Climate Drivers
Andy May wrote this review of a new paper by Frank Stefani that compares the solar “aa” index and CO2 emissions to global sea surface temperatures (SST). The solar “aa” index, a measure of solar-geomagnetic coupling, has been measured since 1868 and varies with changing sunspots and the solar wind. The author performs a double regression of the SST data on the geomagnetic aa index and the binary logarithm of CO2. The regression assigns weighting factors to the solar index and the CO2 forcing to best fit the SST data. Stefani says that the solar aa index can simulate the SST from 1850 to 1990. After 1990, the role of CO2 in the regression analysis increases significantly.
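The double-regression approach amounts to an ordinary least-squares fit of SST on two predictors at once. The sketch below runs such a fit on synthetic series (not Stefani's actual data) and recovers the planted weighting factors:

```python
import numpy as np

rng = np.random.default_rng(0)
years = np.arange(1868, 2021)

# Synthetic stand-ins: a cyclical aa index and exponentially rising CO2
aa = 20 + 5 * np.sin((years - 1868) / 11 * 2 * np.pi) + rng.normal(0, 2, years.size)
co2 = 280 * 1.002 ** (years - 1868)

# Build an SST series with known weights, then try to recover them
sst = 0.02 * aa + 0.8 * np.log2(co2 / 280) + rng.normal(0, 0.05, years.size)

# Double regression: SST on [aa, log2(CO2), intercept]
X = np.column_stack([aa, np.log2(co2), np.ones_like(aa)])
coef, *_ = np.linalg.lstsq(X, sst, rcond=None)
# coef[0] ~ 0.02 (solar weight), coef[1] ~ 0.8 (CO2 weight)
```

The binary logarithm is the natural choice here because CO2 forcing is logarithmic in concentration, so the coefficient on log2(CO2) reads directly as warming per doubling.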
Stefani then calculated the transient climate response (TCR) of CO2 by subtracting the solar aa contribution from the SST and regressing the remaining temperature on the CO2 data, and compared his result to several other studies. The TCR is defined as the increase in temperature resulting from a 1% per year increase in CO2 up to a doubling of the CO2 concentration. However, the actual rate of increase in CO2 over the last 50 years was about 0.55% annually. Stefani made no attempt to correct for the difference between the actual CO2 increase and the 1%/yr in the TCR definition. He said the value he calculated is a “type of TCR”. OK, but it can’t be compared to other papers that use the usual TCR definition!
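The gap between the two growth rates is substantial, as a quick calculation of the implied doubling times shows:

```python
import math

# Years for CO2 to double at the TCR-definition rate (1%/yr) versus
# the observed rate over the last 50 years (~0.55%/yr)
t_definition = math.log(2) / math.log(1.01)    # ~69.7 years
t_observed = math.log(2) / math.log(1.0055)    # ~126.4 years
```

At the observed rate, doubling takes nearly twice as long as the TCR definition assumes, which is why a “type of TCR” fitted to real-world CO2 growth is not directly comparable to the standard quantity.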
‘Internal Noise’ And Volcanic Forcing Can Trigger 10-15°C Greenland Warming Within Decades
During the last glacial period from about 115,000 to 11,700 years ago Earth’s climate had abrupt shifts between cold and relatively warm phases, each lasting hundreds to thousands of years. The rapid transitions from cold to warm conditions are known as Dansgaard-Oeschger (DO) warming events. DO events shouldn’t be confused with interglacial events. Ice-core analyses from Greenland reveal that DO events involved dramatic temperature increases of 10 °C to 15 °C occurring within just a decade or two. This paper uses computer models with realistic volcanic forcing to investigate how volcanoes can trigger DO events. The paper’s abstract says “These simulations are constrained by sulfate records from ice cores, which help estimate the timing of past major eruptions. We investigate how volcanic eruptions may have occasionally triggered abrupt climate change during the last glacial period. Our results show that very large equatorial eruptions can induce large changes in the Atlantic Meridional Overturning Circulation via atmospheric and ocean circulation changes and air-sea buoyancy fluxes, potentially pushing the climate system between persistent warm and cold states lasting millennia.” The simulations also show that unforced natural climate variability can modulate the likelihood of a transition occurring under volcanic forcing. h/t notrickszone.com
Is a 1.1 °C Rise in a Century Unusual?
The global average temperature rise was about 1.1 °C in the post-industrial period. Many people claim that this constitutes a climate emergency. This paper set out to answer the question “Is a 1.1 °C rise in a century an unusual event in a record made at a single location in a consistent manner over a long period?” If this temperature rise is unusual, then it is reasonable to determine the cause of the rise. If the cause is determined to be mostly human-caused, then it would be reasonable to determine the costs and benefits of reducing the cause. The paper says “On the other hand, if it is not unusual, then we must question the relevance of seeking attribution.” The author used the temperature record derived from the EPICA-Vostok ice core dataset to answer this question. EPICA (European Project for Ice Coring in Antarctica) is a multi-national project for deep ice core drilling in Antarctica.
The paper’s abstract says “The answer is surprising. By considering interglacial onsets and decays as well as intermediating Ice Ages, it turns out that a rise of this amount would have been considered unusual more than 200,000 years ago, but this rise is not unusual in the current interglacial which started some 20,000 years ago with around 16% of all centuries since the last Ice Age exhibiting a temperature rise of at least 1.1 °C. None of these could have anthropogenic components as they pre-dated the industrial era. This result suggests that attempts to partition the current rise into anthropogenic and nonanthropogenic components are questionable given that it is not even unusual.” The author wrote “In other words, a rise of 1.1 °C in the current interglacial is not significant at any accepted level (normally taken as < 0.05).”
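The paper's 16% figure comes from counting century-scale rises in the ice-core record. The sketch below shows the counting procedure on a synthetic random-walk series standing in for the interglacial record; the 1.1 °C threshold is the paper's, everything else is illustrative:

```python
import numpy as np

rng = np.random.default_rng(1)

# Synthetic temperature series, one value per century over ~20,000 years,
# as a stand-in for the EPICA-Vostok interglacial record (NOT real data)
temps = np.cumsum(rng.normal(0, 0.6, 200))

# Century-over-century changes, and the fraction at or above 1.1 °C
rises = np.diff(temps)
frac_large_rises = np.mean(rises >= 1.1)
```

Applied to the actual ice-core record, this fraction is what the paper reports as about 16% of centuries since the last Ice Age.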
Junk Science: Grasslands Could Shrink by Half by 2100 Due to Warming
This article by Anthony Watts argues that an Earth.org claim, that grasslands could shrink by half by 2100 as climate change intensifies, is nonsense. Growing solar farms might threaten grasslands, but CO2 emissions will not! The Earth.org article is based on the very alarmist Potsdam Institute’s modeling study, which projects a contraction of what it calls “safe climatic space” for grazing. Watts wrote, “The study defines narrow thresholds for temperature, rainfall, humidity, and wind speed and then projects that future warming will push large regions outside those bounds. That is not measurement, it is model-driven extrapolation layered onto emissions scenarios extending to 2100. No global dataset is presented showing that grasslands have begun to shrink by anything close to these figures under the roughly 1.2°C of warming experienced since the late nineteenth century.” There is no evidence in the satellite data that warming is causing any negative effects on grasslands. In fact, NASA satellite data show that elevated atmospheric CO₂ has enhanced photosynthesis and improved plant water-use efficiency, particularly in semi-arid regions where many grasslands are located. Laboratory and field studies indicate that the CO2 fertilization effect generally increases the growth rate of grasses by 20% to 100%, depending heavily on the grass type, water availability, and nitrogen in the soil. Despite the relevance of the CO2 fertilization effect, the Earth.org article does not mention it. Watts wrote “Data from satellite measurements indicates that the globe has increased its green area about 5 percent over the first 20 years of the twenty-first century. The Sahara Desert is becoming smaller as a result.” Grasslands have persisted through warmer and cooler periods throughout the Holocene. Different grasses are adapted to various temperature ranges. If warming does cause a grass type to decline, it would be replaced by other grasses more suitable to that climate.
The Potsdam study doesn’t consider this outcome.
CliSci # 446 2026-02-23
Clearing Up Some Misconceptions about the DoE Report
Dr. Ross McKitrick wrote an article to correct misconceptions found in the media about a report on the impacts of greenhouse gas emissions on the climate of the United States. The DoE climate report was requested by the U.S. Energy Secretary and a draft report was published in July 2025. The report was written by Ross McKitrick, Judith Curry, John Christy, Steve Koonin and Roy Spencer; the “climate group”. Two environmental groups sued the Department of Energy, alleging the climate group wasn’t legally constituted. A judge ruled against the DoE and ordered the work suspended.
It is alleged that the climate group was secretive. However, comments on the DoE climate report and the responses would be published. McKitrick said the process “was far more transparent than either the IPCC or academic journals.” Some say the report is “attacking climate science.” McKitrick countered “We aimed to explain important topics and lines of evidence that have typically been downplayed in public discussions, in other words to broaden the scientific discussion, not attack it.” It was alleged that the report attacks the Environmental Protection Agency’s (EPA) endangerment finding (EF). Actually, the climate group was kept well away from the EF reconsideration process and did not know what the EPA was doing. Some claim that the Trump administration abandoned the DoE climate report. McKitrick wrote “No, the EPA neither accepted nor rejected it because they concluded they lacked statutory authority to do either. The rescission of the EF was based on recent court rulings that limit U.S. Agency powers to regulate in areas not specified in legislation. The EPA concluded they lacked regulatory authority over greenhouse gases, so neither can they issue findings on climate science.”
Fire Dynamics and Temperature Reconstruction in New Brunswick
Scientists extracted a core sample from a lake near Fredericton, New Brunswick, Canada to investigate the fire dynamics and temperatures over the last millennium. A high-resolution pollen analysis was performed on the core. The core bottom dated to AD 890. The temperature reconstruction clearly showed the Medieval Warm Period (MWP: AD 900-1400) and the Little Ice Age (LIA: 1400-1850). The MWP had an average spring temperature of 3.2 °C and the LIA had an average spring temperature of 2.2 °C. The temperature difference between the MWP high at ~950 and the LIA low at ~1800 was ~1.2 °C. Strangely, the core data didn’t show a significant temperature rise from 1850 to now. However, the HadCRUT5 data set shows a 2.2 °C temperature rise from 1850 to 2022. From 1979 to 2022, the satellite dataset UAH6 trend is only 53% of the land surface HadCRUT5 temperature trend, likely due to the urban heat island effect on the surface data. The authors warn “A major issue in doing climate reconstructions based upon modern pollen-climate calibration sets arises because human-caused landscape disturbance is so severe in much of the world that vegetation no longer reflects climate in the same way that it used to prior to this severe anthropogenic disturbance.”
Charcoal analysis showed that natural forest fires had a continuous presence over the past millennium, with higher frequency during the LIA than in the MWP. The core data suggest that the MWP was warmer and drier than the LIA with a high proportion of hardwoods. The authors suggest that the hardwood-heavy composition of the Acadian Forest during the MWP limited the conditions conducive to fire, as deciduous trees are generally less combustible than conifers on account of the greater moisture content of their leaves.
CO2 Caused the Boreal Forests to Grow 12% Since 1985
The boreal forest is the world's largest land-based biome, forming a massive, circumpolar ring across the Northern Hemisphere (Canada, Russia, Scandinavia) just south of the Arctic. This study used Landsat, Earth's longest-running record of global, high-resolution (30 m) satellite imagery, to study how the boreal forest has changed from 1985 to 2020. The paper says “This pan-boreal time series was then subjected to trend analysis to estimate and map the historical direction, rate, and significance of change across the region, and the resulting estimates of forest age were used to infer impacts on the region's carbon budget.”
The study reported that the boreal tree cover expanded from 7.15 million km2 in 1985 to 8.00 million km2 in 2020, a 12% increase. The mean latitude of tree cover increased by 0.29°. Net gains from 1985 to 2020 occurred at all latitudes above 53° N, with the strongest increases concentrated between 64 and 68° N. In addition to area growth, the dataset revealed widespread increases in tree-cover density. Younger forests are becoming more common, and because of how they grow, they are much better at quickly soaking up carbon dioxide (CO2). Young forests already contribute significantly to the region's carbon sink. Both climate warming and CO2 fertilization are expected to continue enhancing productivity. Recent evidence shows that species diversification is strongest near the tundra margin as temperate species colonize newly viable habitat. The enhanced number of species may improve ecological resilience.
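The 12% figure follows directly from the reported areas; the compound annual rate is a derived illustration, not a number from the study:

```python
area_1985, area_2020 = 7.15, 8.00   # boreal tree cover, million km^2

growth = (area_2020 - area_1985) / area_1985        # ~0.12, a 12% increase
annual = (area_2020 / area_1985) ** (1 / 35) - 1    # ~0.32% per year on average
```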
New Climate Model Sheds Light on Long-standing Pacific Puzzle
Climate models used in the last IPCC report did a horrible job of simulating the temperature trends in the eastern tropical Pacific Ocean and the Pacific sector of the Southern Ocean. Climate scientists have long been puzzled by the large discrepancy between observed cooling trends in these ocean regions and climate model simulations: the climate models show warming, but the measurements show cooling over the last 45 years. Researchers at the Max Planck Institute for Meteorology used a new climate model with unprecedented resolution of 5 km in the oceans and 10 km in the atmosphere. The high resolution allows the model to better simulate basic processes, and it has successfully reproduced the observed sea surface temperature patterns. The paper says “Mesoscale ocean eddies, a few tens of kilometers in size, are ubiquitous in the Southern Ocean and play a key role in poleward heat transport, but they are not represented by coarser-resolution CMIP models.” The “eddies move heat poleward across the Antarctic Circumpolar Current (ACC). … The simulation shows that, as the Southern Ocean is exposed to a warming atmosphere, poleward heat transport by eddies across the ACC weakens. At the same time, excess heat supplied by the atmosphere is promptly transported away by the ACC to other basins. Ultimately, this dynamical interplay cools the top 2,000 m of water in the Pacific sector of the Southern Ocean and causes the ACC to shift northwards, thus expanding the ocean area covered by polar waters. Cooling is communicated to the subtropical Pacific through its connection via both oceanic and atmospheric pathways from the Southern Ocean. This strengthens the already existing high-pressure anomaly off the South American coast. As a result, southeasterly trade winds blowing from there towards the equator intensify; they cool the sea surface through evaporation and create low stratocumulus clouds that reflect incoming solar radiation, contributing to further cooling.”
CliSci # 445 2026-02-04
Historical CO₂ Levels in Periods of Global Greening
Atmospheric increase in carbon dioxide (CO2) has caused a 30% increase in terrestrial Gross Primary Production (GPP) since 1900, otherwise called global greening. This paper raises an important question: Is an increased CO₂ level also a necessary condition for a large increase of GPP? The paper’s abstract says “This paper evaluates whether CO₂ levels during historical periods of similar or more greenness as today, are consistent with the widely held view that CO₂ levels remained below 300 ppm over the past 800,000 years, as indicated by Antarctic ice core records.” The research uses eight long-term GPP datasets to model the global GPP response to increasing CO₂ levels. The paper shows “a diminishing return of increasing vegetation growth associated with rising CO₂ levels, as additional factors such as nutrient and water availability impose constraints on the fertilization effect.” It also shows that the average residence time of CO₂ in the atmosphere increases with higher GPP values. The CO₂ residence time is equal to the total atmospheric CO₂ mass divided by the global down flux per year. The research leads to the conclusion that high CO₂ levels in the past, similar to today’s CO₂ levels, were necessary for comparable GPP during green periods like 10,000 years ago. Around 10,000 years ago, forest cover was 50% greater than today, and global terrestrial GPP was 4.4% larger than today. The paper says “A CO₂ concentration of 280 ppm would only be possible if nature’s response to CO₂ were fundamentally different from what we observe today, with other constraining factors exceptionally more favorable. Natural fluctuations of the atmospheric CO₂ concentration can be well explained, based on the strong temperature dependence of the degeneration of carbon compounds that are stored in large quantities in the soil and the oceans.”
The paper’s conclusion states “The global GPP is by far the most important component of the down flux to land and oceans. As the CO₂ concentration is proportional to the down flux and the residence time, a ‘green Earth’ is inextricably linked to high atmospheric CO₂ concentrations. The current level of Earth's greenness is not extraordinary, suggesting that the present atmospheric CO₂ concentration is also not exceptional.” It is “unlikely that historical CO₂ levels were as low as generally accepted. … This conclusion contradicts the assumed low CO₂ concentrations in the past 800,000 years, based on the ice core records from Antarctica. … Several studies have raised serious questions regarding the accuracy and reliability of ice core data, especially with respect to the dissolvement of CO₂ in melting water the many years before the air bubbles in the ice are fully closed.”
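The residence-time definition used in the paper is a simple stock-over-flow ratio. The sketch below uses round illustrative numbers, not the paper's values:

```python
# Residence time = total atmospheric CO2 mass / global annual down flux.
# Illustrative round numbers, in gigatonnes of carbon (GtC):
atmospheric_carbon = 900.0   # GtC atmospheric stock (assumed, order of magnitude)
down_flux = 200.0            # GtC/yr combined land + ocean uptake (assumed)

residence_time = atmospheric_carbon / down_flux   # 4.5 years
```

Because this ratio ties concentration, down flux and residence time together, the paper argues that a “green Earth” with high GPP implies a correspondingly high atmospheric CO₂ stock.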
A Limit of Wind and Solar Power in Belgian Electricity Grids
This article by Dr. Pierre Kunsch, translated from French, describes the problem of allowing too much intermittent renewable energy on the European electricity grids, and in Belgium in particular. The article states “The blackout of April 28, 2025, in Spain and Portugal demonstrated that there is a limit that must not be exceeded for their installation on transmission and distribution networks.” Wind and solar power are too irregular to be viable replacements for domestic thermal power plants. The article argues that intermittent sources should not exceed about 10% of total electrical energy consumption from the grid. It compares the Belgian electricity mix in 2021, before the nuclear phase-out began, with that of 2024, after the shutdown of two reactors. The nuclear reactors produced up to 6 GW of dispatchable power in 2021 and 4 GW in 2024. Wind and solar power were given priority access to the grid. The article shows graphs of electricity production by power source over 2021 and 2024. There are numerous gaps in the production histories, which indicate insufficient production from domestic sources. Load curve graphs show the hourly electrical energy data in descending order over the hours of the two years. The consumption at the top of each graph is compared with the total hourly production of the stacked dispatchable sources and the stacked non-dispatchable renewable energy sources.
Between 2021 and 2024, total consumption decreased slightly from 85 to 81 TWh/yr. Domestic dispatchable sources contributed 91% in 2021 and only 63% in 2024. Imports increased dramatically from 1% to 14% of total consumption, while exports fell from 11% to 2%. The peak power consumption is almost the sum of the maximum dispatchable capacities and imports. The contribution of non-dispatchable capacities to covering the power peak is very small in both years. In the absence of sufficient dispatchable capacity, Belgium must import dispatchable power from its neighbours. The article’s conclusion says “The significant intermittent capacity did not replace them; it was simply added on top of them without any substitution. Thus, capacity between 13 and 14 GW – originally entirely dispatchable before the introduction of intermittent sources – was almost doubled in 2024, without any increase in domestic production. … Domestic production, which remained above consumption until 2021, fell by 23% and is now below consumption, which itself fell by 'only' 4%. … The CO2 emission intensity has decreased only marginally by 3%.” Intermittent capacity is not replacing the lost dispatchable capacity; its significant growth has created a new risk: increased dependence on imports. All base load power consumption should be met solely by dispatchable sources. For peak demand, flexible energy sources are needed, primarily gas turbines without a steam cycle. The article further shows that intermittent sources should be limited to 9% of total annual power consumption.
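A load duration curve of the kind the article plots is simply hourly demand sorted in descending order. A minimal sketch with synthetic hourly data (illustrative, not Belgian figures):

```python
import numpy as np

rng = np.random.default_rng(2)
hours = 8760  # one year of hourly readings

# Synthetic hourly demand (MW): a daily cycle plus noise, illustrative only
demand = (9000 + 2000 * np.sin(np.arange(hours) / 24 * 2 * np.pi)
          + rng.normal(0, 300, hours))

# Load duration curve: hours ranked by demand, highest first
load_curve = np.sort(demand)[::-1]
peak_mw, min_mw = load_curve[0], load_curve[-1]
```

Stacking dispatchable and non-dispatchable production against this curve, as the article does, shows how much of the peak each category can actually cover.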
Dutch Climate Skeptics Vindicated: KNMI Reinstates Seven Pre-1950 Heatwaves
The Royal Netherlands Meteorological Institute (KNMI), the national weather and climate institute of The Netherlands, admitted publicly that a group of four skeptical scientists were right in their critique of KNMI’s temperature adjustments. Seven years after Dutch skeptics first challenged the adjustments, the institute has reinstated seven “lost” pre-1950 heatwaves, validating claims of over-correction that had erased 16 out of 23 historical temperature extremes. This article by Marcel Crok of CLINTEL explains that “in 2016, KNMI homogenized their daily temperatures for the period 1901-1950 because of a change in measurement method in 1950 and a displacement 300 meter towards open field in 1951.” The adjustment involved comparing the station at De Bilt with a station 150 km northeast of it. Crok wrote “The hottest days of the year in the period 1901-1950 were corrected downwards by up to 1.9°C. Because of this, 16 out of those 23 heatwaves vanished from the official records.” KNMI then started to claim that “heatwaves nowadays are much more frequent than in the past.” The four climate skeptics, including Crok, wrote a report about the adjustments showing that KNMI had substantially over-corrected. Two KNMI representatives told Crok that they wouldn’t respond to the extensive report because they didn’t trust him. In December 2021, the skeptics published a paper in the journal Theoretical and Applied Climatology. Crok wrote “In the paper, we presented irrefutable evidence that, as a result of the corrections, De Bilt had become an outlier compared to the four other principal stations we have in the Netherlands.” In response to a press release about the paper, a newspaper asked for a reaction from KNMI. They promised to look into it in 2022. In October 2025, KNMI asked the skeptics to review their new homogenization, which recognized the validity of their criticism.
Crok wrote that a Dutch climate journalist “interviewed both me and KNMI and his headline was simply amazing: KNMI ‘Discovers’ Seven Pre-1950 Heatwaves: A Win and Point of Principle for Climate Skeptics.”
New Manual on Scientific Evidence: Smart People Get Hoodwinked by the Climate Charlatans
The US Federal Judicial Center (FJC) published a document for the US Courts titled “Reference Manual on Scientific Evidence”. The purpose is to help judges who are not trained in science decide cases involving complex scientific evidence. Francis Menton wrote “In this latest version of the Reference Manual, the FJC has totally lost its way. Somehow, it got captured by a clique of climate charlatans who have inserted a lengthy section that is anti-science and based on logical fallacy. And many dozens of seemingly smart people who were supposedly reviewing this have gotten hoodwinked.” One sub-section of the manual, titled “Science Investigates Testable Hypotheses”, describes the scientific method. It says that a testable hypothesis must generate specific predictions, which need to be compared to a set of observations. If the predictions are inconsistent with the observations, then the hypothesis is false. “If an explanation is equally compatible with all possible observations, then it is not testable and hence, not within the reach of science.”
A sub-section of the manual is called “Climate Change Detection, Attribution, and Projections.” Activist climate scientists wrote this section to support litigation seeking to blame weather-related disasters on emitters of CO2. They don’t want claims about storm events to be tested against real observations. Instead, they wrote “[A]ttribution involves sifting through a range of possible causative factors to determine the role of one or more drivers with respect to the detected change. This is typically accomplished by using physical understanding, as well as climate models and/or statistical analysis, to compare how the variable responds when certain drivers are changed or eliminated entirely.” Menton wrote “How about articulating a testable hypothesis and testing it? They will never, ever, ever do that. It could prove the whole enterprise to be wrong!”
Menton wrote in a second post that, instead of using real-world evidence, “these studies claim to validate their attributions by reference to things like “physical understanding” and models that have not been empirically validated. In other words, rather than using empirical evidence to validate a hypothesis, they use one hypothesis supposedly to validate another hypothesis. They have assumed the conclusion they want to reach. This process is sometimes called circular reasoning.”
New Study Affirms Rising CO2’s Greening Impact Across India
This new study reports that India is the second largest contributor to global greening, and that the CO2 Fertilization Effect (CFE) has driven a “substantial expansion of global green cover over the last two decades.” This article says “The authors found the CFE has ‘nearly doubled’ the trend values in net primary production (NPP) across India relative to the trend values when the CFE is not considered.” Through photosynthesis, plants absorb about 30% of human-caused CO2 emissions, thereby mitigating climate change. Gross Primary Productivity (GPP), which quantifies the amount of CO2 absorbed during photosynthesis, is a key indicator of land carbon uptake. GPP represents the total energy captured by plants, while NPP is the remaining energy stored as biomass available to the rest of the ecosystem after deducting the energy that the plants themselves consume to live. The study utilizes Moderate Resolution Imaging Spectroradiometer (MODIS) satellite data to determine trends of NPP in India. The MODIS data indicate leaf area but do not capture the direct CFE. The study incorporates the direct effect of the CFE into MODIS vegetation productivity estimates and reassesses NPP trends across India from 2001 to 2024. The analysis shows that the NPP trend nearly doubled after accounting for the CFE. Southern peninsular India shows only modest NPP trends in response to the CFE, likely due to warming. In contrast, northwestern India, where temperatures have declined, shows a stronger increase in vegetation productivity.
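The GPP/NPP relationship described above can be put in a one-line formula: NPP = GPP − Ra, where Ra is autotrophic respiration. A minimal numerical sketch, with entirely made-up values for illustration only:

```python
# NPP (net primary production) is GPP (gross primary production)
# minus Ra, the carbon plants respire for their own maintenance.
# All values below are hypothetical, chosen only to show the relation.
gpp = 100.0  # gross primary productivity, gC/m^2/yr (made up)
ra = 55.0    # autotrophic respiration, gC/m^2/yr (made up)
npp = gpp - ra
print(npp)   # 45.0
```

The study's "nearly doubled" result then refers to the trend in NPP over 2001-2024 once the direct CFE is included, not to NPP itself doubling.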
CliSci # 444 2026-01-19
The Epic Climate Model Failure Continues
Climate models forecast that the strongest warming response of the climate system to increasing greenhouse gases is in the tropical upper troposphere, which is known as the “tropical hotspot”. Dr. John Christy has updated his graph of the tropical tropospheric temperature trends comparing models versus the measurements as shown on Roy Spencer’s blog. The graph shows the trends from 1979 to 2025 from 39 climate models and measurements from radiosondes (weather balloons), satellites and reanalysis. The radiosonde coverage of the tropics is very sparse. Only the satellites provide complete coverage of the tropics.

Roy Spencer wrote “Amazingly, all 39 climate models exhibit larger warming trends than all three classes of observational data.” The two Canadian climate models, CanESM5 and Can5-OE, produce trends of 0.505 and 0.453 °C/decade, respectively. The three observation types, sonde, reanalysis and satellite, give trends of 0.192, 0.185 and 0.155 °C/decade, respectively. The average trend of the 39 models is 0.317 °C/decade and the average trend of the measurements is 0.177 °C/decade. The Canadian model CanESM5 produces a trend that is 2.85 times the average observations. The climate model average trend is 1.78 times the average observations. Spencer explains why the climate models give excessive warming trends. He wrote “The excessive warming of the tropical troposphere is no doubt related to inadequacies in how the models handle convective overturning in the tropics, that is, organized thunderstorm activity that transports heat from the surface upward. That ‘deep moist convection’ redistributes not only heat energy, but clouds and water vapor, both of which have profound impacts on tropical tropospheric temperature.”
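As a quick arithmetic check, the ratios quoted above can be reproduced from the stated trend values (°C/decade, taken directly from the text; small differences come from rounding):

```python
# Trend values (°C/decade) as quoted in the text.
model_avg = 0.317                      # average of the 39 climate models
obs_avg = (0.192 + 0.185 + 0.155) / 3  # sonde, reanalysis, satellite
canesm5 = 0.505                        # Canadian model CanESM5

print(round(canesm5 / obs_avg, 2))     # ≈ 2.85
print(round(model_avg / obs_avg, 2))   # ≈ 1.79 (text rounds to 1.78)
```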
Spencer also presented a time series graph showing the 30-model average trend and the trends of the three observation types. The unusually warm year of 2024 stands out, but in 2025 the measurements returned to their trend lines.
Offshore Wind Turbines Steal Each Other’s Wind: Yields Greatly Overestimated
A new study using an analytical model of wind farms shows that national policy targets assume up to 50% more energy production than can be achieved. The model was validated with energy data from 72 wind farms. The study says “Such overestimation not only hides true energy costs but also underestimates power variability, integration, and curtailment risks, and it distorts policy pathways. When projections exceed physical limits by such margins, the resulting electricity shortfall can destabilize decarbonization strategies and reach deep into society and the economy.” The paper’s summary says “The model clarifies the critical design trade-offs between turbine height, specific power, and wind farm density. Our model provides a rigorous yet simple framework, readily usable by engineers, planners, and policymakers to forecast wind farm performance, support system planning, and to set realistic targets consistent with aerodynamic limits.” The paper’s conclusion says “This study establishes a physically grounded upper limit on wind farm performance, demonstrating that aerodynamic constraints impose a fundamental ceiling on the energy extractable from the marine atmospheric boundary layer.”
The study is discussed in this article by Bert Weteringe, an expert on wind turbines. “Wind turbines literally steal each other’s wind, which means that the efficiency of wind turbines will decrease even further as their number increases. In seven of the nine case studies, the national policy targets for offshore wind yields turned out to be way overestimated.” The resulting shortfall in electricity revenues “could have a profound impact on society and the economy.”
Save LBI Says Offshore Wind Projects Grossly Underestimate Harm to Marine Mammals
Save Long Beach Island, Inc. (Save LBI) is an organization dedicated to preserving the shore and ocean environment. It contends that the high levels of noise generated during the surveying, construction, and operation phases of an offshore wind project have a detrimental effect on sea mammals. This article says “Because hearing acuity plays a central role in the ability of a whale or dolphin to navigate its surroundings — especially during migration — elevated noise can cause serious harm, including impaired or permanent hearing loss, and behavioral disturbances that can also lead indirectly to harm and death.” For years, NOAA has ignored research presented by Save LBI. The organization has recently petitioned NOAA to overhaul the flawed methodology being used for calculating “Takes” — instances of fatality, serious harm, or behavioral disturbance to whales and other marine mammals. The petition cites major scientific and mathematical errors in NOAA-approved “Take estimation” methods and presents calculations that show significant harm and disturbance to whales and other mammals in the vicinity of offshore wind projects. Save LBI has also asked NOAA to require a monitoring program that will either verify the calculations in its Take estimation method or revise those methods. The program should provide data to verify the noise reduction assumed from using bubble curtains around the pile driving operations, as well as other noise-related parameters. Save LBI President Bob Stern said “We ask them to put forward a monitoring program in conjunction with that review that will either verify or disprove the concerns we have raised. If our concerns are verified, then NOAA should define and require the use of new mathematically and scientifically defensible methods for calculating marine mammal Takes.”
Climate Alarmism’s Credibility Sinks Under Weight of Ecological Evidence
This article by Vijay Jayaraj argues that climate alarmism and the declared climate emergency, built on flawed computer models, are collapsing under the weight of good environmental news. Jayaraj wrote “For decades, activists have anchored their case in dramatic warnings about species extinction, melting ice caps and the end of polar life, failing ecosystems and vanishing biodiversity. The goal was to spread fear.” Meanwhile, the world’s largest nations have actually expanded their forest area significantly. Between 2015 and 2025, China added approximately 4 million acres of forest, Russia gained more than 2 million acres, and India gained nearly half a million acres. Polar bear populations have not declined over the last 15 years despite dire predictions. Between 2014 and 2022, the number of India’s tigers grew from 2,226 animals to 3,682, a 65% increase over eight years, with an annual growth rate exceeding 6%. Species extinction rates have not accelerated; instead, they peaked over a century ago and have been in decline since the early 1900s. The great die-off turned out to be a phantom. Past extinctions were largely driven by invasive species on isolated islands, not by habitat loss or the alleged ‘climate crisis’. Famine failed to materialize as farmers across the globe brought in record harvests. Crop yields have increased substantially, enabling farms to feed more people while using less land. The news media are silent about these victories. These stories are buried because they don’t sell fear.
CliSci # 443 2026-01-03
2025: Lowest Global Death Rates from Extreme Weather in History
“Globally, 2025 has had one of the lowest annual death rates from disasters associated with extreme weather events in recorded history.” This good news was reported by Roger Pielke Jr. using data from the Centre for Research on the Epidemiology of Disasters (CRED) in Belgium. The CRED data indicated about 4,500 global deaths related to extreme weather events through October 2025. Reports estimate that about 1,600 people lost their lives in the final two months of 2025, mostly due to flooding in south Asia associated with Cyclones Senyar and Ditwah. If those estimates prove accurate, total 2025 weather-related deaths would be about 6,100. This number is similar to some recent years, but the death rate from extreme weather events is the lowest ever at less than 0.8 deaths per million people (with population data from the United Nations).
Pielke wrote, “To put the death rate into perspective, consider that:
- in 1960 it was >320 per million;
- in 1970, >80 per million;
- in 1980, ~3 per million;
- in 1990, ~1.3 per million;
Since 2000, six years have occurred with <1.0 deaths per million people, all since 2014. From 1970 to 2025 the death rate dropped by two orders of magnitude. This is an incredible story of human ingenuity and progress.” Extreme temperature event impacts (cold and hot) are not included here.
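A quick sanity check of the 2025 rate, assuming a world population of roughly 8.2 billion (an assumed round figure; the newsletter says population data come from the United Nations):

```python
# Estimated 2025 weather-related deaths as given in the text.
deaths_2025 = 4_500 + 1_600        # through October, plus Nov-Dec estimate
world_population = 8.2e9           # assumed; source cites UN data

rate_per_million = deaths_2025 / world_population * 1e6
print(round(rate_per_million, 2))  # ≈ 0.74, under the quoted 0.8 threshold
```

That is roughly one hundredth of the >80 per million rate quoted for 1970, consistent with the "two orders of magnitude" drop Pielke describes.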
The declining death rate from weather events is due to reduced vulnerability and improved preparation for extreme events. Pielke wrote “Underlying this trend lies the successful application of science, technology, and policy in a world that has grown much wealthier and thus far better equipped to protect people when, inevitably, extreme events do occur.”
The Record Hot UK Summer of 2025: Validation of the UKMO Methodology, but the Record Was Only in Tmin
The UK Met Office (UKMO) reported that 2025 was the UK’s warmest year on record with an average temperature of 10.09 °C. However, the UKMO’s methodology of averaging the temperature record from surface temperature stations has been criticized. Many temperature stations do not meet the World Meteorological Organization criteria for a good climate monitoring station and many UK stations have closed in recent years, and so those stations’ temperatures are estimated from surrounding stations. Temperature monitoring stations come and go over time, and this can introduce biases that change over time and corrupt long-term estimates of temperature change. Dr. Roy Spencer wrote “How one accounts for, and adjusts for, these changes is not a settled matter.” The UKMO evaluates how station temperatures vary with latitude, longitude, elevation, proximity to the coast and land use to remove relative biases between stations. These adjustments are applied to stations surrounding closed stations which are then averaged to estimate temperatures at closed stations. This technique is intended to avoid the biases caused by closing stations.
Spencer tested the UKMO methodology by comparing its results to his “relative bias removal method”. He starts with the 3 UK stations having continuous records over all 126 years from 1900 to 2025 and averages them. Spencer explains “Then, I take the station(s) with the next-longest record (Oxford, 124 years), compute the average difference with the original series, and add it to the series to make a new 4-station average. This is done sequentially for all (148) stations in the UK since 1900 that have at least 2 years of record, going down the list from the longest periods of record to the shortest.” This method produces yearly summer-average temperatures that are nearly identical to those of the UKMO methodology. Spencer concluded “In both my and the UKMO analysis, 1976 (not 2025) was the hottest summer in daily high temperatures (Tmax), with 2025 taking 3rd or 4th place; the “record” hot year of 2025 was due to nightly low temperatures (Tmin) being anomalously warm. The average of the three hottest daytime temperatures in each summer month put the summer of 2025 in 4th place since 1960, behind 1976, 1995, and 2022 (which were essentially identical).” Neither Spencer’s nor the UKMO method accounts for an increasing urban heat island effect or for changes in instrumentation types that could cause spurious warming in the record. Increases in nightly low temperatures are beneficial, as the number of cold-related deaths in the UK is around 50 times that of heat-related deaths.
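Spencer's sequential merging procedure, as described, can be sketched in a few lines. This is an illustrative reconstruction from the description above, not Spencer's actual code; the station data structure and toy values are hypothetical:

```python
# Illustrative sketch of a sequential "relative bias removal" merge:
# stations are added longest-record first; each new station is shifted
# by its mean offset from the running composite over their overlapping
# years before being averaged in.

def merge_stations(stations):
    """stations: list of {year: temperature} dicts, ordered from the
    longest record to the shortest. Returns a composite {year: temp}."""
    sums, counts = {}, {}  # running totals of adjusted station values

    def composite():
        return {y: sums[y] / counts[y] for y in sums}

    for station in stations:
        comp = composite()
        overlap = [y for y in station if y in comp]
        if overlap:
            # Mean offset of this station relative to the composite.
            bias = sum(station[y] - comp[y] for y in overlap) / len(overlap)
        else:
            bias = 0.0  # first station(s): nothing to compare against
        for year, temp in station.items():
            sums[year] = sums.get(year, 0.0) + (temp - bias)
            counts[year] = counts.get(year, 0) + 1
    return composite()

# Toy example: the second station reads 1.0 °C too warm; after bias
# removal, including it leaves the composite unchanged.
long_station = {1900: 10.0, 1901: 11.0, 1902: 12.0}
short_station = {1901: 12.0, 1902: 13.0}  # +1.0 °C relative bias
print(merge_stations([long_station, short_station]))
```

Merging longest records first means the best-constrained stations anchor the composite, so a short record with a constant offset cannot shift the long-term trend.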
Hunga Tonga Volcano Likely Caused the 2023 Exceptional Warming
An eruption began on the submarine volcano Hunga Tonga–Hunga Haʻapai on 20 December 2021 and reached a very large climactic explosion on 15 January 2022. The eruption likely caused the exceptional climatic warming event of 2023, according to an article by Javier Vinós. The author wrote “It is clearly a naturally occurring, externally forced climate event. However, mainstream climate scientists are not treating it appropriately.” The 2023 warming resulted in 2023 and 2024 being the warmest years in the global surface temperature record. The biggest global low cloud cover anomaly ever recorded occurred in 2023. The climate models can’t explain the extraordinary 2023 ocean warming. The caption to this plot says “The 2023 climate event can be seen most clearly in the global sea surface temperature anomaly (NOAA, 60°N–60°S, baseline 2021). It began in December 2022.” By November 2025, 90% of the warming from the 2023 event had disappeared. A 2024 study found only a 0.2% probability that the 2023 warming could be due to unforced internal variability combined with the forced greenhouse gas-induced trend. Vinós explains that the El Niño in 2023 cannot be held responsible for the warming, as it was too weak. The anthropogenic forcing by greenhouse gases and urban warming is small and nearly constant, and can only produce noticeable changes over long periods of time. Maritime fuel regulations that came into force in 2020 caused an abrupt reduction in sulfur emissions, but this couldn’t be a major cause of the 2023 event because its warming effect is permanent, whereas the ocean warming was reversed in 2024 and 2025.
The 150 megatonnes of water vapour that the Hunga Tonga underwater volcano injected into the stratosphere are unprecedented. Vinós wrote “Climate models do not adequately reproduce the effects of the 1815 Tambora eruption, suggesting that dynamic atmospheric changes caused by stratospheric eruptions or other factors have a much greater impact on climate than previously thought.” Therefore, the climate models can’t directly be used to explain the effects of the Hunga Tonga eruption. The 2023 warming was directly caused by the drastic decrease in global cloud cover. However, scientists still do not know what controls changes in cloud cover. Wind speed has a greater impact on evaporation than temperature or humidity. A change in wind caused by the Hunga Tonga eruption might cause changes in evaporation, cloud formation and cloud distribution. If the Hunga Tonga eruption caused the 2023 warming, we should observe most of this warming disappearing within 3-5 years. This projection does not arise from any of the other considered causes. By December 2025, four years after the eruption, this prediction had come true: 90% of the ocean warming from the 2023 climate event had disappeared. The Hunga Tonga eruption is currently the best explanation for the 2023 climate event.
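One way to read the "90% gone in four years" observation is as a simple exponential relaxation back to baseline. The exponential form is an assumption for illustration, not something the article states; under it, the implied decay timescale is:

```python
import math

# Hypothetical exponential-decay reading of the observation that ~90%
# of the 2023 ocean-warming anomaly was gone ~4 years after the eruption.
# If A(t) = A0 * exp(-t / tau) and A(4)/A0 = 0.10, then tau = 4 / ln(10).
tau = 4 / math.log(10)         # e-folding time, ≈ 1.74 years
half_life = tau * math.log(2)  # ≈ 1.20 years
print(round(tau, 2), round(half_life, 2))
```

An e-folding time under two years would be consistent with the projection that most of the warming should disappear within 3-5 years.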
Climate Models Underestimate the Cooling Effects of Southern Storms
A new study in Nature Geoscience shows that current climate models “underestimate the strength of Southern Ocean storms and thereby simulate an overly warm ocean.” A news release about the study says “storms play a key role in controlling how the Southern Ocean exchanges heat with the atmosphere. The team finds that intense winds churn the ocean, drawing colder deep water upward and pushing warmer surface water downward. The surface stays cooler and can take up more heat from the atmosphere.” The abstract says “The Southern Ocean absorbs most of the excess heat resulting from climate change. However, climate projections show a persistent warm summer bias in its sea surface temperatures, indicating a limited understanding of the air–sea heat exchange mechanisms governing this region. … Our results demonstrate a causal link between storm forcing and sea surface temperature variability, which is critical for reducing warming biases in climate models and improving future climate projections.” The research team has been studying storm patterns around Antarctica over the last few decades and can now link changes in storm intensity and windiness to changes in our climate and atmospheric circulation. It is in the Antarctic summer that storms have their strongest impact on ocean heat uptake.
No Hurricanes Strike USA For 1st Time in a Decade
For the first time in a decade, not a single hurricane struck the U.S. this season, and that was a much-needed break. This is according to NOAA Administrator Neil Jacobs, as reported by the RigZone website. The official hurricane season for the Atlantic basin is from June 1 to November 30. “A tropical storm caused damage and casualties in the Carolinas, distant hurricanes created rough ocean waters that caused property damage along the East Coast, and neighboring countries experienced direct hits from hurricanes,” Jacobs said in the statement. The NOAA statement noted that the Atlantic basin produced 13 named storms. Of these, five became hurricanes, including four major hurricanes, NOAA highlighted, pointing out that an average season has 14 named storms, seven hurricanes, and three major hurricanes. Hurricane season activity was near-normal for both the Eastern Pacific basin and the Central Pacific basin and fell within predicted ranges. Linnea Lueken of Climate Realism noted that neither RigZone nor NOAA credits climate change or tries to attribute hurricanes to climate change. Lueken wrote about the RigZone post “The whole post is factual and straightforward. Climate change is not making hurricanes worse; it is simply not evidenced in the data.” This is good reporting for a change. Lueken wrote “Earlier in the season, Climate Realism addressed some of the false claims regarding Hurricanes Melissa and Erin, particularly when media claimed that ‘rapid intensification’ was due to climate change. That was false; no real-world data backed the assertion, only misleading and corrupt attribution models that over-emphasize water temperature and downplay other factors that influence hurricane strength and formation.”