By: Ken Gregory, BASc.
CliSci # 432 2025-06-12
Roger Pielke Jr: Is there Hope for Scientific Integrity?
Roger Pielke Jr, the well-known political and climate scientist from Boulder, Colorado, gave a very interesting talk on 23 May 2025 for the Irish Climate Science Forum titled Climate Scenarios, Weather Attribution – Is There Hope for Scientific Integrity? See YouTube. Pielke discussed the challenges facing scientific integrity within climate science, exploring the misuse of data, flawed methodologies, and the politicization of climate research. Scientific integrity is foundational to the scientific process. Pielke structured his talk around three case studies: the misuse of the ICAT hurricane damage dataset, the problematic NOAA billion-dollar disasters dataset, and the implausible climate scenarios used by the Intergovernmental Panel on Climate Change (IPCC). Each case illustrates a failure of self-correction and raises questions about how scientific research influences policy.
The first case study concerns the ICAT hurricane damage dataset, which originated from Chris Landsea and Pielke’s work in the 1990s. Their research showed that societal factors, such as population growth and wealth accumulation in coastal areas, drove the increase in damages, not climate trends. Later, that dataset was altered without proper documentation: the altered dataset included inflated loss estimates from NOAA’s billion-dollar disasters dataset, which mixed inconsistent methodologies. Pielke’s 2023 paper in a Nature journal applied NOAA’s own scientific integrity standards and found the dataset lacking in reproducibility and consistency. The flawed ICAT and NOAA datasets were used in studies cited in the IPCC’s AR6 and the U.S. fifth national climate assessment that attributed disaster losses to climate change, ignoring a broad literature of over 60 normalization studies that found no such attribution. Pielke’s efforts to correct this misuse through peer-reviewed critiques and calls for retraction were met with resistance from journal editors, highlighting a failure of self-correction.
The final case study is the IPCC’s continued use of extreme climate scenarios, particularly RCP 8.5 and its successor, SSP5-8.5. RCP 8.5, labeled as a “business-as-usual” scenario, assumes a massive increase in coal use with little natural gas consumption and a global population of 13 billion by 2100—projections Pielke deems unrealistic based on current energy trends and demographic forecasts. Real-world emissions and energy outlooks align with more moderate scenarios (2–3 °C warming from 1850 to 2100), not the extreme RCP 8.5 (4–5 °C warming) used in thousands of studies.
Are Climate Model Forecasts Useful for Policy Making?
A paper presents comparisons of temperature forecasts from models relied on by the U.N.’s IPCC to forecasts from models that exclude human influence and instead include volcanic activity and one of two independent measures of solar variability. The models were used to forecast Northern Hemisphere temperatures, both for all land and for rural land only. The authors, Dr. Kesten Green and Dr. Willie Soon, say that a statistical fit of a model’s temperature forecast to global temperatures is insufficient to conclude that the model is useful for making policy decisions. All models are tuned by adjusting parameters to make them roughly agree with the historical temperatures. A model must also forecast temperatures for a period after that used to tune it, more accurately and reliably than other plausible alternative models. Such forecasts are called “out-of-sample” forecasts.
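The distinction can be illustrated with a minimal sketch, using synthetic data rather than the authors’ models or observations: tune a model on an early period, then score it only on later, held-out years.

```python
# Minimal sketch of out-of-sample evaluation: tune on early data,
# score only on years the fit never saw. Data and model are synthetic
# illustrations, not those of Green and Soon.
import numpy as np

years = np.arange(1900, 2021)
temps = 0.01 * (years - 1900) + np.sin(years / 11.0)  # illustrative series

train = years < 1980                              # tuning (in-sample) period
coef = np.polyfit(years[train], temps[train], 1)  # fit a linear trend
forecast = np.polyval(coef, years[~train])        # out-of-sample forecast
mae = np.mean(np.abs(forecast - temps[~train]))   # out-of-sample error
print(f"out-of-sample MAE: {mae:.2f} °C")
```

A model can fit its tuning period closely and still show a large out-of-sample error, which is the criterion Green and Soon apply for judging policy usefulness.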
The study’s news release said that the authors “found models that included the IPCC’s anthropogenic (human causation) variable failed badly in temperature forecasting comparisons with models that included independent measures of variation in the Sun’s radiation. … The results were striking: Models using the IPCC anthropogenic and solar variables produced forecast errors as large as 4°C in forecasting Northern Hemisphere land temperatures that had not been used in estimating the models, and as large as 20°C in forecasting rural temperatures. The independent solar variable models’ errors were mostly much less than 1°C in forecasting the all-land temperatures, and almost always much less than 1°C in forecasting the rural temperatures.”
Dr. Green emphasized the policy implications: “Our findings suggest that IPCC modelling fails to support the hypothesis that human carbon dioxide emissions have a meaningful impact on global temperatures. Uncomfortable as it may be for policy makers, unpredictable and uncontrollable variations in radiation from the Sun and volcanic eruptions will continue to determine changes in the Earth’s climate. Policies that deny that reality cannot avoid imposing great costs on the many, to the benefit of very few”.
Direct Air Carbon Capture Comes Crashing Down: A Comedy in Subsidies
Charles Rotter wrote an amusing article about the collapse of the direct air CO2 capture (DACC) industry. Rotter wrote “The New York Times writer David Gelles practically had to mop his keyboard with tears over the news that the carbon removal market—once projected to be a trillion-dollar juggernaut—is now wheezing toward irrelevance. … 24 government grants worth $3.7 billion have been scrapped, Climeworks axed 22% of its staff, and permit applications have tanked. The ‘market’ is imploding because—brace yourself—no one wants to fund something that doesn’t work.” The media whines that it is all the fault of President Trump, who is saving US taxpayers billions of dollars of useless spending. Climeworks has made international news for capturing CO2 directly from the atmosphere. The Icelandic newspaper Heimindlan reported “When Climeworks, a company based in Switzerland, opened its first capture plant in Iceland in September 2021, company officials said it could capture 4,000 tons of CO2 each year it operated. The Climeworks plant’s own emissions were 1,700 tonnes of CO2 in 2023. In 2023 and 2024 the plant captured 921 and 876 tonnes of CO2, respectively, which is a small fraction of its claimed capacity, and far less than its own emissions.”
Rotter said “Let’s be crystal clear: carbon capture is a thermodynamic clown show. It takes more energy to remove CO2 from the air than was released burning the fossil fuel in the first place. And even then, you still have to compress it, transport it, inject it underground, and pray it doesn’t leak back out.”
DACC requires about 5.5 MWh of electricity per tonne of CO2 captured. To capture all global energy-related CO2 emissions (37.8 GtCO2 in 2024) would require 208,000 TWh per year. This is 6.7 times the global electricity generation of 31,150 TWh in 2024.
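These figures can be checked with a few lines of arithmetic; the inputs are the ones quoted in the text above.

```python
# Back-of-envelope check of the DACC energy figures quoted above.
# Inputs from the text: 5.5 MWh per tonne CO2 captured, 37.8 GtCO2 of
# 2024 energy-related emissions, 31,150 TWh of 2024 global generation.
ENERGY_PER_TONNE_MWH = 5.5
EMISSIONS_GT = 37.8
GLOBAL_GENERATION_TWH = 31_150

tonnes = EMISSIONS_GT * 1e9                          # Gt -> tonnes
required_twh = tonnes * ENERGY_PER_TONNE_MWH / 1e6   # MWh -> TWh
multiple = required_twh / GLOBAL_GENERATION_TWH

print(f"required electricity: {required_twh:,.0f} TWh/yr")   # ~207,900 TWh
print(f"multiple of 2024 generation: {multiple:.1f}x")       # ~6.7x
```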
Rotter wrote “Let’s be honest: the entire carbon capture craze was never about saving the planet. It was about:
- Making rich people feel less guilty about flying private.
- Giving bureaucrats a talking point.
- Creating a new market for companies like Microsoft to “offset” emissions without changing a single behavior.”
Unnatural 60-Second Heat Spikes Drive Many UK Met Office Temperature ‘Records’
The UK Met Office has recently introduced sensitive electronic thermometers to replace conventional ones. This article published by the DailySceptic shows that the new thermometers pick up 60-second unnatural heat spikes which are used to promote clickbait ‘records’ and claim exaggerated atmospheric warming. Chris Morrison wrote “Furthermore, it appears that these short-term ‘spikes’ are larger in junk sites with massive internationally recognized ‘uncertainties’. Almost eight in 10 of the Met Office’s nationwide temperature measuring stations are in junk CIMO Classes 4 and 5 with possible errors up to 2°C and 5°C respectively.” The article gives several examples of the UK Met Office using unnatural temperature spikes to promote climate alarmism. On May 1, the Met Office claimed a reading of 29.3 °C was a record high for the day, but the temperature was 0.76 °C lower only a minute later. On July 19, 2022 a national record of 40.3 °C was set at the RAF Coningsby site while three Typhoon jets were landing. Morrison wrote “Temperature recordings can move around from minute to minute; change, if it occurs, is generally around 0.1°C to 0.3°C. These changes do not affect old-style mercury thermometers but are picked up by the super-sensitive electronic devices used by the Met Office since the 1990s. It is for this reason that the World Meteorological Organisation (WMO) recommends averaging readings over five minutes to standardise data and remove short-term ‘noise’. For some inexplicable reason, despite playing a significant role in WMO deliberations, the UK Met Office does not appear to want to follow this sensible scientific advice.”
Checking the Government of Canada’s Climate Misinformation
Pav Penna, a researcher and BDC consultant, published an article criticizing climate change misinformation in the Federal Budget of April 7, 2022, which contained the phrase: “Canada is already experiencing an increase in heat waves, wildfires, and heavy storms.” Penna wrote “All three claims were demonstrably factually incorrect.” Penna’s Access to Information Act request for the scientific basis of the phrase was denied, the reason given being “Cabinet Confidences”! Penna previously published a letter to the Office of the Information Commissioner with the subject “Improper use of ‘Cabinet confidences’ to reject a valid complaint.” The request was forwarded to the Department of Finance (FIN). FIN failed to provide evidence to support its claim. Instead, it sent a link to a speculative report about future costs to Canada’s infrastructure. FIN misinterpreted the “Canada’s Changing Climate Report”, which in fact supports Penna’s complaint about heat waves and severe storms, and ignored data from the Canadian National Fire Database showing a decline in the frequency of wildfires and the total area burned.
Penna shows that the Canada Weather Stats website provides data for dozens of climate characteristics for hundreds of Canadian locations in an easy-to-interpret graphical format. [Also Your Environment] He presents a graph of Ottawa’s annual maximum temperatures from 1890 to 2024. It shows a cluster of high readings in the early 1900s, declining to about 1980 and then changing little. The overall trend is a decline of 0.94 °C/century. The trend in the number of annual days over 30 °C is a decline of 2.25 days/century [graph]. Ottawa’s mean temperatures have increased at 1.23 °C/century, but this is due to increasing low temperatures, which is a characteristic of urban warming. The increasing low temperatures are entirely beneficial. Penna wrote “The reality of moderating temperatures does not support an alarmist narrative or carbon taxes as a solution.”
The ‘Canada’s Changing Climate Report’ previously mentioned says “There do not appear to be detectable trends in short-duration extreme precipitation in Canada for the country as a whole based on available station data.” Penna wrote “When the Budget was published in 2022, wildfires had been declining since 1980 in both frequency and extent. Since then, the extremely high fire season in 2023 did reverse the area burned trend. Note that a historical low was set just three years earlier in 2020.”
Penna goes on to calculate the impact of Canada’s emissions on temperature using the IPCC estimate from AR6 of 0.45 °C per 1000 GtCO2. He finds “The objective of Canada’s original carbon tax was to reduce emissions by 90 Mt. If achieved, this was equivalent to postponing global warming by 1°C in 24,691 years!”
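The postponement arithmetic above can be reproduced directly, using only the two numbers quoted in the text.

```python
# Reproducing the postponement arithmetic, assuming the AR6 transient
# sensitivity of 0.45 °C per 1000 GtCO2 quoted above.
TCRE_C_PER_1000_GT = 0.45
REDUCTION_MT_PER_YEAR = 90

avoided_c_per_year = (REDUCTION_MT_PER_YEAR / 1e3) / 1000 * TCRE_C_PER_1000_GT
years_to_postpone_1C = 1 / avoided_c_per_year
print(f"{years_to_postpone_1C:,.0f} years")  # ~24,691 years
```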
CliSci # 431 2025-05-28
Consistent Climate Fingerprinting
Dr. Ross McKitrick has a new paper about climate fingerprinting that has been accepted for publication by the journal ‘Climate Dynamics’. Climate fingerprinting refers to statistical techniques used to determine the probability that observed climate characteristics, like changes in storms, temperature and precipitation, can be attributed to human activities, like greenhouse gas (GHG) emissions and urbanization, or to natural causes, like changes in solar and volcanic activity. It seeks to identify the unique patterns of climate change associated with different environmental factors. McKitrick wrote on his website “When I say this method is consistent, I am using the statistical concept of consistency. It is my assertion, based on several previous publications, that the IPCC method (based on Allen and Tett 1999 and Allen and Stott 2003) yields biased and inconsistent signal detection coefficients and is uninformative for the purpose of GHG attribution.” The abstract of the paper says “I propose an alternative fingerprinting method based on the Instrumental Variables procedure with consistent standard errors and I demonstrate its potential in an application to 20th century temperature data. I find the modeled anthropogenic signal is detected but needs to be scaled down by 35 to 60%, whereas the modeled natural signal needs to be scaled up 2- to 4-fold, to reconcile optimally with observations.”
Dual Impact of Global Urban Overheating on Mortality
A new paper determines the beneficial and detrimental temperature-related mortality impacts of the urban heat island (UHI) effect in more than 3,000 cities worldwide. It also assesses the urban-associated cooling strategies, including planting green plants and installing reflective infrastructure. The paper’s abstract says “This study finds that the UHI effect reduces global cold-related mortality, surpassing the increase in heat-related mortality more than fourfold. Widely implemented urban cooling strategies, including green and reflective infrastructure, can have an adverse net effect in high-latitude cities but benefit a few tropical cities. We propose seasonal adjustments to roof albedo as an actionable strategy to reduce heat- and cold-related mortality.”
There was no hint in the abstract of any economic analysis on the practicality of the proposed seasonal adjustment to roof albedo.
The figure below shows the dual impacts of the UHI effect and heat mitigation strategies on cold- and heat-related mortality relative risk for a hypothetical city.
The annual net impact represents the sum of the beneficial and detrimental impacts. a, Curves depicting the relative risk (RR) variation with temperature percentiles in two scenarios: without and with the UHI effect. b, Curves depicting the RR changes with temperature percentiles in two scenarios: without (that is, the curve with the UHI effect in a) and with urban cooling strategies.
Should Carbon Dioxide Emissions Be Reduced or Allowed to Rise Steadily?
This new paper, written by FoS advisor Madhav Khandekar and Ray Garnett, discusses the link between climate warming and CO2 levels, focusing on several uncertainties. In pre-industrial times, warming of the earth preceded increases in CO2 concentrations; geological records reveal a definite lag of several hundred years between warming of the climate and the rise in CO2 levels. The authors argue that the beneficial impacts of higher levels of CO2 on world agriculture and forestry outweigh possible harmful impacts. In the last ten years, there has been increasing emphasis on switching to green energy, mostly wind and solar, since conventional fossil fuel energy is identified as the major producer of the CO2 said to cause climate warming. It is argued that a goal of Net Zero emissions by 2050 would help keep the rise in global mean temperatures below the 1.5 °C stipulated in the Paris 2015 climate treaty. The net-zero agenda could cost world economies a whopping 240 trillion US dollars.
There is now a sizable body of literature showing rising CO2 levels bring a substantial increase to world agricultural production with benefits up to 750 ppm. The current CO2 level (average 2024) is 425 ppm. Global cereal yield (wheat, rice and corn) has increased by over 200% between 1961 and 2023. Many other studies show that increased CO₂ helps plants resist drought, intense heat, pollution and other environmental stresses. Satellite data show that increasing levels of CO2 are making world forestry greener and richer. In particular, the boreal forests of high-latitude regions of Canada and in Alaska show spectacular growth and have overcome deleterious impacts of forest fires in spring and summer. The authors conclude that a steady increase in worldwide CO₂ emissions would have enormous benefits to world humanity.
Oceans are Heating Faster in Two Bands Stretching around Globe
A new paper shows that the world’s oceans are heating fastest in two latitude bands, one at 40 to 45 °S and the other around 40 °N. Within the northern band, the waters east of the United States and east of Japan are warming the fastest. The lead author, Kevin Trenberth, said that the heat bands have developed since 2005 in tandem with poleward shifts in the jet stream, the powerful winds above Earth’s surface that blow from west to east. There was an unusual absence of warming in the subtropics near 20° latitude in both hemispheres; the warming is non-uniform due to natural variability. The ocean heat data was processed in 1° latitude strips of ocean to a depth of 2000 m for the period from 2000 to 2023 to produce the following world map.
Greenhouse Gases and Fossil Fuels Climate Science
A new paper by Richard Lindzen and William Happer argues that increasing the CO2 concentration from 420 ppm to 840 ppm would increase the amount of food available worldwide by roughly 40% while having little effect on temperatures. The authors are career physicists with special expertise in radiation physics, which describes how CO2 and other greenhouse gases (GHG) affect heat flow in Earth’s atmosphere. They state that in their scientific opinion the Net Zero theory, and all the Net Zero rules and congressional subsidies, are scientifically wrong. GHGs will not cause catastrophic global warming and more extreme weather. Instead, there will be disastrous consequences for the poor, people worldwide, future generations, Americans, and other countries if CO2 and other GHGs are reduced to Net Zero and fossil fuels are eliminated. Net Zero policies will endanger public health and welfare. The summary says “In summary, the blunt scientific reality requires urgent action because we are confronted with policies that destroy western economies, impoverish the working middle class, condemn billions of the world's poorest to continued poverty and increased starvation, leave our children despairing over the alleged absence of a future, and will enrich the enemies of the West who are enjoying the spectacle of our suicide march.”
The Rise of Global Sea Ice
Dr. Roger Pielke Jr. discussed the rise of global sea ice as documented in two new papers, a preprint about the pause of Arctic sea ice decline and a peer-reviewed paper about the recent rise of Antarctic sea ice. Pielke wrote “At the South Pole, Wang et al. 2025 find a record accumulation of ice on the Antarctic ice sheet over the period 2021 to 2023, following a steady decrease from 2002 to 2021. The data comes from NASA’s GRACE series of satellites, which have the ability to precisely measure ice mass. The figure below shows that the recent accumulation is small in the context of the multi-decadal decline, but is still characterized by the paper’s authors as a ‘significant reversal.’ The paper makes no predictions of whether or how long the accumulation might continue.” Writing about the far north Pielke continues “a new preprint by England et al. identifies a ‘surprising, but not unexpected multi-decadal pause in Arctic sea ice loss.’”
Pielke wrote for the New York Post “Two new studies show that the Earth’s climate is far more complex than often acknowledged, reminding us of the importance of pragmatic energy and climate policies. … Antarctic ice has made a turnaround, scientists say, with an increase in ice mass after years of depletion.” On the Arctic preprint, Pielke quoted its authors: “The loss of Arctic sea ice cover has undergone a pronounced slowdown over the past two decades, across all months of the year.” They suggest that the ‘pause’ in Arctic sea ice decline could persist for several more decades. The Antarctic ice sheet mass gain of 108 gigatonnes per year between 2021 and 2023 corresponds to a global sea level decline of about 0.8 mm over the three years.
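The sea-level equivalence can be checked with the standard conversion that roughly 361.8 Gt of water spread over the ocean surface corresponds to 1 mm of sea level; the conversion factor is a common assumption, not a figure from these articles.

```python
# Sanity check: ~361.8 Gt of water over the ocean surface (~3.618e8 km^2,
# water density ~1000 kg/m^3) corresponds to about 1 mm of sea level.
GT_PER_MM = 361.8              # assumed conversion factor
ACCUMULATION_GT_PER_YR = 108   # Antarctic ice sheet gain quoted above

rate_mm_per_yr = ACCUMULATION_GT_PER_YR / GT_PER_MM
print(f"sea-level equivalent: {rate_mm_per_yr:.2f} mm/yr")  # ~0.30 mm/yr
```

Over the 2021 to 2023 window this per-year rate is consistent with the roughly 0.8 mm total quoted above.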
Correction to CliSci # 430 of 2025-05-14; The European Blackout
An astute reader noticed an error in the article “The European Blackout was caused by Unstable Solar Power” of the previous CliSci #430. I wrote “For context Alberta’s average 2024 electricity load was 242 GW.” According to AESO's 2024 Annual Market Statistics Report, Alberta's average electricity load was 10.1 GW, not 242 GW.
CliSci # 430 2025-05-14
Urban Heat Island Effects in U.S. Summer Surface Temperature Data
Dr. Roy Spencer announced on his blog on October 19, 2023 that he and John Christy had finally submitted a paper to the Journal of Applied Meteorology and Climatology entitled “Urban Heat Island Effects in U.S. Summer Surface Temperature Data, 1880-2015”. The paper was published April 29, 2025, after more than 1.5 years of review. The paper describes a new method of estimating a major part of the urban heat island effect (UHIE). The method quantifies the sensitivity of the land station raw temperature data in the contiguous 48 states (ConUS) to population density, using daily average temperature differences of closely spaced station pairs during the summer. The results are grouped into four categories of station population density. The UHIE portion of the measured warming trends ranges from 8% for the rural to 65% for the urban categories. The ConUS UHIE warming from 1895 to 2023 is 22% of the observed raw warming trend of 0.072 °C/decade; the UHIE trend is 0.016 °C/decade. They also find that the adjusted (homogenized) temperature trend of 0.073 °C/decade is even greater than the raw temperature trend, meaning that instead of removing the UHIE, the adjustments increase it.
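The quoted percentage follows directly from the two trend figures in the text.

```python
# Checking the UHIE fraction quoted above: the estimated UHIE trend
# as a share of the observed raw ConUS warming trend.
raw_trend = 0.072    # °C/decade, raw ConUS trend 1895-2023 (from the text)
uhie_trend = 0.016   # °C/decade, estimated UHIE component (from the text)

share = uhie_trend / raw_trend
print(f"UHIE share of raw warming: {share:.0%}")  # 22%
```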
The displayed graph (figure 5 of the paper) shows the cumulated UHIE (TUHI) warming for four population density classes and all stations.
Urban Warming in Four Population Density Ranges
The data show (see figure 1a) that the UHI warming is best represented by a power law, which can account for up to 99% of the variation in the data. In a log-log plot, power laws appear as straight lines, with the slope of the line representing the power law exponent.
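As an illustration of the technique described (synthetic data, not the paper’s code or data), a power-law exponent can be estimated as the slope of a straight-line fit in log-log space:

```python
# Illustrative only: a power law y = a * x**b is a straight line in
# log-log space, so the exponent b is the slope of a least-squares fit.
import numpy as np

rng = np.random.default_rng(0)
x = np.logspace(0, 3, 50)                            # e.g. population density
y = 2.0 * x**0.5 * rng.lognormal(0.0, 0.05, x.size)  # true exponent b = 0.5

slope, intercept = np.polyfit(np.log10(x), np.log10(y), 1)
print(f"estimated exponent b = {slope:.2f}")  # close to the true 0.5
```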
Table 1 of the paper shows that the lowest class of population density (PD) has an average PD of only 1.2% of that of the urbanized class (class 5 in the table), but its UHI change per unit PD change is about 1,800 times greater.
The caption to figure 1 in the paper includes “We see that TUHI warming was stronger in the early years than in later years. The total TUHI warming between 1880 and 2023 ranges from 0.14 °C for the least populated class to 1.4 °C for the most densely populated class of station, and 0.29 °C averaged across all GHCN stations.”
The average UHI warming trend across all stations for 1880-2023 is 0.020 °C/decade. For the period 1980 to 2023 it is 0.010 °C/decade, about half of the trend from 1880.
Importantly, the USA temperature monitoring network is considered one of the best in the world, so the UHIE elsewhere can be expected to be much greater than that of the ConUS. The evaluation also does not account for increasing energy use or increasing pavement and cement ground cover. If a temperature station is associated with a town whose population has been constant over the last 20 years, the evaluation would show no UHIE despite these obvious sources of urban warming.
The UHIE evaluation by Ross McKitrick and Patrick Michaels (M&M), summarized here, states “We can use the statistical model to estimate what the observed temperature trends would have been if [every country] had as good circumstances for monitoring climate as the US does.” The authors estimate that if the USA had no UHIE, the global land area post-adjusted temperature trend would fall from 0.30 to 0.17 °C/decade, or by about 43%. The full M&M paper states that for various reasons, “our findings should be viewed as a lower bound, or conservative estimate of the magnitude of the global data contamination.” If the M&M UHIE of the ConUS is added to the new paper’s results, the revised global land UHIE from 1980 could increase from 43% to 48% of the adjusted temperature trend. The M&M paper cannot be directly compared to the new paper, as the former is referenced to the adjusted data and the latter to the raw data. The new Spencer and Christy study used most of the roughly 12,000 GHCN stations, forming 215,679 station pairs.
A Critique of the Apocalyptic Climate Narrative
A paper authored by Harry DeAngelo and Judith Curry that serves as a critique of the apocalyptic climate narrative was published this month. The narrative claims that humanity faces an existential threat from global warming and the only way to save the planet is to eliminate the use of fossil fuels. This narrative ignores all benefits of warming, CO2 fertilization and the benefits of fossil fuel use. It radically overstates the risks to humanity of continued global warming, which are manageable, not existential. The narrative ignores the enormous costs of achieving net zero human-caused greenhouse gas emissions.
The Introduction says “This paper details the flaws in the apocalyptic climate narrative, including why the threat from human-caused climate change is not dire and why urgent suppression of fossil-fuel use would be unwise. We argue that sensible public policies would focus instead on developing a diversified portfolio of energy sources to support greater resilience and flexibility to respond to whatever weather and climate extremes might occur. We identify nine principles for sensible US public policies toward energy and discuss implications of the flaws in the narrative for investors and their agents.”
Those principles are:
- We should not inflict costs on U.S. citizens – reduced overall economic prosperity, constrained individual choice, and diminished national security – by adopting public policies intended to mitigate global warming.
- We should not eliminate fossil fuels before we have technologically viable and cost-effective replacements.
- We should use “carrots” to foster investment in innovation in energy and science.
- We should not use “sticks” to punish consumption that generates greenhouse gasses.
- We should cultivate clean energy to reduce air pollution and to enhance energy independence for defence and economic security reasons.
- We should put major emphasis on the resuscitation of nuclear power.
- We should not focus narrowly on solar panels, wind turbines, and biofuels.
- We should not engage in backdoor regulation of fossil-fuel use by the Federal Reserve and the SEC.
- We should not use our power to impose credit policies toward developing countries.
This paper estimates that the global net social benefit of CO2 emissions in 2020, using a 3% real discount rate, is US$11 ± 4 per tonne of CO2. The narrative ignores that the net social cost is likely between zero and negative US$15/tCO2.
This paper estimated that the incremental cost of net zero will be US$220 trillion in 2024 dollars (about Cdn$308 trillion). Russia, Ukraine and the CIS together would be required to spend an incremental 15% of their GDP on net zero, and India an extra 8.4% of its GDP.
The European Blackout was caused by Unstable Solar Power
The Iberian Peninsula blackout started Monday, April 28th, 2025 at 12:33 CET, when the power systems of Spain and Portugal suffered a total blackout. A small area in France near Spain was also affected, but only for a very short time. The blackout lasted about 10 hours in most of the peninsula and up to 24 hours in some southern areas. At least seven people in Spain and one in Portugal may have died due to the power outage. The total disconnected load is estimated at 30 GW. For context, Alberta’s average 2024 electricity load was 10.1 GW. Just before the blackout, solar and wind power provided 59% and 12%, respectively, of the total load.
Jonas Kristiansen Nøland, associate professor at the Norwegian University of Science and Technology wrote on Linkedin “Recent evidence indicates that Europe's worst blackout, occurring in the Iberian Peninsula, originated from an unstable power grid. This instability likely triggered the cascading chain of events that followed. In the half-hour leading up to the blackout, two episodes of power and frequency oscillations were observed in the Continental European synchronous area. Grid operators took actions to mitigate these oscillations.
The likely root cause of these undamped ‘inter-area oscillations’ was the weak, heavily loaded interconnecting lines and the inherently low inertia of the Spanish power grid at midday, with approximately 70% of generation provided by inverter-based solar and wind. Due to these unstable grid conditions, exceptionally high rates of change of frequency occurred, which became the final nail in the coffin. As a result, low-frequency load shedding was not able to kick in to save the day. The critical tipping point came with the first generation loss at 12:32:57, involving roughly 2.2 GW, likely from solar PV generation in southwest Spain—a region dominated by solar power.”
Nøland goes on to say that the unstable grid was caused by the lack of inertia and by overvoltages. Electrical inertia is usually provided by the spinning generators of fossil fuel power plants. This inertia resists short-term (2 to 3 second) variations in frequency. The frequency in Europe should be held to 50 ± 0.050 Hz, a tolerance of ±0.1%. Deploying additional synchronous condensers (rotating machines that provide inertia and very short-term energy storage) would “significantly increase system costs”.
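Why low inertia matters can be sketched with the textbook swing-equation estimate of the initial rate of change of frequency (RoCoF) after a generation loss. The numbers below are hypothetical illustrations, not figures from Nøland’s analysis.

```python
# Textbook estimate: RoCoF = f0 * dP / (2 * H * S), where f0 is nominal
# frequency (Hz), dP the lost generation, H the system inertia constant
# (seconds), and S the rated system capacity. Numbers are hypothetical.
def rocof(f0_hz, dp_gw, h_s, s_gw):
    return f0_hz * dp_gw / (2 * h_s * s_gw)

# Hypothetical: losing 2.2 GW on a 30 GW system
high_inertia = rocof(50, 2.2, 5, 30)  # conventional grid, H ~ 5 s
low_inertia = rocof(50, 2.2, 1, 30)   # inverter-heavy grid, H ~ 1 s
print(f"RoCoF: {high_inertia:.2f} Hz/s (H=5 s) vs {low_inertia:.2f} Hz/s (H=1 s)")
```

The same disturbance drives frequency away five times faster on the low-inertia grid, leaving less time for load shedding to act.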
Britain Faces Months-Long Blackouts Because of Net Zero
Nick Gutteridge of The Telegraph, U.K. wrote (paywalled) “The grid operator has raised concerns that the switch from dependable gas to intermittent wind and solar power will ‘reduce network stability’.” Richard Eldred wrote about the article “Britain’s rush to Net Zero could leave it vulnerable to months-long blackouts, as reliance on intermittent renewables strains the grid, escalating costs and jeopardizing energy security.” He also provided a large quote from the article.
The grid operator said “the cost to taxpayers of funding measures to prevent the system crashing was set to ‘increase significantly’ to £1 billion a year.” Government officials said it would take Britain “several months” to fully recover from a nationwide electricity outage. Concern has risen since a power outage at Heathrow caused chaos at the airport in 2025. That disaster was caused by “decades-old equipment filled with flammable oil” which caught fire. Britain’s electricity system operator warned last month of an increased risk of outages due to the continuing reduction of synchronous power generation from natural gas and nuclear in favor of renewables that provide no inertia for stabilizing the grid. A government risk assessment found that “depending on the cause of failure and damage, restoration of critical services may take several months.”
CliSci # 429 2025-04-22
Solar Madness in Germany: Subsidized Electricity Gets Dumped Abroad for Free
The solar energy annual load factor in Germany is about 11%. The load factor is the annual solar energy production as a percentage of what would have been produced had the panels generated at their maximum rate throughout the year. Total installed solar capacity in Germany reached 99.3 GW at the end of December 2024. In 2024, Germany's total electricity production was 431.7 terawatt-hours (TWh), equivalent to a continuous generation of 49.3 GW. The 2024 electricity generation, the first full year without any nuclear energy contribution, was 14.6% less than in 2023. Meanwhile, 2024 electricity consumption increased by 0.9% to 462 TWh.
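The quoted figures can be cross-checked with simple arithmetic; the inputs are those in the text, and 8,760 hours per year is assumed.

```python
# Cross-checking the German figures quoted above. A load factor is
# annual output divided by output at full nameplate capacity all year.
HOURS_PER_YEAR = 8760
solar_capacity_gw = 99.3      # installed solar, end of 2024
load_factor = 0.11            # ~11% annual solar load factor
total_generation_twh = 431.7  # total 2024 generation

implied_solar_twh = solar_capacity_gw * HOURS_PER_YEAR * load_factor / 1000
avg_generation_gw = total_generation_twh * 1000 / HOURS_PER_YEAR

print(f"implied solar output: {implied_solar_twh:.0f} TWh")     # ~96 TWh
print(f"average total generation: {avg_generation_gw:.1f} GW")  # ~49.3 GW
```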
The large solar capacity relative to consumer demand causes negative prices when solar generation is high. In 2023, the proportion of hours with negative prices averaged 18%, and in May 2023 it was as high as 31%. However, the negative prices are not passed on to the solar generators. If they were charged for producing electricity when it wasn’t needed, there wouldn’t be any negative prices. Excess solar power is given away free of charge to Germany’s neighboring countries. This is a crisis for Germany because solar power is extremely expensive. It has been heavily subsidized with taxpayers’ money over more than 30 years. Germany has spent an estimated 189 billion euros (approximately US$222 billion) on wind and solar subsidies since 2000. Pierre Gosselin wrote “This sort of absurdity is what happens when politicians and bureaucrats take over energy engineering.” Energy market expert Björn Peters is calling for an end to the expansion of solar energy and a return to fossil fuels for electricity generation.
States with Renewable Mandates and Cap & Trade Pay the Highest Electricity Prices
This report from the American Legislative Exchange Council (ALEC) says “There is very little more important to modern society than energy and electricity. It is an essential aspect of virtually every part of our daily lives. However, throughout the United States, electricity prices vary greatly, depending on the way it is generated, delivered to consumers, and regulated.” Some policies affect the electricity supply while some increase the electricity demand by promoting electrification. This brief summary shows that consumers pay higher prices for their electric utilities in states that have more regulations on their energy sector.
The weighted average 2023 price of electricity across all sectors (residential and business) was calculated and ranked. The ALEC report analyzed three energy policies shaped by state legislatures: renewable portfolio standards (RPS), participation in a CO2 cap-and-trade program, and mandated rules regarding net metering. The grids’ reliability was determined by counting the number of major incidents, including power outages.
The most expensive electricity was in Hawaii at 39.72 US cents per kilowatt hour (¢/kWh), but the report notes that the state is disadvantaged due to its isolated location. California had the second most expensive electricity at 22.33 ¢/kWh. The least expensive electricity was in Wyoming and North Dakota, both at 8.24 ¢/kWh.
The three states (Wyoming, North Dakota and Idaho) with the lowest prices do not have RPS or cap-and-trade programs. Idaho does not have mandated net metering, but Wyoming does. In contrast, the five states with the most expensive electricity all have RPS, cap-and-trade programs and impose net metering policies on their utilities. Net metering is very costly and unfair because it forces utilities to pay for electricity produced from roof-top solar panels when demand is low, supply is high and prices are low (or even negative). Utilities must then provide the solar panel owners the same amount of electricity when demand is high, supply is low and prices are high. This gives roof-top solar panel owners a huge financial advantage over other consumers.
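A toy calculation illustrates the cost shift described above. All the prices and quantities below are hypothetical, chosen only to show the mechanism; they are not from the ALEC report:

```python
# Hypothetical rates: the utility credits rooftop exports at the full
# retail rate even when the wholesale value of midday solar is near zero.
retail_rate = 0.22       # $/kWh charged to consumers
midday_wholesale = 0.01  # $/kWh value of surplus midday solar

exported = 500  # kWh exported at midday over a billing period

credit = exported * retail_rate      # what the panel owner is credited
value = exported * midday_wholesale  # what that energy was actually worth
shifted_cost = credit - value        # difference borne by other ratepayers
print(round(shifted_cost, 2))        # 105.0
```

The gap between the credit and the wholesale value of the exported power is recovered from customers without panels, which is the cross-subsidy the report objects to.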
California had more reliability incidents than any other state, with hundreds of thousands of customers affected over the year. Parts of the grid were down due to major incidents for over 82 hours during the year.
The three states with the lowest electricity prices had very reliable grid systems. Wyoming only had one power outage in the year, impacting less than 5,000 residents, demonstrating the grid’s remarkable resiliency. North Dakota has a good mix of energy sources with coal providing 56% of the state’s electricity. It had only three reliability incidents. Idaho had none of the three problematic policies studied and has a very reliable grid with no major outages.
France Holds Wind Industry Accountable for Golden Eagle Death
A French court on April 9, 2025 ordered the Bernagues windfarm in Hérault, France to cease operations for one year following the confirmed death of a golden eagle that collided with one of the farm’s turbine blades in January 2023. The golden eagle is a protected species in France. The court fined the windfarm’s operator €200,000, half of which was suspended, and fined the operator’s director €40,000. The Aumelas wind farm in France was ordered on April 7, 2025 to suspend operations for four months due to the deaths of 160 protected birds.
This article says “Two recent studies underline the broader threat to golden eagles. One, in Ecological Applications, shows that annual mortality already exceeds the threshold that eagle populations can sustain. Another, in Biological Conservation, tracks a rise in turbine-related eagle deaths in the western U.S., from 110 in 2013 to 270 in 2024.”
The article in Ecological Applications presents estimates of sustainable allowable take levels of bald eagles and golden eagles. Golden eagles are unintentionally killed by electrocution on power lines and collisions with wind turbine blades. The mean annual survival rates for golden eagles ranged from 0.70 for first-year birds to 0.90 for adult birds. The authors’ modeling suggests that females on average produce 0.53 young annually. The population size in the western United States has averaged about 31,800 individuals for several decades. The estimated median allowable take limit for a stable population is 2200 per year (95% confidence interval 710–4180). The actual take is estimated at 2600 golden eagles, slightly greater than the allowable take limit. Human-caused deaths are 59% of all deaths, including 74% of deaths in the birds’ first year. Among transmitter-tagged golden eagles, the human-caused deaths were from shooting (30%), collisions (28%), electrocutions (23%) and poisoning (19%). Any additional human-caused deaths may not be sustainable. That is, any additional wind turbines or power lines would cause declining populations unless mitigated by new protection policies.
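As a rough plausibility check on these numbers, a simplified two-stage population sketch (our construction, not the paper's model; it assumes a 50:50 sex ratio and that surviving young recruit as adults after one year) yields a sustainable surplus inside the paper's allowable-take interval:

```python
s_juv, s_adult = 0.70, 0.90  # first-year and adult annual survival (from the paper)
fecundity = 0.53             # young per adult female per year (from the paper)
population = 31_800          # average western U.S. golden eagle population

# Per-capita annual growth: surviving adults plus surviving young;
# fecundity is halved because only the female half of the population breeds.
growth_rate = s_adult + s_juv * fecundity / 2
surplus = (growth_rate - 1) * population  # birds removable while keeping a stable population

print(round(surplus))  # roughly 2,700 -- within the 710-4180 interval quoted above
```

That the crude sketch lands near the paper's median of 2200 suggests the reported survival and fecundity figures are internally consistent with the take limit.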
The article in Biological Conservation gives estimates of annual wind turbine mortalities of golden eagles in the western United States from 2013 to 2024. The authors determined areas of lower and higher risk of wind turbine collision deaths. The paper says “From 2013 to 2024, estimated turbine hazardous volume in the lower- and higher-risk zone increased by 198 % and 119 %, respectively.” Annual median golden eagle mortalities more than doubled from 110 in 2013 to 270 in 2024, although estimates had high uncertainty. The paper concludes that the recent accelerated growth and expected future growth of wind power in the western U.S. could have a significant negative impact on the golden eagle population.
Global Vegetation Growth Reaches New Highs Due to Increased CO2 Fertilization
A new paper shows that Earth reached its highest level of vegetation greenness of the modern satellite record in 2020. The greening was found mostly in boreal and temperate vegetation, attributable mostly to rising CO2 levels, climate warming and reforestation efforts. Vegetation absorbs CO2 from the atmosphere by photosynthesis, transforming it into vital carbohydrates, the beginning of the food chain. Satellite observations since 1982 have reported a widespread increase of the vegetated area, often called global greening, with CO2 fertilization due to human-caused CO2 emissions explaining 70% and warming explaining 8% of the greening trend. Scientists found in 2024 that about 55% of the global land mass had shown an “accelerated rate” of vegetation growth. Another study finds a large fertilization effect on crops: “a 1 part per million increase in CO2 equates to a 0.4%, 0.6%, 1% yield increase for corn, soybean and wheat respectively.”
The COVID-19 pandemic had multiple effects. It caused a short-term reduction in CO2 emissions and reduced aerosol pollution, which increased solar radiation, enabling plants to flourish in brighter sunlight. The study utilized three groups of remote sensing-based vegetation indices: enhanced vegetation index, solar-induced fluorescence and leaf area index. The paper reports that global vegetation reached its peak greening in 2020, continuing a long-term trend. This suggests a robust resilience and adaptability of global vegetation in the face of changing environmental conditions. Also, see this commentary and this news release about the study.
CliSci # 428 2025-04-08
NewsGuard Reviews CliSci Discussion of the Urban Heat Island Effect
NewsGuard reviewed our March 2, 2025 Climate Science newsletter CliSci # 426, item titled “Urban Heat Island and Why Location Matters”. That article stated “the location of surface temperature stations in England, U.K. does indeed matter” and “the official temperature record is corrupted by the urban heat island effect” (UHIE). NewsGuard wrote “NewsGuard found that scientific research has shown that urban heat islands have a minimal impact on surface temperature measurements.” They referenced two published papers about the UHIE that were discussed in the IPCC’s fourth assessment report (AR4). NewsGuard asked us to “comment on this apparently countervailing information.” My reply is here.
One of the referenced papers (Peterson 2003) was reviewed on Climate Audit by Stephen McIntyre. McIntyre described many of the ‘urban’ sites as “seemingly being at best very small towns and, in some cases, rural themselves.” The paper reviewed temperatures of 289 sites in the contiguous U.S.A. Only 25% of the sites are even included in the U.S. Historical Climatology Network (USHCN) of weather stations, so they aren’t used in the official temperature indexes for monitoring climate. The paper compares the actual temperature differences in °C between rural and urban sites after making adjustments for elevations and latitudes. Peterson claimed there was an insignificant difference between the temperatures of urban and rural sites. But the temperature indexes are based on the temperature anomalies of each site, so only the temperature trends in °C/decade matter; the actual temperatures are irrelevant. McIntyre wrote “It’s hard to see exactly what a comparison of the left column [Urban] sites and right column [Rural] sites has to do with whether the urbanization affects the CRU, NOAA or GISS composites.”
McIntyre calculated the temperature trend difference between Peterson’s rural and urban station sites and found a highly significant difference of 0.05 °C/decade. He also found the trend of a group of 12 sites at cities to be 0.2 °C/decade greater than that of the rural sites. The same Peterson was the lead author of the other referenced paper (Peterson & Owen 2005). That paper concluded “Whether a UHI signal was found depended on the metadata used.” I show that the rural sites are contaminated by the UHIE. The greatest increase in temperature due to a population increase occurs at the lowest populations or population densities. I also show that the urbanization correction applied to NASA’s GISS temperature index fails to remove the UHIE.
Radiation Transport in Clouds
Professors W. van Wijngaarden and W. Happer published a paper this year that reviews the dominant role of clouds in Earth’s climate. Clouds have a dramatic effect on the amount of shortwave sunlight radiation reaching the Earth’s surface. Longwave radiation from the air and clouds to the Earth, called downwelling thermal radiation, has wavelengths greater than 4.2 micrometers. There is a large difference in downwelling thermal radiation between clear and cloudy weather, with close to 340 W/m2 from cloud bottoms on overcast days and around 260 W/m2 in clear weather. Radiative transfer theory shows that doubling the concentration of carbon dioxide for a cloudless sky decreases radiation to space by only 1%. The paper concludes that an increase in low cloud cover of only about 1% could largely compensate for a doubling of CO2.
Volcanic CO₂ Emissions Could be Three Times Higher than Anticipated
This article summarizes research that found conventional measurements of volcanic CO2 emissions might be underestimated threefold. Scientists have developed an advanced sensor that can detect volcanic gases with high speed and precision. The research team measured emissions at a volcano on the Caribbean island of Montserrat using the new sensor mounted on a helicopter. The measurements show that the volcano emitted three times more CO2 than earlier studies had estimated. Previous studies focused on hot vents, which release high concentrations of volcanic gases. However, many volcanoes have cooler hydrothermal vent systems that absorb the acidic gases, making them harder to detect. As a result, CO2 emissions from these cooler sources are often overlooked. The typical CO2 flux calculation, which multiplies the CO2/SO2 ratio measured at hot vents by the SO2 flux, underestimates the total CO2 flux because the cold vents release gases at much lower CO2/SO2 ratios. The CO2 flux was 15 to 41 kg/s from hot fumaroles and 61 to 131 kg/s for the overall plume. The article says “The new technology exposes those hidden emissions, offering a more accurate quantification of the volcano's gas output. … Volcanoes still contribute less than 5% of global CO2 emissions, far less than human activities such as fossil fuel combustion and deforestation.” The new technology could also enhance early warning systems for impending volcanic eruptions.
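The threefold underestimate follows directly from the reported flux ranges. A quick sketch using the midpoints of those ranges (our simplification of the comparison, not the paper's method):

```python
# Midpoints of the reported Montserrat CO2 flux ranges, in kg/s:
hot_vents = (15 + 41) / 2     # hot fumaroles only -> 28 kg/s
whole_plume = (61 + 131) / 2  # full plume via the new sensor -> 96 kg/s

underestimate = whole_plume / hot_vents
print(round(underestimate, 1))  # about 3.4, consistent with "three times more CO2"
```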
Sea Ice Pattern Effect on Earth’s Energy Budget Is Characterized by Hemispheric Asymmetry
This paper investigates the Earth’s global radiative effect of changes in sea ice concentration (SIC) spatial distribution. The authors show that SIC-induced radiation anomalies at the top of the atmosphere are sensitive to the location of SIC reduction in each season. Computer model experiments suggest that SIC-induced surface warming is greater in the Arctic than in the Antarctic region. Less sea ice in the Arctic causes more cloud cover and a negative cloud feedback in the Arctic. The sea ice area and SIC reduction is significant in the Arctic, while there is no significant trend in annual mean Antarctic sea ice area during 1979-2018.
The actual pattern of SIC is different from the simulated CO2-induced global warming. The paper says “Unexpectedly, the trend of global change in ∆R is negative despite the statistically significant decrease of global SIC between 1980 and 2008, implying that global SIC reduction leads to planetary cooling during this period.” ∆R means the change in the top of the atmosphere radiation flow induced by a change in SIC, where positive ∆R causes warming. During this period, Antarctic SIC increases, leading to a decrease in ∆R in the Antarctic region, while Arctic SIC decreases, leading to an increase in ∆R in that region. The sensitivity of ∆R in the Arctic to SIC changes is smaller than that in the Antarctic, so the cooling effect of increasing Antarctic SIC is greater than the warming effect of decreasing Arctic SIC. The paper says “As a result, the relationship between global SIC trend and ΔR trend is opposite from that under long-term global warming, and the sea ice pattern effect is important in determining the climate effect of sea ice cover changes.” Also see here.
The Post-1980s Warming Trend Has Improved European Life Expectancy
Warming is good and cooling is bad for humans and animals. This study examines how climate conditions influence life expectancy in Europe based on the impact of year-to-year temperature variability adjusted for economic changes. Life expectancy changes are linked to economic output. Milder than usual winters are linked to a reduced number of hospitalizations and lower mortality rates. The authors developed a model to disentangle the concurrent processes of economic development and climate change. Local temperature variations were characterized by heating degree days (HDD) and cooling degree days (CDD) below and above base temperatures. The calculations took into account variations in alcohol and tobacco consumption, and the gender-specific effects of social inequalities.
As expected, an increase of gross domestic product (GDP) per capita increased life expectancy, by 0.41 years per 1000 euros/capita. The impact of below-optimal temperatures was large: 1000 annual HDD reduce life expectancy by 3.0 years and 1000 annual CDD reduce it by 2.4 years. Each of these three effects is moderated by a small logarithmic term. In cold regions, the reduction of life expectancy reached 24 months. Conversely, the loss of life expectancy due to temperatures above the optimum was only discernible south of the Alps, where it was at most 8 months. The most favorable regions for life were those with warm winters yet moderate summer temperatures due to proximity to the sea. The difference in life expectancy attributable to climate between the harshest and most benign regions was 19 months.
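The reported effect sizes can be combined in a simple linear sketch (our construction; the paper's model also includes the small logarithmic corrections, which are omitted here):

```python
def life_expectancy_effect(gdp_k_eur, hdd, cdd):
    """Linearized contribution to life expectancy (years) from GDP per capita
    (thousands of euros) and annual heating/cooling degree days."""
    return 0.41 * gdp_k_eur - 3.0 * hdd / 1000 - 2.4 * cdd / 1000

# A region with 1000 more heating degree days than another, all else equal:
print(round(life_expectancy_effect(0, 1000, 0), 1))  # -3.0 years
# Offsetting that loss would take roughly 7,300 euros/capita more GDP:
print(round(3.0 / 0.41, 1))  # ~7.3 (thousand euros/capita)
```

The second line illustrates why, in this framework, economic growth and a milder climate act as partial substitutes for one another.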
The change in life expectancy due to temperature change from the 1979–1982 average to the 2019–2022 average was calculated. Paradoxically, the highest gains were observed in regions that were already warm, amounting to up to 3 months. Some regions of Scandinavia would actually experience no change in life expectancy. Even after controlling for economic growth, a modest warming trend appears to offer a moderate benefit from the perspective of life expectancy. The authors suspect that economic growth contributes to gains in life expectancy partially by enabling adaptations to adverse climate conditions.
CliSci # 427 2025-03-16
Variable Vertical Land Motion and its Impacts on Sea Level Rise Projections
A new paper published in Science Advances provides new estimates of vertical land movement over California using high-resolution satellite data. The paper criticizes the IPCC’s 6th assessment report for relying on regional linear estimates of vertical land movement (VLM), which can result in over- or underestimates of sea level rise relative to land. Nonlinear VLM, driven by factors such as hydrocarbon and groundwater extraction, can increase uncertainties in 2050 projections by up to 0.4 meters in certain areas of Los Angeles and San Diego. The paper’s abstract concludes “This study highlights the critical need to include local VLM and its uncertainties in sea level rise assessments to improve coastal management and ensure effective adaptation efforts.”
Global Navigation Satellite System (GNSS) stations provide precise measurements of land movement, but the global coverage is uneven, leaving large gaps that miss areas with large-magnitude VLM. Interferometric Synthetic Aperture Radar (InSAR) provides high spatial resolution estimates of VLM over large areas that can fill these gaps. The results confirm widespread tectonic land uplift of 1 to 3 mm/yr in northern California and subsidence around San Francisco and San Diego associated with the San Andreas Fault system over the period 2015-2023. Land subsidence rates of 5 mm/yr are found in parts of the San Francisco Bay area, far greater than the rate of absolute sea level rise. Extraction of groundwater and the recharge of aquifers lead to variable surface responses. The paper says “Beside updating projections, InSAR-VLM time series can also assist in timely adaptation efforts, especially if VLM trends overly shift downward at certain times, leading to increased exposure to sea level rise.”
Lightning Strikes Decline by over 40% over Shipping Lanes After Shipping Industry Cut Sulfur Emissions
Authors of a new paper used the recent 7-fold reduction in allowable ship fuel sulfur emissions to investigate the decline of lightning strikes along busy maritime shipping lanes in the tropical Indian Ocean and South China Sea. Maritime ship traffic leads to the emission of aerosol particles and associated precursors into relatively clean marine air. In January 2020, the International Maritime Organization (IMO) reduced the amount of allowable sulfur in fuel from 3.5% to 0.5% to curb the effects of maritime shipping on air pollution. Previous work documented the enhancement of lightning over major shipping lanes as ship traffic gradually increased. Since January 2020, lightning over the shipping lanes has fallen by over 40%. Observations of droplet number at the cloud base show a similar decline over the shipping lanes. There is little consensus on the mechanisms or magnitudes of aerosol particle impacts on deep convective cloud systems. The abstract says “These results have fundamental implications for our understanding of aerosol–cloud interactions, suggesting that deep convective clouds are impacted by the aerosol number distribution in the remote marine environment.”
Several mechanisms have been proposed to explain how aerosol particles from ship emissions could enhance lightning frequency, all of which involve enhanced cloud droplet nucleation. One theory is that enhanced nucleation leads to changes to super-cooled liquid water and ice distributions and enhanced charge separation in the mixed-phase region of convective clouds. Another theory suggests that enhanced nucleation leads to an increase in the frequency or intensity of deep convection due to changes in the vertical distribution of humidity.
Medieval Warm Period Undeniable, Pronounced In Antarctica and Poland
Pierre Gosselin wrote this article summarizing two recent papers that show a pronounced Medieval Warm Period in Antarctica and Poland. He wrote “The Medieval Warm Period, the natural warm phase between 700 and 1300 AD, cannot be reproduced by climate models because the simulations react primarily to CO2. Back then CO2 was not a factor because its concentration level in the atmosphere was pretty much constant. That’s why people would rather keep the Medieval Warm Period quiet.”
One paper reports on the historical changes in penguin population in the Ross Sea region of Antarctica. The researchers analyzed sediment cores from abandoned penguin colonies and reconstructed the changes in the populations of penguins over the past 1500 years. They found that the penguin population peaked between 750 and 1350 AD, possibly due to habitat expansion in a warmer climate during the Medieval Warm Period. The population trend also coincided with extreme swings in the El Niño and Southern Annular Mode atmospheric-ocean circulation patterns. These circulation patterns could increase the influx of nutrient-rich water, increasing the penguin population.
The other paper reported on four new reconstructions that the authors created from tree ring data and from biological proxies. The researchers also studied existing climate reconstructions that have been produced for Poland in the last two decades. The Medieval Warm Period probably occurred in Poland from the late 12th century to the first half of the 14th or 15th century. All analyzed quantitative reconstructions indicate that the Medieval Warm Period in Poland was comparable or even warmer than the average temperature in the period 1951-2000. The two new studies from Antarctica and Poland indicate that the natural climate factors still need to be much better understood in order to be able to incorporate them faithfully into climate models.
Trial of Mann v. Steyn: Post-Trial Motions Edition
Penn State climate “scientist” Michael Mann brought a lawsuit for defamation in 2012 against Mark Steyn and Rand Simberg, as well as against two publications, National Review and the Competitive Enterprise Institute (CEI). Francis Menton wrote, “Mann asserted that his reputation had been damaged by the Steyn and Simberg posts, which had compared Mann to fellow Penn Stater Jerry Sandusky. The point of comparison was that Penn State had investigated and cleared both men around the same time over allegations of misconduct — scientific misconduct in the case of Mann, sexual misconduct in the case of Sandusky.” National Review and CEI were dismissed from the case before the trial started. The case finally reached trial in January 2024 against Steyn and Simberg as defendants. The jury awarded $1 of compensatory damages against each defendant and awarded punitive damages of $1000 against Simberg and $1 million against Steyn. On January 10, 2025, Judge Irving ruled that Mann must pay National Review $530,820 for recovery of attorneys’ fees and costs. On March 4 Judge Irving reduced the punitive damages award against Steyn from $1 million to $5000.
Mann’s claim for damages was based on the theory that he lost government research grants due to the alleged defamatory blog posts. He claimed that he lost a $9 million grant, but shortly before the trial amended that amount to only $100,000. Mann said he realized that if the grant had been won, most of the $9 million would have gone to institutions other than Penn State. Mann’s lawyers nevertheless maneuvered to get the $9 million figure into an exhibit that went before the jury, knowing that the figure had been withdrawn as wrong. The remedy for this misconduct will be a monetary award to the defendants of the amount spent dealing with Mann’s misrepresentation. The judge will decide the award after reviewing presentations from the defendants. Menton guesstimates it will be tens of thousands to $100,000.
The defendants put on substantial and definitive evidence of the flawed “science” in Mann’s Hockey Stick reconstructions — manipulation of data, incorrect error bands, suppression of adverse verification statistics, and so forth. Mann’s lawyers completely ignored this entire issue in their rebuttal case, and the jury ignored the issue of fake science.
“Extreme Heat” Isn’t Accelerating Aging—And It’s Not Getting Worse
Several news sources have posted articles claiming that extreme heat may speed aging and increase the risk of diseases. This article by Anthony Watts says this is false and misleading. The story is based on a study claiming that the authors are able to detect age-accelerating DNA changes from blood tests done on older populations. They provide a map of the United States showing areas with heat-risk days above 90 °F (32 °C). Watts wrote “However, data from Climate at a Glance shows that heat waves in the U.S. were far more severe in the 1930s, particularly during the Dust Bowl years, than anything we see today. NOAA’s temperature records confirm that the highest recorded temperatures and longest-lasting heat waves occurred nearly a century ago. If ‘extreme heat’ were truly an escalating crisis, we would see a consistent upward trend in heat wave frequency and intensity, but that is simply not the case.” The study also ignores the urban heat island effect, which has grown over the last several decades as city populations increase. The study wrongly assumes that increased temperatures are the only cause of the aging-related DNA changes in the people studied. Several other environmental and personal factors contribute to faster aging, including UV radiation, diet, physical activity, poor air quality, obesity, and smoking. The study doesn’t take any of these into account, leading to a conclusion that cannot be verified as being only temperature related. The study used data only from 2010 to 2016, which is far too short a period to relate to climate change. It also ignored the dangers of cold weather; this study found that cold-related deaths vastly outnumber heat-related deaths worldwide, by nearly a 10-to-1 margin.
CliSci # 426 2025-03-02
Friends of Science Speaking Event and Book Launch
Common Sense on Climate and Energy: Let's Nix Net Zero – Together
March 11, 2025 – 7PM, Calgary, Alberta
Join us in-person or online for this live event which combines our formal Book Launch with a speaking event by two of the key authors of “Energy & Climate at a Glance.” Published by Canadians for Sensible Climate Policy (CSCP) and The Heartland Institute, this is the handy guide-book that Friends of Science Society has been promoting over the past month.
Lead author Ron Davison, P. Eng. and contributing author Paul MacRae will be presenting a synopsis of their work.
Ron Davison will open your eyes on the Paris Agreement, Net Zero and ‘the science’ of climate change in his talk titled: Net Zero – Climate Policy is All Pain, for Minimal Gain!
Paul MacRae will be discussing Why We Can’t Abandon Fossil Fuels, and Why We Shouldn’t Try.
Check out our event page for more information and to purchase tickets.
Tickets are $20 for Friends of Science members and $25 for non-members.
Ticket sales are ongoing until March 11th, while supplies last - No tickets will be sold at the door. Purchase your tickets today!
Paleo-Storm Reconstruction from Eastern Canada Aligns with Atlantic Hurricane Records
Storm records on millennial time-scales along the Atlantic coast of North America are scarce. This study published this month uses grain-size and geochemical data from cores drilled into two peatlands in Quebec’s Magdalen Islands. These Islands are an archipelago in the Gulf of St. Lawrence, located between Prince Edward Island, Anticosti Island and Cape Breton Island. Most previous studies utilized overwash deposits that result from the combination of storm surge and waves overtopping barrier beaches and are deposited in the backbarrier environments.
The two peat records reveal consistent storm signals over the past 4000 years. Three periods of heightened storm activity are 800–550 BCE, 600–800 CE, and 1300–1700 CE. These periods align with overwash records of the past 2000 years across eastern North America. Storm activity was low during the Medieval Warm Period and high during the Little Ice Age. Negative phases of the Atlantic Multidecadal Oscillation are associated with greater storm and hurricane activity. The study found a potential antiphase relationship in storm activity between the region north of the Bahamas and the Gulf of Mexico.
The study shows that there is greater storm activity during cooler times and less storm activity during warmer times. This is opposite to the assumption used in the integrated assessment models that calculate the storm-damage impacts of warming and the social cost of carbon dioxide.
Recent Decline in Global Ocean Evaporation Due To Wind Stilling
Accurately understanding changes in the hydrological cycle has profound implications for forecasts of climate change. Climate models assume that evaporation and precipitation will both increase in a warming world. Evaporation is the largest component of the global hydrological cycle. Basic physics shows that evaporation rates tend to increase as the water heats up. However, climate science is much more complicated than basic physics. This recent paper reports that there has been a slowdown in the growth of global water vapor since roughly the 2000s. The study used satellite-based ocean evaporation estimates to explicitly quantify and attribute global ocean evaporation (Eo) tendencies in different periods. The abstract says “Our findings are unexpected: despite rising sea surface temperatures, global Eo has decreased in the most recent decade. This phenomenon is due primarily to the reduction in wind speed, likely linked to changes in atmospheric circulation patterns associated with Earth's decadal climate variability. These results offer a deeper understanding of the complex ways climate change is reshaping the planet's water cycle.”
The globally averaged annual Eo of four satellite products increased significantly at a rate of 3.4 mm/yr2 during 1988–2017, with positive trends over 91% of the oceans. However, the trend analysis shows a significant turning point at 2008. The Eo trend during 1988–2008 was +5 mm/yr2; after 2008 it decreased to −1.5 mm/yr2. The paper says “This reversal, evident in the MEM of four satellite products, is robust, as similar contrasting tendencies are also observed in each individual satellite product”. During 2008–2017, approximately two-thirds of the global ocean experienced declines in Eo.
Starting a Better Climate Conversation with Bjorn Lomborg
Bjorn Lomborg has teamed up with the Fraser Institute to publish a series of 10 weekly articles in Canada’s National Post on climate and climate policy. The first article was published on February 25th. Bjorn Lomborg is a political scientist, the Director of the Copenhagen Consensus Center and a Visiting Professor at Copenhagen Business School. He researches the smartest ways to help the world and is best known for his books The Skeptical Environmentalist and Cool It. Lomborg is no climate skeptic, as he apparently accepts the IPCC’s narrative on climate sensitivity. He wrote that “Global warming is a real problem. Climate economics has shown how this brings … more problems than benefits.” Friends of Science dispute this. Lomborg says that climate policies force people to use much more expensive energy, slowing economic growth. He shows that global climate-related deaths have declined from about half a million per year in the 1920s to fewer than 10,000 per year despite a growing population, a decline of 97.5%. The risk of climate-related death has declined by more than 99%. Costly climate policies in the United Kingdom (UK) have caused electricity prices to increase in real terms by a factor of 2.9 from 1994 to 2023. The electricity price in the USA increased by only 10% over the same period. Prices are weighted averages of residential and industrial electricity prices.
Humans, Not Climate Change, May have Wiped Out Australia’s Giant Kangaroos
The demise of most of Australia’s kangaroo species by 40,000 years ago may have had less to do with climate-caused dietary pressures and more to do with human hunters. Between 65,000 and 40,000 years ago, more than 90 percent of Australia’s large animal species went extinct. Over half were kangaroos. The common narrative was that the extinction of the giant kangaroos was due to both the arrival of humans and climate change, which was thought to have dramatically reduced the animals’ dietary options. This article, which summarizes a new paper, shows that the changing-climate theory doesn’t make sense. To assess the possible role of dietary restrictions caused by the changing climate, scientists analyzed the teeth of 937 kangaroos, both fossilized and modern, studying tiny signs of wear and tear that point to what the creatures ate. The analysis suggests the long-gone kangaroos were generalists, consuming a variety of foods that would have helped them survive as the climate changed. Human hunters are the likely cause of the extinctions, not climate change.
Urban Heat Island and Why Location Matters
Ray Sanders wrote this article demonstrating that the location of surface temperature stations in England, U.K. does indeed matter. He compares simultaneous temperature readings at many temperature stations to their surroundings. The stations surrounded by built-up areas have significantly higher temperatures than those stations with natural land surfaces. Sanders presents a map of west London. The temperature stations are identified with green circles, with numbers indicating the temperatures in °C. Sanders wrote “The headline image of west London shows the Battersea site (Class 5 just west of Brixton) reading 9°C whilst simultaneously Kew Gardens (Class 2, north of Richmond) was reading 5 degrees colder at just 4°C. Moving further west, Heathrow (Class 3) rises again by 3 degrees to 7°C.” The class number represents the siting of the station, with class 1 meaning natural surroundings and class 5 the worst siting with urbanized surroundings. “It is beyond any reasonable debate that these above variations are not in any way representative of natural phenomena. Clearly these represent the known Urban Heat Island effects identified by official research. The sudden temperature increase surrounding Heathrow demonstrates the result of locating one of the world’s busiest and largest international airports there.” Several other examples are presented showing large temperature variations corresponding to the class ratings. The implication of the analysis is that the official temperature record is corrupted by the urban heat island effect.
CliSci # 425 2025-02-18
The Ronne Ice Shelf Survived the Last Interglacial
The projected melting of the West Antarctic Ice Sheet (WAIS) is the largest cause of uncertainty in long-term sea-level rise. The range of projections of future sea-level contributions from Antarctica for a given climate scenario is large. A new study published in Nature used the Skytrain Ice Rise (SIR) ice core data to show that the Ronne Ice Shelf was “close to its current extent” during most of the last interglacial (LIG) around 125,000 years ago. The sea level was several metres higher during the LIG than today. Antarctica and the Southern Ocean were warmer than today. SIR is an ice cap, independent from the main body of WAIS, that sits at the landward margin of the Ronne Ice Shelf adjacent to WAIS. SIR is expected to have remained an independent ice cap throughout the last glacial cycle because it is separated from the main ice sheet. Sodium (Na) is used as a sea salt indicator in the ice core record. Sea salt is transported and deposited to ice sheets from the saline surfaces of sea ice and the open ocean. To reach SIR, sea salt must travel from the ice front across the non-marine Ronne Ice Shelf, whose front is now 680 km from the drill site. Therefore, the salt concentration in the ice core is a proxy for the distance from the ice shelf edge to SIR.
The paper’s abstract says “Water isotope data are consistent with a retreat of WAIS, but seem inconsistent with more dramatic model realizations in which both WAIS and the large Antarctic ice shelves were lost. This new constraint calls for a reappraisal of other elements of the LIG sea-level budget. It also weakens the observational basis that motivated model simulations projecting the highest end of projections for future rates of sea-level rise to 2300 and beyond.”
Climatic Model Precipitation Simulations are Unreliable
A new paper by Demetris Koutsoyiannis discusses several metrics for measuring the appropriateness of a model by testing models against reality. The paper compares precipitation simulations of 37 climate models against reality as represented by the ERA5 reanalysis, where ERA refers to ECMWF ReAnalysis. Graphs were constructed for the period 1940 to 2023 separately for the Northern Hemisphere (NH) and the Southern Hemisphere (SH) on annual and 8-year average precipitation rates. The graphs below show the 8-year averages of the climate model precipitation (thin lines) compared to the ERA5 data (thick lines) for the NH (left) and the SH (right).
The graphs show the large bias of the models, which is mostly negative for the NH and mostly positive for the SH. The paper says “Different models have largely different biases, which in most of them are very large. The large bias in precipitation certainly reflects the inappropriate modeling of the physical processes related to the hydrological cycle, starting with latent heat and evaporation. Nonetheless, Figure 7 shows that on a hemispheric basis, there is a correlation between models and reality, with an average of 0.31 for the NH and 0.11 for the SH. An interesting property is that each model’s precipitation at the NH is negatively correlated to that of the same model for the SH, with an average correlation of −0.61 for zero lag. This model property, however, does not correspond to reality: if this correlation is estimated from the ERA5 data, it is practically zero (−0.03).” The precipitation simulated by the climate models does not agree with reality on the annual scale, but there is some improvement on larger time scales on a hemispheric basis. “[W]hen the areal scale is decreased from hemispheric to continental, i.e., when Europe is examined, the model performance is poor even at large time scales.” Therefore climate models are not useful for hydrological purposes.
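The paper's model-reality comparison rests on Pearson correlations between hemispheric precipitation series. A minimal sketch of that diagnostic, on synthetic stand-in series rather than the actual CMIP or ERA5 data:

```python
import numpy as np

# Synthetic stand-ins for a model's NH precipitation series and the ERA5
# reference series, sharing a common signal plus independent noise.
rng = np.random.default_rng(0)
n = 84  # 1940-2023, annual values
common = rng.normal(size=n)                 # shared climate signal
era5_nh = common + 0.5 * rng.normal(size=n)
model_nh = 0.6 * common + rng.normal(size=n)

# Pearson correlation, as reported per model and hemisphere in the paper
# (averages of 0.31 for the NH, 0.11 for the SH across the 37 models).
r = np.corrcoef(model_nh, era5_nh)[0, 1]
print(round(r, 2))
```

The same `np.corrcoef` call on a model's NH series against its own SH series is what exposes the spurious −0.61 inter-hemispheric anticorrelation the paper describes, which is absent (−0.03) in ERA5.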
Biases in Snowfall and Cloud Phase in ERA5 and CMIP6 using Satellite Observations
This new study shows that a major reanalysis dataset and climate models do a poor job at simulating snowfall and clouds. Super-cooled liquid-containing clouds (sLCC) and snowfall from these clouds play a major role in Earth's radiative budget. The study found that the ERA5 reanalysis and 10 CMIP6 climate models consistently overestimate the frequency of sLCC and of snowfall from sLCC compared to CloudSat–CALIPSO satellite observations. The abstract says “The biases are very similar for ERA5 and the CMIP6 models, which indicates that the discrepancies in cloud phase and snowfall stem from differences in the representation of cloud microphysics rather than the representation of meteorological conditions.” These sLCCs dominate at latitudes higher than 45° and cover up to 20%–30% of the Earth, depending on the season. Super-cooled water droplets and ice crystals coexist at temperatures between −40 and 0 °C. A cloud with more liquid reflects more solar radiation back to space. Therefore, representing the cloud phase correctly in climate models has substantial implications for the estimate of the cloud feedback and future climate projections.
Two satellites, CloudSat and CALIPSO, have provided global estimates of cloud properties and snowfall since 2006. Both ERA5 and the CMIP6 mean overestimate the fraction of sLCC during all seasons. The most significant discrepancy is observed during boreal spring, with area-weighted-average differences between CloudSat–CALIPSO and ERA5 and the CMIP6 model mean of −11% and −14%, respectively. Interestingly, models with more sophisticated microphysics schemes do not necessarily perform better. In contrast to the CloudSat–CALIPSO observations, both ERA5 and the CMIP6 model mean have much higher frequencies of snowfall from sLCC (values > 60%). In regions where all three datasets have sLCCs, ERA5 and the CMIP6 model mean overestimate the frequency of snowfall from sLCC by ∼50%. This indicates that ERA5 and the CMIP6 models produce snowfall much more frequently from sLCCs than observed.
Extended Crop Yield Data Do Not Support Upward SCC Revision
The Biden Administration raised its Social Cost of Carbon (SCC) estimate about fivefold based in part on projections of declining crop yields using a dataset published in 2014. The dataset didn’t include all relevant variables, notably changes in CO2, in about half of the 1722 records. Dr. Ross McKitrick re-examined the data and recovered 360 records with CO2 changes, increasing the sample size from 862 to 1222. McKitrick used multivariate regression modeling on the larger dataset and found that global average yield changes are positive out to 5 °C of warming from 1900. The variables required for the analysis are changes in crop yield, CO2 levels, temperature and precipitation for several crop types. The agricultural component of the upward-revised SCC was based on an analysis of the data by Moore et al. 2017 (M17). That analysis failed to include the CO2 fertilization effect in almost half of the records.
The M17 analysis imposed overly severe diminishing marginal returns on CO2 fertilization. M17 presented graphs of yield effects from temperature change alone, excluding the offsetting effects of CO2 fertilization and adaptation. CO2 fertilization increases crop yields and improves crops’ water use efficiency. Farmers adapt to warming by changing to heat-resistant crop strains. The negative temperature effects are fully offset by gains from CO2 fertilization and adaptation. McKitrick wrote “Overall I conclude that climate change-related agricultural damage estimates in M17 are too pessimistic and the large implied revisions to the SCC are unsupported.”
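The kind of multivariate regression described above can be sketched as follows. The data, coefficients and error structure here are invented for illustration; McKitrick's actual specification and the yield dataset are not reproduced.

```python
import numpy as np

# Synthetic records mimicking the structure of the analysis: yield change
# regressed on temperature, precipitation and CO2 changes. All numbers are
# made-up illustrations, not the real crop data.
rng = np.random.default_rng(1)
n = 1222  # sample size after recovering records with CO2 changes
d_temp = rng.normal(0, 1, n)       # temperature change (deg C)
d_precip = rng.normal(0, 10, n)    # precipitation change (%)
d_co2 = rng.normal(50, 20, n)      # CO2 change (ppm)

# Assumed "true" effects for the sketch: warming hurts yields,
# CO2 fertilization and extra precipitation help.
d_yield = -2.0 * d_temp + 0.1 * d_precip + 0.06 * d_co2 + rng.normal(0, 1, n)

# Ordinary least squares via the normal equations.
X = np.column_stack([np.ones(n), d_temp, d_precip, d_co2])
beta, *_ = np.linalg.lstsq(X, d_yield, rcond=None)
print(np.round(beta, 2))  # intercept, temperature, precipitation, CO2 effects
```

The point of including the CO2 regressor is visible here: dropping `d_co2` from `X` would push its positive effect into the other coefficients and the residual, biasing the apparent temperature damage, which is the core of McKitrick's critique of M17.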
A Fire Deficit Persists Across Diverse North American Forests
This new study evaluated the burned area across North American forests in recent decades compared to pre-1880 historical times. The study found that a widespread fire deficit persists in recent decades across a range of forest types compared to historical (pre-1880) fire regimes. The authors used the North American tree-ring fire-scar network (NAFSN), a multi-century record comprising >1800 fire-scar sites spanning diverse forest types, together with contemporary burned-area data, to determine the fire deficit. Despite recent increases, burned areas are not unprecedented in the multi-century perspective offered by fire-scarred trees.
The paper says that wildland fire was common and widespread across many forests and woodlands in North America prior to the late 19th and early 20th centuries. In subsequent decades the practice of preventing and suppressing nearly all wildland fires resulted in a large decrease in average annual burned area up to the 1980s. The burned area has increased since the mid-1980s in parts of North America. Fire proxies from charcoal preserved in lake sediments and bogs suggest that there is considerably less area burned in the 20th century compared with previous centuries. Tree-ring fire-scar records provide annually resolved, site-specific information on fire occurrence. All analyses were conducted across NAFSN sites in the United States and Canada and by ecoregion forest types.
Based on the historical fire-scar record, NAFSN sites collectively would be expected to have burned 4346 times from 1984–2022, yet they burned 989 times, or only 23% of what would be expected under the historical fire regime. In all ecoregions except the Taiga & Hudson Plain, NAFSN sites exhibit a statistically significant fire deficit from 1984–2022. A statistically significant fire surplus was observed from 1984–2022 in the Taiga & Hudson Plain ecoregion, which is the forest region of northern Canada. Individual years with particularly widespread fire during the 1984–2022 period were not unprecedented in comparison with the active fire regimes of the historical period across most of the study region. In many western North American forests, particularly those represented by the tree-ring fire-scar sites analyzed, heavy fuel loads and increased fuel continuity have developed because of fire suppression, thereby increasing fire severity (the proportion of trees killed by fire) when forests inevitably burn.
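The headline deficit figure is simple arithmetic on the two counts reported in the study:

```python
# Checking the fire-deficit ratio reported in the study: NAFSN sites burned
# 989 times during 1984-2022 versus 4346 expected under the historical
# (pre-1880) fire regime.
expected = 4346
observed = 989
ratio = observed / expected
print(f"{ratio:.0%}")  # 23%
```

That is, recent burning at these sites is less than a quarter of what the historical fire regime would predict, consistent with the deficit described above.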
CliSci # 424 2025-02-02
Rising Carbon Dioxide is Making the World's Plants more Water-wise
This study finds that as plants grow faster in response to rising carbon dioxide (CO2), they aren’t using more water to do it. This is excellent news, as it shows that economic models used to justify carbon taxes are wrong to forecast economic costs from changes in water availability. This article about the study says “The globe is greening as plants grow faster in response to rising carbon dioxide. But a new analysis shows they aren't using more water to do it.” The study reports that land plants are absorbing 17% more CO2 from the atmosphere than 30 years ago but are not using more water. The study says “Our confirmation of a global trend of increasing water use efficiency is a rare piece of good news when it comes to the consequences of global environmental change. It will strengthen plants’ vital role as global carbon sinks, improve food production, and might boost water availability for the well-being of society and the natural world. … We found that across the globe, boreal and tropical forests are particularly good at increasing ecosystem water use efficiency and uptake of CO₂. That is due in large part to the CO₂ fertilisation effect and the increase in the total amount of leaf surface area.”
Truths about Coral Bleaching: CO2 Warming vs Reduced Cloud Cover?
Jim Steele explains in this article that reduced cloud cover is the primary cause of coral bleaching. Coral reefs evolved 240 million years ago when they formed a marvelously fluid symbiotic relationship with symbiodinium algae. At that time global average temperatures were about 14 °C higher than now. The algae symbionts absorbed CO2 and produced sugars via photosynthesis for the coral. In turn, coral respiration produced CO2 that supported the algae’s photosynthetic production. Shallow-water corals are now restricted to the warmest ocean waters. Increased sunlight produces excessive, dangerous Reactive Oxygen Species (ROS) like hydrogen peroxide during the light reactions of photosynthesis. Thus, many corals threatened by increasing ROS have adopted the survival strategy of ejecting their mal-adapted algae (i.e. bleaching). Corals can re-absorb those algae later when the extreme light conditions subside, or absorb new symbiodinium algae adapted to the higher light conditions. The major factor affecting solar insolation is changing cloud cover associated with El Niño–La Niña oscillations. Changes in the sun’s insolation primarily drive bleaching by increasing ROS production during photosynthesis, along with solar heating. In 2022 the Australian Institute of Marine Science (AIMS) reported the highest levels of coral cover across two-thirds of the Great Barrier Reef (GBR) in over 36 years.
Climate Change Isn’t Disrupting the Polar Vortex
Numerous mainstream news outlets have falsely claimed that the January record cold in the United States was caused by climate change disrupting the polar vortex, allowing cold air to break out of the Arctic into mid-latitudes. This article shows there is no solid evidence to support these claims. The polar vortex is a low-pressure system in the stratosphere at 10 km to 50 km altitude that forms over the polar regions during the winter. Its strength is influenced by natural variability like the North Atlantic Oscillation and ENSO. A change in the polar vortex can cause a similar change in the polar jet stream, which is in the troposphere at 8 km to 14 km above the surface. However, many Arctic cold air outbreaks happen without any change to the polar vortex, and sometimes a disrupted polar vortex causes no change to lower-troposphere weather. There is no consistent trend in the frequency or intensity of polar vortex disruptions over the past several decades. The article states “Cold outbreaks similar to the current one have been documented regularly over the past two centuries, at least, including the brutally cold winters of the late 19th and mid-20th centuries.” A recent study found no statistically significant increase in jet stream waviness or meandering in recent decades. Climate models can’t reliably predict polar vortex behaviour in a warming world; some models predict warming will lead to a stronger polar vortex and others predict the opposite.
Strong to Violent Tornadoes Frequency Decline 59% over 69 Years
Over the years, the ability to observe and measure tornadoes has greatly increased. NOAA said “If a tornado occurs in a place with few or no people, it is not likely to be documented. Many significant tornadoes may not make it into the historical record since Tornado Alley was very sparsely populated during the 20th century. Today, nearly all of the United States is reasonably well populated, or at least covered by NOAA’s Doppler weather radars. Improved tornado observation practices have led to an increase in the number of reported weaker tornadoes …” The Enhanced Fujita (EF) scale is a scale for rating tornado intensity, based primarily on the damage tornadoes inflict on human-built structures and vegetation. NOAA doesn’t publish graphs on its website of tornado counts by EF-scale rating. It only shows the count of all tornadoes, including the very weak EF0 and EF1, giving a very misleading impression of increasing tornado frequency. EF0 tornadoes have wind speeds as low as 100 km/h. The linear trends of reported EF0, EF1 and EF2 tornadoes from 1955 to 2024 are +95, +26.5 and -16.3, respectively.
A graph of strong to violent (EF3-5) tornadoes from 1955 to 2024 is shown below. The best-fit linear trend shows a significant decline of -53%. The tornado count declines by 5.3 tornadoes per decade. The 1955-2023 data are from NOAA's Storm Prediction Center; the 2024 data are preliminary.
This stacked bar graph shows the EF3 (220–265 km/h), EF4 (270–320 km/h) and EF5 (> 320 km/h) tornado counts. The declining linear trends of the intensity categories are: EF3 -4.04, EF4 -1.12, and EF5 -0.17 tornadoes/decade.
Detected Surface Solar Radiation Brightening in Europe
This paper used satellite data to find that Europe has experienced an increase in surface solar radiation, termed “brightening,” since the 1980s. Analyzing 61 locations distributed across Europe from 1983 to 2020, the study identifies aerosols as the key driver of the solar brightening over that period. Cloud effects exhibit spatial variability, inducing a negative effect on surface solar radiation over the same period. In the period 2001–2020, aerosol effects are much smaller, while cloud effects dominate the observed brightening (2%/decade–5%/decade). The study therefore finds a substantial decrease in cloudiness over Europe in the first two decades of the 21st century.
This study compared various satellite surface solar radiation products against in-situ measurements over Europe. All products show an increase in surface solar radiation, or brightening, over the last 40 years over Europe. The brightening trends differ mostly due to different aerosol modeling approaches used by each product. Both the reduction of aerosols and the reduction of cloud cover since 2001 caused solar brightening in Europe, contributing to global warming.
Ottawa’s “Net Zero” emission-reduction plan will cost Canadian workers $8,000 annually by 2050
A new study by Dr. Ross McKitrick for the Fraser Institute provides an outlook through 2050 of Canada’s path to net zero by answering two questions: will the Government of Canada’s current Emission Reduction Plan (ERP) get us to net zero by 2050, and if not, is it feasible for any policy to get us there? First, a simulation of the ERP extended to 2050 results in emissions falling by approximately 70% relative to where they would otherwise be, but still falling short of net zero. The costs are huge: real GDP declines by 7%, income per worker drops by 6%, and the annual cost per worker exceeds $8,000. Second, with a carbon tax of $400 per tonne, emissions decrease by 68%, but tripling the tax to $1,200 per tonne achieves only an additional 6% reduction. At this level the economic impacts are severe: GDP would shrink by 18%.
Correction to CliSci # 423
The article “NASA Space Mission Shows Canada Net Removal of CO2 is the World’s Largest” in CliSci # 423 had an incorrect date range of the data in two sentences and the table heading. The correct date range is 2015-2020. The fifth sentence should read “So Canada caused a net CO2 removal of 960 MtCO2/yr from the atmosphere over 2015-2020.” This was corrected on the website.
CliSci # 423 2025-01-21
NASA Space Mission Shows Canada Net Removal of CO2 is the World’s Largest
An international team of scientists has calculated the net CO2 exchanges with the atmosphere of 187 countries, including the biomass exchanges due to photosynthesis and fires. The estimates are based on both in situ CO2 observations from NOAA and NASA's Orbiting Carbon Observatory-2 satellite data. The average 2015-2020 Canadian CO2 emissions from fossil fuels and cement production were 620 megatonnes CO2 per year (MtCO2/yr). However, Canada's biosphere absorbed a net 1,580 MtCO2/yr, which is 2.5 times our CO2 emissions. So Canada caused a net CO2 removal of 960 MtCO2/yr from the atmosphere over 2015-2020. Canada's net CO2 removal from the atmosphere is significantly greater than any other country's. By contrast, China caused a net CO2 addition of 7,860 MtCO2/yr over the same period. The table below shows the CO2 emissions net of biomass CO2 absorption for the countries with the three most negative and three most positive net emissions.
Country | Net CO2 Emissions, ave. 2015-2020 (MtCO2/yr)
Canada | -960
Russia | -793
Peru | -793
India | +1,836
USA | +3,478
China | +7,860
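Canada's entry in the table follows directly from the two figures quoted in the article:

```python
# Reproducing the Canadian net-CO2 arithmetic from the article
# (2015-2020 averages, in MtCO2/yr).
emissions = 620    # fossil fuel and cement production emissions
absorption = 1580  # net biosphere CO2 uptake
net = emissions - absorption
print(net)  # -960, i.e. a net removal of 960 MtCO2/yr
print(round(absorption / emissions, 1))  # 2.5 times emissions
```

A negative net value means the country removes more CO2 from the atmosphere than it emits, which is why Canada tops the removal side of the table.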
Glacier Experts Uncover Critical Flaw in Sea-Level Rise Predictions
A new study shows temperate glacier ice flows more steadily than previously thought, leading to lower projections of sea-level rise. A co-author of the study explains that glaciers contain two types of ice: some ice, called temperate ice, is at its pressure-melting temperature and is soft and watery, like an ice cube melting on a kitchen counter, while other parts of glaciers have cold, hard ice, like an ice cube in a freezer. The hard-ice characteristics are used as the basis of glacier flow models and forecasts. The new paper describes lab experiments that suggest a parameter of a glacier flow equation should be changed for temperate ice. The ice experiments used a ring-shear device with a hydraulic press that can apply 100 tons of force to the ice to simulate the weight of a glacier 800 feet thick. Ice is temperate near the bottoms and edges of the fastest-flowing parts of ice sheets and in fast-flowing mountain glaciers. Previous experiments were mostly done on hard, cold ice, which showed a power-function relationship between ice stress and its deformation rate. Temperate ice is linear-viscous over common ranges of stress expected near glacier beds. The paper’s abstract says “This linearity is likely caused by diffusive pressure melting and refreezing at grain boundaries and could help to stabilize modeled responses of ice sheets to shrinkage-induced stress increases.” Using the new values will lead to significantly lower projections of glacier flow under global warming, and hence lower forecasts of sea-level rise.
Northern Hemisphere Hurricane Intensity Significantly Declines in 2024 despite Alarmist Hype
The Northern Hemisphere data for the 2024 hurricane season show that the accumulated cyclone energy (ACE) was 455 as compiled by Colorado State University, which is only 78.6% of the 30-year 1991-2020 average of 579, as shown in this graph. The ACE accounts for the intensity and duration of all tropical storms and hurricanes. The highest ACE since 1990 was 880 in 1992, nearly double the 2024 value. This webpage shows that there are no significantly increasing trends in the frequency of hurricanes, major hurricanes or hurricane days. Larry Hamlin wrote in this article “These outcomes clearly dispute and invalidate decades long flawed and contrived climate alarmists claims that these global storms are growing stronger and more intense because of man-made climate change.”
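The 78.6% figure quoted above is just the ratio of the two ACE values:

```python
# Checking the ACE comparison: 2024 Northern Hemisphere ACE of 455
# versus the 30-year 1991-2020 average of 579.
ace_2024 = 455
ace_avg = 579
print(f"{ace_2024 / ace_avg:.1%}")  # 78.6%
```

ACE itself is a sum over all storms of the squares of six-hourly maximum sustained wind speeds, so a lower total reflects both weaker and shorter-lived storms.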
The Energy Storage Fiasco -- How Soon Will It Be Abandoned?
Several articles were published showing that the amount of battery energy storage needed to make wind and solar work as the main power sources for an electricity grid is so large, and the costs so unaffordable, that battery storage is totally infeasible; see here, here and here. Francis Menton wrote “the amount of energy storage needed to enable a predominantly wind/solar grid to get through a year without hitting a blackout was in the range of 500 to 1000 hours of average electricity usage.” Even the low end of that range requires wind and solar capacity far larger than that required to generate the total annual electricity demand. Lithium-ion batteries have an unfortunate downside: they occasionally catch fire spontaneously. Grid-scale batteries must store thousands of megawatt-hours of electricity, compared to perhaps 100 kWh for an EV. New York State requires 8,500 to 17,000 GWh of battery storage capacity to prevent blackouts of a wind/solar/battery electrical system, but only 1.2 GWh of that has been built. Menton wrote “And yet, between May and July 2023, New York had had three large fires at the grid battery storage facilities.” California requires 15,000 GWh to 30,000 GWh of batteries for its planned wind/solar/battery system. It has 54 GWh of storage built so far. From September 2022 to January 2025 there were three large fires at battery storage facilities. The fire at the Moss Landing facility started on January 16. The fire required a mandatory evacuation of 7,600 acres, or about 12 square miles, involving 1,200 to 1,500 residents. New York City plans to soon start construction of a large battery facility near Midtown Manhattan. A 12-square-mile evacuation zone around this facility would require the evacuation of hundreds of thousands of people. Lithium battery fires release toxic gases that cause severe respiratory distress, skin burns, eye irritation, and even death.
AMOC study: Critical Ocean Current has not Declined in the last 60 Years
A new study by scientists from the Woods Hole Oceanographic Institution found that the Atlantic Meridional Overturning Circulation (AMOC) has not declined in 60 years, indicating that it currently is more stable than expected. The AMOC regulates the Earth’s climate by moving water through the Atlantic Ocean, powered by winds and water density. Previous studies that reported a declining AMOC relied on sea surface temperatures (SST), but the scientists of the new study found that SST doesn’t work as well as initially thought. Using 24 climate models, the scientists found that temperature data did not accurately reconstruct the AMOC. They found that when the AMOC is stronger, more heat is released from the ocean to the atmosphere over the North Atlantic. The best data for surface heat fluxes over the North Atlantic come from reanalysis products. The paper states that air-sea heat flux anomalies in the North Atlantic are tightly linked to the AMOC and that “the decadal averaged AMOC has not weakened from 1963 to 2017.” This strongly suggests that a weakening of the AMOC is unlikely to occur in the foreseeable future.
CliSci # 422 2025-01-06
CLOUD Experiment Resolves Puzzle of New Tropospheric Aerosol Particles
Isoprene emitted by tropical forests may provide a globally important source of aerosol particles that influence clouds, according to a new study by the CLOUD collaboration at CERN. CERN issued a press release about the study. The press release said “Isoprene emitted by tropical forests may provide a globally important source of aerosol particles that influence clouds, according to a new study by the CLOUD collaboration at CERN. Aerosols are microscopic particles suspended in the atmosphere. They play an important role in Earth’s climate system because they seed clouds and influence their reflectivity and coverage. CLOUD comprises a 26 m3 ultra-clean chamber filled with atmospheric gases and a suite of advanced instruments that continuously analyse its contents. A beam of charged pions are fired from CERN’s Proton Synchrotron to mimic the influence of galactic cosmic rays. CLOUD spokesperson Jasper Kirkby said “High concentrations of aerosol particles have been observed high over the Amazon rainforest for the past twenty years, but their source has remained a puzzle until now. Our latest study shows that the source is isoprene emitted by the rainforest and lofted in deep convective clouds to high altitudes, where it is oxidised to form highly condensable vapours. Isoprene represents a vast source of biogenic particles in both the present-day and pre-industrial atmospheres that is currently missing in atmospheric chemistry and climate models.” The CLOUD findings are consistent with aircraft observations over the Amazon, as reported in an accompanying paper. Together, the two papers provide a compelling picture of the importance of isoprene-driven aerosol formation and its relevance for the atmosphere.
Kirkby said “This new source of biogenic particles in the upper troposphere may impact estimates of Earth’s climate sensitivity, since it implies that more aerosol particles were produced in the pristine pre-industrial atmosphere than previously thought”. Joanne Nova discusses the results and implications.
Antarctic Peninsula Warming
My article titled “Sustained Greening of the Antarctic Peninsula” in CliSci # 421 included the sentence “The Antarctic Peninsula is one of the fastest warming places on Earth.” It was based on the first paragraph of the referenced paper. One of the FoS Directors, Ian Cameron, questioned this statement. He noted that the Antarctic land temperature trend, labeled “SoPol-Land” in the UAH lower troposphere temperature record, was only 0.09 °C/decade, which is far less than for any other land region. Using Climate Explorer, I plotted the annual land temperatures of the Antarctic Peninsula (AP) from the Climatic Research Unit (CRU5) surface station data and the University of Alabama in Huntsville (UAH6) lower troposphere satellite land-only data for the years 1980 to 2023.
The graph shows that the CRU station temperature trend is 0.352 °C/decade, while the UAH6 satellite temperature trend is negative at -0.052 °C/decade. There are numerous research stations on the AP, and its population has increased over the years. The population of the Antarctic Peninsula varies seasonally, with around 1,000 to 1,500 people during the summer research peak and 200 to 300 during the winter. I suspect that the urban heat island effect is the primary reason for the large difference between the two temperature trends.
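The trends quoted above come from a simple least-squares fit to annual temperatures, converted to degrees per decade. A minimal sketch of that calculation, using a synthetic series rather than the actual CRU or UAH data (the function name and values are illustrative only):

```python
# Sketch of a decadal trend estimate by ordinary least squares, as one
# might apply to annual series downloaded from the KNMI Climate Explorer.
# The anomaly series below is synthetic, for illustration only.
import numpy as np

def decadal_trend(years, anomalies):
    """Least-squares slope of anomalies vs. years, in degrees C per decade."""
    slope_per_year = np.polyfit(years, anomalies, 1)[0]
    return slope_per_year * 10.0

# Synthetic series with a built-in 0.35 C/decade trend.
years = np.arange(1980, 2024)
anoms = 0.035 * (years - 1980) - 0.2
print(round(decadal_trend(years, anoms), 3))  # 0.35
```

The same fit applied to the CRU station series and the UAH6 satellite series is what yields the 0.352 and -0.052 °C/decade figures above.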
Global Greening by CO2 Fertilization and Climate Change
A new paper uses a robust statistical method to determine greening and browning trends from remote sensing data. This research used the Advanced Very High-Resolution Radiometer (AVHRR) Normalized Difference Vegetation Index (NDVI) data from NOAA's polar-orbiting satellites. The scientists report that over the 42 years (1982 to 2023), the Earth experienced statistically significant vegetation trends over 38% of the land surface. Of this land area, 76% was greening while 24% was browning. Considering only areas with NDVI values above 0.15, greening accounted for 85% of the significant trends and browning for 15%. NDVI values from 0.1 to 0.5 indicate sparse to moderate vegetation, and values above 0.6 indicate dense green vegetation. The paper says “Although vegetation greening has been reported on all continents, it is particularly pronounced in Eurasia, including regions of Europe and China.” The authors claim that their significant-trends methodology eliminates spurious trends and provides a more accurate assessment than previous assessments. The abstract says “These findings strongly validate the ongoing global greening of vegetation.” Kenneth Richards provides a summary of the paper. He wrote “In other words, greening trends dominate over browning trends at a ratio of about 4:1, or 80% to 20%.”
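As a back-of-envelope check of these percentages, using only the figures quoted above: the 76%/24% split is roughly 3.2:1 and the 85%/15% split (NDVI above 0.15) is roughly 5.7:1, so Richards's "about 4:1" is a rounding that sits between the two. The shares of total land area follow by simple multiplication:

```python
# Arithmetic check using only the percentages quoted in the text.
significant_fraction = 0.38              # share of land with significant trends
greening_split, browning_split = 0.76, 0.24  # split within that significant area

greening_of_total = significant_fraction * greening_split  # ~0.289 of all land
browning_of_total = significant_fraction * browning_split  # ~0.091 of all land
ratio = greening_split / browning_split                    # ~3.2 : 1
print(round(greening_of_total, 3), round(browning_of_total, 3), round(ratio, 1))
```

So roughly 29% of all land shows significant greening and 9% significant browning; the remaining 62% shows no statistically significant trend.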
Large Antarctic Calving Events: Nothing Unusual
Many media outlets posted stories in 2017 about iceberg A-68 calving off of Antarctica’s Larsen C Ice Shelf. CNN said we should be freaking out about it because of climate change. A new study says large calving events are neither abnormal nor cause for worry. The study’s abstract says that the authors used “47 years of iceberg size from satellite observations. Our analysis reveals no upward trend in the surface area of the largest annual iceberg over this time frame. This finding suggests that extreme calving events such as the recent 2017 Larsen C iceberg, A-68, are statistically unexceptional and that extreme calving events are not necessarily a consequence of climate change.” A key point of the study is “There is no upward trend in the surface area of Antarctica's annual maximum iceberg between 1976 and 2023.” The analysis shows that extremely large calving events are likely typical of a healthy ice sheet system. There exists a quasi-stable cycle of calving front advance and retreat. The lack of an upward trend in annual maximum iceberg area could be attributed to an overall increase in the number of smaller calving events, which may inhibit the development of extremely large calving events.
Anthony Watts wrote “In other words, the media made a big ado about nothing. Will this new study by MacKie et al. disproving the climate alarm noise in 2017 get a lot of press? Probably not. It doesn’t fit the sensationalistic narrative of pending climate doom promoted by the media. They’d just as soon sweep this inconvenient truth under the rug than admit they weren’t just wrong, but wildly so.”
2024 Sets New Record for Warmest Year in Satellite Era (Since 1979)
Dr. Roy Spencer published a graph showing the global annual average lower troposphere temperatures ranked from warmest to coolest. According to the UAH6.1 satellite data, the 2024 temperature was 0.34 °C higher than in 2023. Spencer wrote “As seen in the following ranking of the years from warmest to coolest, 2024 was by far the warmest in the 46-year satellite record averaging 0.77 deg. C above the 30-year mean, while the 2nd warmest year (2023) was +0.43 deg. C above the 30-year mean. [Note: These yearly average anomalies weight the individual monthly anomalies by the number of days in each month.]” The 2023 temperature was 0.040 °C higher than the 3rd ranked year of 2016.
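Spencer’s bracketed note describes weighting the monthly anomalies by the number of days in each month when forming the annual average. A small sketch of that weighting, using placeholder values rather than actual UAH data (`annual_anomaly` is a hypothetical helper, not Spencer’s code):

```python
# Day-weighted annual mean of 12 monthly anomalies, per Spencer's note.
# Monthly values are placeholders, not actual UAH data.
import calendar

def annual_anomaly(year, monthly_anomalies):
    """Weight each monthly anomaly by its month's length (handles leap years)."""
    days = [calendar.monthrange(year, m)[1] for m in range(1, 13)]
    total_days = sum(days)  # 366 in a leap year such as 2024
    return sum(a * d for a, d in zip(monthly_anomalies, days)) / total_days

# With a constant monthly anomaly, the weighted mean equals that constant.
print(round(annual_anomaly(2024, [0.77] * 12), 2))  # 0.77
```

The weighting matters only slightly in practice, since month lengths differ by at most three days, but it makes the annual figure an exact average over days rather than over months.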
Why was 2024 so warm? It can’t be due to increasing greenhouse gas concentrations because they cause only a gradual upward creep of global temperatures. According to NOAA, the total GHG forcing has increased on average over the last four years by 0.040 W/m2 per year, which should cause a global warming of 0.016 °C/yr assuming a climate sensitivity of 2.0 °C per doubling of CO2. ENSO is the most common cause of large yearly variability, but this graph of the multi-variate ENSO index shows that 2024 started with a small El Niño and ended with a small La Niña, so ENSO had little effect on the 2024 average temperature. The large temperature rise in 2024 was very likely due to the eruption of the Hunga Tonga-Hunga Ha'apai (HTHH) volcano, as shown by this paper. The paper “reveals that surface temperatures across large regions of the world increase by over 1.5 °C for several years.”
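The conversion from forcing rate to warming rate amounts to multiplying by a climate sensitivity parameter $\lambda$ (warming per unit forcing); the value of $\lambda$ below is simply the one implied by the two figures quoted above, not a number quoted from NOAA:

```latex
\frac{\Delta T}{\Delta t} \approx \lambda\,\frac{\Delta F}{\Delta t},
\qquad
\lambda \;=\; \frac{0.016\ {}^{\circ}\mathrm{C/yr}}{0.040\ \mathrm{W\,m^{-2}/yr}}
        \;=\; 0.4\ {}^{\circ}\mathrm{C}\ \text{per}\ \mathrm{W\,m^{-2}}
```

Either way, the implied greenhouse-gas contribution is on the order of hundredths of a degree per year, far too small to explain a 0.34 °C jump in a single year.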