Climate Change Perceptions

I have been meaning to write this post for a long time because I think there is an important, and generally unrecognized, distinction between the aspects of climate change that could potentially be affected by reducing GHG emissions and those that could not.  I postponed this article because I did not want to try to explain the driving factor behind my concern – ocean and atmospheric oscillations.  Andy May is a petrophysicist whose climate blog recently published 14 articles about atmospheric oscillations, and I draw on that series in this post.

I am convinced that implementation of the New York Climate Leadership & Community Protection Act (Climate Act) net-zero mandates will do more harm than good if the future electric system relies only on wind, solar, and energy storage because of reliability and affordability risks.  Moreover, I take the heretical position that the causes of climate change are not understood well enough to support the idea that reducing GHG emissions represents sound policy.  I have been a practicing meteorologist for nearly 50 years, was a Certified Consulting Meteorologist, and have B.S. and M.S. degrees in meteorology.  The opinions expressed in this post do not reflect the position of any of my previous employers or any other organization I have been associated with; these comments are mine alone.

Background

Weather and climate are often confused.  According to the National Oceanic and Atmospheric Administration’s National Ocean Service “Weather reflects short-term conditions of the atmosphere while climate is the average daily weather for an extended period of time at a certain location.”  They go on to say: “Climate is what you expect, weather is what you get.” 

The standard climatological averaging period is 30 years.  It is important to understand that programs like the Climate Act’s GHG emission reduction targets are intended to reduce global warming over time scales longer than 30 years.  Statements suggesting that, even if aggressive mitigation reduces greenhouse gases, temperatures will still increase for 20-30 years due to inertia in the climate system are based on the premise that CO2 is the control knob for the climate.

I often hear, and have noticed myself, that “winters aren’t what they used to be” and that leaves are turning color later than in the past.  The goal of this article is to show that there are climatic oscillations with time periods greater than 30 years that are likely causing these perceived examples of climate change.  However, I will show that those observations provide no support for the Climate Act as a potential reason to reduce GHG emissions in hopes of changing them.

Climate Oscillation Analysis

Earlier this year Andy May published 14 articles about climate oscillations in the oceans and atmosphere.  I think his analysis is notable because it is data-driven.  The basis of his analysis is articles describing observed oceanic and atmospheric changes, not modeled simulations.  Given the complexity of the interactions between the oceans and the atmosphere and the poor understanding of their relationships, assuming that modeled simulations are credible is not reasonable.

His articles provide compelling evidence that each of the 14 oscillations is natural.  I believe his work provides sufficient evidence that “each oscillation is natural and has been around since the pre-industrial period, or even earlier, and thus is natural and not random variability.”  This is important relative to claims that reducing GHG emissions will affect global temperatures.

May’s work consists of a statistical regression analysis of observed features in the oceans and atmosphere that have occurred over many years.  He uses the HadCRUT5 global average temperature data set used by the Intergovernmental Panel on Climate Change (IPCC) to track global warming in his analyses.  May offers the following caveat about his work.

Finally, this is a regression analysis to predict HadCRUT5 with climate oscillations to try and detect the climate oscillations that best correlate to “global warming.” This is not a climate model, it is not an attempt to make a climate model, it is only a statistical exercise. Statistics and statistical analysis are not proof of anything, it isn’t even scientific analysis, they are just useful tools to sort through datasets. Just as AI is not intelligent, statistics is not science, both are useful tools.
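As an illustration of the kind of statistical exercise May describes, here is a minimal sketch of an ordinary least squares regression on synthetic data.  The oscillation series and their weights below are invented purely for illustration; May’s actual inputs are the observed AMO, WHWP, and SAM records and the HadCRUT5 temperature series.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic stand-ins for monthly oscillation indices; the real inputs
# would be observed records such as the AMO, WHWP area, and SAM.
n = 600  # 50 years of monthly values
amo = np.sin(np.linspace(0, 2 * np.pi * 50 / 64, n))  # invented ~64-year cycle
whwp = rng.normal(size=n).cumsum() / 50               # slow random walk
sam = rng.normal(size=n)

# A made-up "GMST" that is a linear mix of the indices plus noise.
gmst = 0.6 * amo + 0.3 * whwp + 0.1 * sam + rng.normal(scale=0.1, size=n)

# Ordinary least squares: find the coefficients that best predict GMST.
X = np.column_stack([np.ones(n), amo, whwp, sam])
coef, *_ = np.linalg.lstsq(X, gmst, rcond=None)

# R^2 is the fraction of GMST variance the regressors explain, the same
# kind of number as the 77% figure quoted for AMO + WHWP + SAM.
pred = X @ coef
r2 = 1 - np.sum((gmst - pred) ** 2) / np.sum((gmst - np.mean(gmst)) ** 2)
print(f"R^2 = {r2:.2f}")
```

The R² value here plays the role of the “fraction of HadCRUT5 variability explained” quoted later for the top three oscillations.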

Climate Oscillations

May’s work was published as a series of posts.

In Climate Oscillations 1: The Regression, May provides the following table that lists the oceanic and atmospheric oscillations considered in his series of articles.  For each of these oscillations he did a statistical regression analysis.  The first seven of the oscillations correlated with the GMST measured using HadCRUT5.  May points out that “HadCRUT5 is not representative of global climate, it is just an average temperature”.  Nonetheless, it is the primary climate change parameter.  The rationale for the Climate Act uses climate change and global warming interchangeably.

May Table 1. A list of the climate oscillations discussed and analyzed in this series. The first eight oscillations are listed in order of importance in modeling HadCRUT5; the remaining six did not add to the model. The links in this table will not work; to see the list in a spreadsheet with working links, download it here.

I am not going to review each post in this article, but I will describe several of the oscillations.  For readers who do not want to read all the articles and are content with a summary, I asked Perplexity AI to review his work.  It notes:

The series begins with a foundational regression analysis that ranks fourteen major climate oscillations by their statistical correlation with HadCRUT5 global surface temperature. May’s analysis reveals that the top three oscillations—the Atlantic Multidecadal Oscillation (AMO), Western Hemisphere Warm Pool (WHWP) area, and Southern Annular Mode (SAM)—together explain 77% of HadCRUT5 variability since 1950. This finding directly contradicts the IPCC’s characterization of these oscillations as unpredictable “internal variability” with minimal influence beyond a few years.

The Atlantic Multidecadal Oscillation (AMO) has the most significant relationship with global mean surface temperature (GMST).  There are several definitions based on different measurements.  For example, Gray et al. use detrended raw tree-ring measurements to demonstrate “a strong and regular 60-100 year variability in basin-wide (0-70°N) North Atlantic sea surface temperatures (SSTs) that has been persistent for the past five centuries.”

The general approach used by May is simple.  Figure 4 plots GMST using the HadCRUT5 data and the AMO parameter using the HadSST 4.1 data.  It is obvious that the two parameters track well.  May used regression analysis to show the strength of the relationship.  Note the variation in global temperature since 1850 shown in this graph.  The first challenge for proponents of the idea that CO2 is the driver of climate change is that even they acknowledge that CO2 has only affected global warming since 1950.  So, what happened before then to cause the observed variations?  I do not think it is reasonable to claim that all the natural drivers that caused variations before 1950 stopped and that global warming became entirely dependent upon CO2 since then, but that is the argument used by Climate Act proponents.

May Figure 4. HadSST and HadCRUT detrended temperature anomalies plotted together. Both anomalies were originally relative to 1961-1990 but here are relative to their respective linear least squares trends. This is updated from figure 2 in (May & Crok, 2024).
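The basic mechanics of May’s comparison, detrending each series and then correlating the residuals, can be sketched as follows.  The two series here are synthetic stand-ins that share an invented 64-year cycle; they are not the actual HadSST or HadCRUT5 data.

```python
import numpy as np

def detrend(y):
    """Remove the linear least-squares trend from a series, as is done
    before comparing the anomaly records in May Figure 4."""
    t = np.arange(len(y))
    slope, intercept = np.polyfit(t, y, 1)
    return y - (slope * t + intercept)

# Synthetic stand-ins: a shared ~64-year oscillation plus different
# linear trends and independent noise in each series.
years = np.arange(1850, 2021)
cycle = 0.2 * np.cos(2 * np.pi * (years - 1878) / 64)
series_a = cycle + 0.005 * (years - 1850) + \
    np.random.default_rng(1).normal(scale=0.05, size=years.size)
series_b = cycle + 0.007 * (years - 1850) + \
    np.random.default_rng(2).normal(scale=0.05, size=years.size)

# After detrending, the common oscillation dominates the correlation.
r = np.corrcoef(detrend(series_a), detrend(series_b))[0, 1]
print(f"detrended correlation r = {r:.2f}")
```

Once the trends are removed, even series with quite different long-term slopes can show how strongly their oscillatory components track each other.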

May points out:

The reason for the AMO SST 60-70-year pattern is unknown, but according to Gray et al. it extends back to 1567AD, so it is a natural oscillation of some kind. Some have speculated that it is a result of the thermohaline circulation in the North Atlantic or a “combination of natural and anthropogenic forcing during the historical era.” (Mann, Steinman, & Miller, 2020). But while interesting these ideas are speculative. Further if the oscillation has existed since 1567, it seems unlikely that it is caused by human CO2 and aerosol emissions.

The AMO has the best correlation with GMST in all the statistical analyses.  Combined with two other oscillations, the Western Hemisphere Warm Pool (WHWP) area and the Southern Annular Mode (SAM), these three explain 77% of HadCRUT5 variability since 1950.

The Western Hemisphere Warm Pool Area (WHWP) is an area of abnormally warm ocean that extends from the eastern North Pacific (west of Mexico, Central America, and Colombia) to the Gulf of Mexico, the Caribbean, and well into the Atlantic during the WHWP peak in August and September.  Because this area is important to hurricane formation, the strength and extent of the warm pool is important.  May points out that the WHWP combined with the Antarctic Oscillation or Southern Annular Mode and the AMO predict GMST well.  He concludes that “This suggests that The North Atlantic and the Southern Hemisphere circulation patterns correlate very well with global climate trends, CO2 may fit in there somewhere, but it must share the spotlight with these natural oscillations.”

The Southern Annular Mode/Antarctic Oscillation (AAO) is defined as the difference in zonal (meaning east-west, or circumpolar average) sea level air pressure between 40°S and 65°S.  This parameter has a powerful influence on global climate and can affect weather in the Northern Hemisphere (Lin, Yu, & Hall, 2025), in particular the Warm Arctic-Cold Eurasian weather pattern that causes a lot of extreme winter weather. The AAO also affects the Indian summer monsoon and other eastern Asia weather phenomena.

Synthesis

The final article in the series, Climate Oscillations 12: The Causes & Significance, addressed the claim by proponents of the Climate Act that “ocean and atmospheric oscillations are random internal variability, except for volcanic eruptions and human emissions, at climatic time scales.”  May explains:

This is a claim made by the IPCC when they renamed the Atlantic Multidecadal Oscillation (AMO) to the Atlantic Multidecadal Variability (AMV) and the PDO to PDV, and so on. AR6 (IPCC, 2021) explicitly states that the AMO (or AMV) and PDO (or PDV) are “unpredictable on time scales longer than a few years” (IPCC, 2021, p. 197). Their main reason for stating this and concluding that these oscillations are not influenced by external “forcings,” other than a small influence from humans and volcanic eruptions, is that they cannot model these oscillations, with the possible exceptions of the NAM and SAM (IPCC, 2021, pp. 113-115). This is, of course, a circular argument since the IPCC models have never been validated by predicting future climate accurately, and they also make some fundamental assumptions that simply aren’t true.

This is a good point to remind readers that small fluctuations in incoming radiation have big impacts on the climate.  The Milankovitch theory is the most widely accepted cause of glaciation.  It states that variations in earth’s orbit and tilt change the amount of sunlight reaching the surface, producing climate fluctuations strong enough to trigger continental glaciers.

May’s analysis finds relationships between similarly small external variations and global surface temperatures.  Note, however, that proponents of CO2 as the control knob disregard all climate drivers but the greenhouse effect.  May explains:

Finally, oscillations are inconsistent with anthropogenic greenhouse gas emissions as a dominant forcing of climate change. Greenhouse gas emissions do not oscillate; recently they have only increased with time. So, we will examine the relationship between solar and orbital cycles and the climate oscillations. As Scafetta and Bianchini (2022) have noted, there are some very interesting correlations between solar activity and planetary orbits, and climate changes on Earth.

May’s final article describes multiple observed oscillations.  These include a period of about 64 ± 5 years (Wyatt et al., 2012); the 20th century “climate shifts” identified by Nathan Mantua and colleagues (Mantua et al., 1997), which correspond to a major multidecadal climate oscillation of 22 to 30 years; and shorter observed oscillations with periods of roughly 2, 5, and 9 years.  Note that there also are other cycles that are longer than these.

The ~64 year oscillation is of particular interest.  Marcia Wyatt’s “stadium wave” hypothesis shows that a suite of global and regional climate indicators vary over roughly the same 64-year period.  Wyatt explains:

“Stadium wave” is an allusive term for a hypothesis of multidecadal climate variability. Sequential propagation of an “audience wave” from one section of sports fans to another in a sports arena – i.e. a “stadium wave” – is analogous to the premise of the climate stadium-wave hypothesis. It, too, involves sequential propagation of a signal. In the case of the climate stadium wave, propagation proceeds sequentially through ocean, ice, and atmospheric systems. Key to signal propagation is network, or collective behavior – a feature ubiquitous throughout natural and man-made systems, a product of time and self-organization.

I think of climate as a product primarily of the climate stadium wave cycle plus contributions from other oscillations.  May explains:

If we define “global climate change” as the observed changes in HadCRUT5 or BEST global mean surface temperature (GMST) as the IPCC does, then the oscillations that correlate best are the AMO and the global mean sea surface temperature (SST) as shown in figure 2. None of the other oscillations correlate well with GMST.

In figure 2, the gray curve is a 64-year cosine function. It fits the 20th century data but departs significantly around 2005 and before 1878. The early departure could be due to poor data, the 19th century temperature data is very bad, see figure 11 in (Kennedy, et al., 2011b & 2011). Data quality problems still exist today, but are much less of a factor and the departure after 2005 is likely real and could be caused by any combination of the two following factors:

  1. Human-emitted greenhouse gases.
  2. The full AMO/world SST/GMST period is longer and/or more complex than we can see with only 170 years of data.

It is probably a combination of the two. As discussed by Scafetta and Stefani, climate, orbital, and solar cycles are known to exist that are longer than 170 years. The fact that I had to detrend all the records shown in figure 2 testifies to that. It is also noteworthy that the ENSO ONI trend since 2005 is trending down; as shown in the last post. So is the current PDO trend. All the notable oscillations are not synchronized, teleconnections or not, climate change is not simple. The trends in figure 2 result from complex combinations of gravitational forces and teleconnections (Scafetta, 2010), (Ghil, et al., 2002), and (Stefani, et al., 2021).
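A fixed-period cosine like May’s gray curve can be fit by ordinary least squares, because a·cos(ωt) + b·sin(ωt) is linear in a and b, so no iterative fitting is needed.  The series below is synthetic, with an invented amplitude and phase, purely to show the mechanics:

```python
import numpy as np

# Synthetic detrended anomaly series: a 0.15-degree, 64-year cosine
# (phase year 1878 is invented) plus noise.
years = np.arange(1850, 2021, dtype=float)
rng = np.random.default_rng(3)
anom = 0.15 * np.cos(2 * np.pi * (years - 1878) / 64.0) \
    + rng.normal(scale=0.05, size=years.size)

# Least-squares fit of a fixed 64-year period: the model
# a*cos(w*t) + b*sin(w*t) is linear in (a, b).
w = 2 * np.pi / 64.0
X = np.column_stack([np.cos(w * years), np.sin(w * years)])
(a, b), *_ = np.linalg.lstsq(X, anom, rcond=None)
amplitude = np.hypot(a, b)  # recovers the cosine amplitude
print(f"fitted 64-year amplitude = {amplitude:.3f}")
```

With 171 annual points the fit recovers the buried 0.15-degree amplitude closely; this is the same idea as overlaying a 64-year cosine on the detrended observations.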

Discussion

May gives a concise summary of the potential human influence that has never been considered by the State of New York:

Whether global warming is a problem or not is in dispute, but it is a fact that the world is warming, and some are concerned about it. What is the cause of the warming? Is it natural warming after the cold winters of the Little Ice Age? Is it caused by human emissions of CO2? Most of the natural ocean and atmospheric circulation oscillations examined in this post are not modeled properly (some say not modeled at all) in current global climate models (Eade, et al., 2022). The IPCC AR6 report admits that the AMO (they call it the “AMV”) signal in the CMIP6 climate models is very weak, specifically on page 506:

“However, there is low confidence in the estimated magnitude of the human influence. The limited level of confidence is primarily explained by difficulties in accurately evaluating model performance in simulating AMV.” (IPCC, 2021, p. 504)

In other words, the models that predict gloom and doom that are used as the rationale that we must reduce New York GHG emissions don’t accurately predict the oscillation that correlates best with global temperatures.  If you cannot model this relationship, then the likelihood that future temperature projections are accurate is zero.

In addition, NYSERDA presentations at meetings consistently attribute the latest extreme weather events to climate change.  Maybe someday I will explain why I think that is completely divorced from reality and only serves to support the narrative that there is an existential threat.  In the meantime, Roger Pielke, Jr. recently eviscerated this line of reasoning and those who continually use it.  He points out that this approach is “counter to the terminology, frameworks, and assessments of the IPCC and the broad base of research on which the work of the IPCC is based upon.”  I strongly recommend his article as definitive proof that the Hochul Administration picks and chooses the “science” to fit their narrative.

Conclusion

The intent of this article was to explain why anecdotal “evidence” of climate change is no more than recognition that there are weather pattern cycles that currently show warming.  It does not mean that there is conclusive evidence that continued GHG emissions will inevitably increase global temperatures.  There is overwhelming evidence that the current warming cycle will eventually reverse.  This does not mean that GHG emissions are not a factor, but it does mean they are a tweak, not the primary driver.  This, combined with the fact that New York GHG emissions are so small relative to global emissions that we cannot meaningfully affect global emissions, means that GHG emission reductions for the sake of the climate are a useless endeavor.

There is No Existential Threat from Climate Change

Anthony Watts has summed up my problems with claims that climate change is an existential threat in a post entitled “Is Climate Change Real? Short Answer: Yes — But It’s Complicated.”  This post reproduces the article with my annotated comments.

I have followed the Climate Act since it was first proposed, submitted comments on the Climate Act implementation plan, and have written over 500 articles about New York’s net-zero transition.  I also am an air pollution meteorologist with bachelor’s and master’s degrees in meteorology and was a Certified Consulting Meteorologist before I retired with nearly 50 years of experience.

Overview

The Climate Act established a New York “Net Zero” target (85% reduction in GHG emissions and 15% offset of emissions) by 2050.  The Climate Leadership & Community Protection Act Section 1. Legislative findings and declaration, subsection 3 defines the alleged threat and goal: “Action undertaken by New York to reduce greenhouse emissions will have an impact on global greenhouse gas emissions and the rate of climate change.”  I have tried to argue against this point many times, but I think Watts has provided a concise, well-documented case that the basic premise that New York can have an effect on the rate of climate change is misplaced.

Is Climate Change Real?

Anthony Watts prepared the post addressing this question because he gets asked this a lot.  His response to the question shows that New York does not need to rush to comply with the aspirational Climate Act schedule and targets set by politicians during the always rushed and hectic New York budget process.  Watts provides a simple primer that makes five key points.  Note that all the bold passages in the following quotes were highlighted by Watts.

1. The Basics: Climate Does Change

His first point cuts to the nub of the problem.  Climate change is real and is always occurring.  That makes it easy for everyone to have an impression that the climate isn’t what it used to be.

First, let’s be clear — climate change is real in the literal sense. The Earth’s climate has been changing for billions of years. We have geological records showing periods that were much warmer (like the Eocene, with crocodiles in the Arctic), and much colder (like the Ice Ages that covered North America in glaciers).

Even more recently, we have the Holocene Climate Optimum, significantly warmer than present day:

Watts explains that there is a nuance to the fact that the climate is changing.  Those nuances are being ignored as he notes:

So, yes — the climate changes, and it always has. The debate isn’t about whether it changes, but why, how fast, and how much humans are influencing it today. The debate is also about how accurately we are able to detect temperature change, plus the overreliance on climate models to predict the future rather than actual data.

2. What the “Consensus” Says — and Where It Falls Short

Folks like me who publicly decry the claim of an existential threat must confront the consensus argument he describes. 

The mainstream position (IPCC, NOAA, NASA, etc.) holds that recent warming — roughly 1.1°C since the late 1800s — is largely due to increased CO₂ from human activity, mainly fossil fuels.

But here’s the rub: this view is heavily dependent on climate models, which are notoriously uncertain.

The fact that the extreme risks claimed are based on models is frustrating because I know the limitations of model projections and they never get mentioned in the mainstream coverage of climate change. The only thing I would add to his remarks is that he could have included many more issues.

As someone with a meteorology background, I can tell you models struggle with cloud feedbacks, ocean cycles, solar variability, and regional forecasts — all of which are crucial to understanding climate.

When models are run backward, they often fail to replicate past climate variability accurately — like the Medieval Warm Period or the Little Ice Age — unless they’re tuned heavily. That calls into question their reliability for long-term projections.

3. Natural Variability: The Elephant in the Room

As Watts explains, natural variability is not understood well.  I think the thing to keep in mind is that this variability is driven in large part by the patterns of the upper air steering currents like the jet stream.  The massive flooding due to Helene in western North Carolina was caused by a rare weather pattern that stalled the storm in one place.  A similar pattern occurred in 1916 so today’s level of CO2 and warming were not the cause.  Unfortunately, we don’t know what caused that pattern or if it was just normal variability.  Watts describes the variability of observed warming:

A lot of warming in the 20th century happened before CO₂ rose sharply post-WWII. For example:

  • The warming from 1910 to 1940 occurred with much lower CO₂ levels.
  • Then there was a cooling trend from the 1940s to 1970s, despite rising CO₂ emissions during that time period.

Clearly, natural factors — like solar cycles, ocean oscillations (PDO, AMO), volcanic activity, and cloud dynamics — are still in play and possibly underestimated in mainstream assessments.

Keep in mind that the consensus says that the recent warming was caused by GHG emissions, but I don’t see any big difference between that warming and the previous one that was “natural”.  We know there are natural factors in play, but we don’t understand them well enough to discern the impact of the greenhouse effect relative to them.

4. The CO₂ Connection: Overstated?

The second complicating factor is that the greenhouse effect is real, and increased CO2 in the atmosphere should also increase warming.  However, as Watts explains, even that fact is conditional on at least one factor rarely mentioned.

CO₂ is a greenhouse gas, no question. But its effect on temperature is logarithmic — meaning, the more CO₂ you add, the less warming you get per unit. The first 100 ppm has the biggest impact, and we’re well past that as seen in the figure below.

Moreover, satellite data from UAH and RSS shows a slower warming trend than surface datasets like HadCRUT or GISS. That discrepancy raises questions about data adjustments, urban heat island effects, and instrument biases.
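The logarithmic relationship Watts mentions is commonly approximated by the Myhre et al. (1998) simplified expression, ΔF ≈ 5.35·ln(C/C₀) W/m².  A quick sketch shows the diminishing forcing from each additional 100 ppm of CO2:

```python
import math

# Commonly cited simplified expression (Myhre et al., 1998) for CO2
# radiative forcing relative to a pre-industrial baseline C0.
def forcing(c_ppm, c0_ppm=280.0):
    return 5.35 * math.log(c_ppm / c0_ppm)  # W/m^2

# Each equal 100 ppm increment adds less forcing than the one before,
# which is what "logarithmic" means in practice.
for c in (380, 480, 580):
    step = forcing(c) - forcing(c - 100)
    print(f"{c - 100} -> {c} ppm adds {step:.2f} W/m^2")
```

A doubling from 280 to 560 ppm gives 5.35·ln(2) ≈ 3.7 W/m², which is why forcing per doubling, rather than per ppm, is the standard way the relationship is quoted.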

I addressed a couple of warming trend issues in two recent articles about measuring temperature trends here and here.  This primer just scratches the surface of the issues.

5. Are We in a Crisis?

Ultimately, the only reason we are being forced to endure the insane transition policies that defy physics, math, and economics is the claimed existential threat.  Watts points out problems with that claim.

Even if we accept that humans are influencing climate, the notion that we’re in an “existential crisis” is unproven. Extreme weather trends (hurricanes, tornadoes, droughts) don’t show clear worsening patterns once you account for improved detection and population growth in vulnerable areas such as coastal developments.

The Intergovernmental Panel on Climate Change (IPCC) agrees, suggesting a “low confidence” in many current and future weather events being affected by climate change. The “existential crisis” view is heavily dependent on climate model projections, which are notoriously uncertain and refuted by data.

Sea level is rising — but at a slow, linear pace of about 3 mm/year. That’s about 12 inches per century, similar to what’s been observed since before industrial CO₂ emissions.
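As a quick arithmetic check on the quoted figure, 3 mm/year sustained for a century is indeed roughly 12 inches:

```python
# Convert the quoted sea-level trend to inches per century
# (25.4 mm per inch).
mm_per_year = 3.0
inches_per_century = mm_per_year * 100 / 25.4
print(f"{inches_per_century:.1f} inches per century")
```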

Away from the bluster and hype, the real-world evidence is clear: even if there is a potential for massive impacts due to climate change, the pace observed is slow and not accelerating.  That means that we have time to consider and modify the politically motivated schedule of the New York Climate Act.

Bottom Line

I cannot conclude this post any better than Anthony Watts did in his bottom line.

Yes, the climate is changing. It always has. The idea that global climate must be unchanging is simply wrongheaded. The real issue is how much of today’s change is due to human activity, how reliable our predictions are, and whether proposed policy responses are justified — or likely to do more harm than good.

At Watts Up With That, we’ve been pointing out for years that this issue is riddled with confirmation bias, model overconfidence, and selective reporting. There is no justification for shutting down economies or reshaping civilization based on the incomplete science of climate change.

So yes, climate change is real, but no, it’s not a crisis.

Temperature Trend Measurement Uncertainty

Late last year I published an article that described the difficulties involved with a fundamental aspect of the climate change debate – measuring global temperature trends.  This article describes an analysis of a data set that compares two different ways to calculate the daily temperatures used to determine global temperature trends.  Ray Sanders reproduced Stephen Connolly’s description of an analysis that shows how temperature measurement techniques affect trend interpretation.


Background

My fifty-odd year career as an air pollution meteorologist in the electric utility sector has always focused on meteorological and pollution measurements.  Common measurement challenges are properly characterizing the parameter in question, measuring it in such a way that the location of the sensor does not affect the results, and, when operating a monitoring system, verifying the data and checking for trends.  On the face of it, that is easy.  In reality, it is much more difficult than commonly supposed.

I prepared the previous article to highlight recognized instrumental and observational biases in the temperature measurements.  One problem is measurement methodology.  The longest running instrumental temperature record is the Central England Temperature (CET) series that extends back to 1659.  In the United States temperature data are available back to the mid-1800s.  In both cases the equipment and observing methodology changed, and those changes can affect the trends.  Too frequently, when observing methods change there is no period of coincident measurements that would enable an analysis of potential effects on the trends.

Only recently have computerized data acquisition systems been employed that do not rely on manual observations, and even now many locations still rely on an observer.  For locations where temperature records are still manually collected, observers note the maximum and minimum temperature recorded on an instrument that measures both values daily.  A bias can be introduced if the time of observation changes.  If observations are taken and the max-min thermometers are reset near the time of daily highs or lows, then an extreme event can affect two days and the resulting long-term averages.  Connolly’s work addresses another bias of this methodology.
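The time-of-observation bias can be demonstrated with a toy simulation.  The diurnal profile and the size of the hot spell below are invented; the point is only that resetting a max-min thermometer near the time of the daily high lets one hot event inflate two observation days:

```python
import numpy as np

# Three days of hourly temperatures with a regular diurnal cycle that
# peaks mid-afternoon (hours 15, 39, 63), plus one unusually hot spell
# on the afternoon of day 2 (hours 38-39).
hours = np.arange(72)
temp = 15 + 8 * np.sin(2 * np.pi * (hours - 9) / 24)
temp[38:40] += 6  # the hot event

def daily_maxima(temp, reset_hour):
    """Maximum over each complete 24-hour window beginning at
    reset_hour, mimicking a max-min thermometer reset at that time."""
    return [temp[s:s + 24].max() for s in range(reset_hour, len(temp) - 23, 24)]

# Midnight reset: the hot event appears in exactly one observation day.
print("midnight reset :", np.round(daily_maxima(temp, 0), 1))
# Reset at 3 pm, near the daily high: the event straddles the window
# boundary and sets the maximum of two successive observation days.
print("afternoon reset:", np.round(daily_maxima(temp, 15), 1))
```

With the midnight reset only one daily maximum is inflated; with the afternoon reset the same hot spell is recorded as the maximum of two consecutive observation days, which biases any long-term average built from those readings.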

Uncertainty Caused by Averaging Methodology

The issue that Stephen Connolly addressed in his work was the bias introduced when a station converts from manual measurements of maximum and minimum temperatures to an automated data acquisition system.  Typically, those data acquisition systems make observations every second, compute and save minute averages, and then calculate and report hourly and daily averages.

Ray Sanders explained that he came across Stephen Connolly’s analysis of temperature averages based on data from the Valentia weather station in the southwest of the Republic of Ireland.  He asked Stephen if he could refer to his work, and Stephen agreed on the condition that he be duly credited.  So, by way of a second-hand proxy “guest post”, I have reproduced Stephen’s unadulterated X post at the end of this article.  I offer my observations on key parts of his work in the following sections.

I highly recommend Connolly’s article because he does a very good job explaining how sampling affects averages.  He describes the history of temperature measurements in a more comprehensive way than I did in my earlier post.  He explains that the daily average temperature reported from a manual observation station calculated as the average of the maximum and minimum temperature (Taxn) is not the same as an average of equally spaced observations over a 24-hour period (Tavg).  Using a couple of examples, he illustrates the uncertainties introduced because of the sampling differences. 

Connolly goes on to explain that:

In 1944 Met Éireann did something a bit unusual, they started measuring the temperature every hour. Rain or shine, sleet or snow, the diligent staff of Met Éireann would go out to the weather station and record the temperature. Between January 1944 and April 2012 when the station was replaced with an automated station only 2 hours were missed.

The data enabled Connolly to compare the two techniques for calculating the daily average temperature.  In his first graph he plots the difference between the two techniques as blue points.  Overlaid is the 1-year rolling average as a red line.  He states that Tavg is greater than Taxn in Valentia on average by 0.17°C (std deviation 0.53, N=29339, min=-2.20, max=3.20).
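The two averaging methods Connolly compares can be demonstrated on a toy diurnal profile.  The asymmetry (a second-harmonic term) is exaggerated here so that the Tavg-Taxn difference is obvious; the numbers are illustrative, not Valentia’s:

```python
import numpy as np

# "Meteorological" daily mean Taxn = (Tmax + Tmin) / 2 versus Tavg,
# the mean of 24 equally spaced hourly readings.  Any asymmetry in the
# diurnal cycle (here an exaggerated second harmonic) makes them differ.
hours = np.arange(24)
theta = 2 * np.pi * (hours - 9) / 24
temp = 10 + 5 * np.sin(theta) + 2 * np.cos(2 * theta)

tavg = temp.mean()                    # true mean of the hourly samples
taxn = (temp.max() + temp.min()) / 2  # max-min thermometer estimate
print(f"Tavg = {tavg:.2f} C, Taxn = {taxn:.2f} C, "
      f"bias = {tavg - taxn:.2f} C")
```

The hourly samples average exactly to the profile’s true mean, while the max-min estimate is pulled toward whichever extreme is sharper, which is precisely the sampling difference Connolly quantifies with the Valentia record.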

Connolly plots the difference between the two averaging approaches and notes that:

If we just look at the rolling average, you can see that the relationship is not constant, for example in the 1970’s the average temperature was on average 0.35ºC warmer than the Meteorological estimate, while in the late 1940’s, 1990’s and 2000’s there were occasions where the Meteorological estimate was slightly higher than the actual average daily temperature.

He goes on:

It’s important to highlight that this multi-year variability is both unexpected and intriguing, particularly for those examining temperature anomalies. However, putting aside the multi-year variability, by squeezing nearly 30,000 data points onto the x-axis we may have hidden a potential explanation why the blue points typically show a spread of about ±1ºC… Is the ±1°C spread seasonal variability?

The shortest day of the year in Valentia is December 21st when the day lasts for approximately 7h55m. The longest day of the year is June 21st when the day lasts for 16h57m. On the shortest day of the year there is little time for the sun to heat up and most of the time it is dark and we expect heat to be lost. So we expect the average temperature to be closer to the minimum temperature during the winter than during the summer.

I found this line of reasoning interesting:

We can check the seasonal effects in the difference between Tavg and Taxn by looking at a time dependent correlation. As not everyone will be familiar with this kind of analysis, I will start by showing you the time dependent correlation of Tavg with itself in the following graph.

The x-axis is how many days there are between measurements and the y-axis is the Pearson correlation coefficient, known as r, which measures how similar measurements are averages across all the data. A Pearson correlation coefficient of +1 means that the changes in one are exactly matched by changes in the other, a coefficient of -1 means that the changes are exactly opposite and a correlation coefficient of 0 means that the two variables have no relationship to each other.  The first point on the x-axis is for 1 day separation between the average temperature measurements.
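For readers who want to see the mechanics, here is a minimal sketch of this kind of time-dependent (lagged) correlation, applied to a synthetic seasonal series rather than the Valentia data:

```python
import math
import random

def pearson_r(x, y):
    """Plain Pearson correlation coefficient between two equal-length series."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = math.sqrt(sum((a - mx) ** 2 for a in x))
    sy = math.sqrt(sum((b - my) ** 2 for b in y))
    return cov / (sx * sy)

def lagged_autocorrelation(series, lag):
    """Correlate the series with itself shifted by `lag` days."""
    return pearson_r(series[:-lag], series[lag:])

# Synthetic daily temperatures: a 365-day seasonal cycle plus random noise.
random.seed(0)
series = [10 + 6 * math.sin(2 * math.pi * d / 365) + random.gauss(0, 2)
          for d in range(365 * 10)]

print(lagged_autocorrelation(series, 1))    # high: tomorrow resembles today
print(lagged_autocorrelation(series, 365))  # high: strong seasonality a year apart
```

A strongly seasonal variable keeps a high correlation at a one-year separation; a non-seasonal one drops toward zero, which is exactly the diagnostic Connolly applies.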

When I was in graduate school a half century ago, weather forecasting performance was judged relative to two no-skill approaches we called persistence and climatology.  Connolly explains that persistence is assuming that “Tomorrow’s weather will be basically the same as today’s”.  This graph shows that the approach is approximately 82% accurate.

The graph also illustrates the accuracy of the second no-skill forecast – climatology.  In other words, the climatology forecast for the average temperature is simply the average for that date.  At a one-year separation the r value of 0.67 means that 44% of today’s average temperature can be explained as seasonal, i.e., typical for this time of year. What this means is that the persistence forecast actually explains only about 38% more than the climatological forecast.
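Assuming, as the quoted figures suggest, that the percentages are squared correlation coefficients (variance explained), the arithmetic can be checked directly; the exact numbers here are my reconstruction, not Connolly’s:

```python
# Checking the explained-variance arithmetic, assuming the percentages
# quoted are squared correlation coefficients (r**2 = variance explained).
r_climatology = 0.67                  # 1-year lag correlation read off the graph
var_climatology = r_climatology ** 2  # ~0.45, i.e. roughly the quoted 44%
var_persistence = 0.82                # "approximately 82% accurate" at 1-day lag
extra_skill = var_persistence - var_climatology  # roughly the quoted 38%

print(f"climatology explains ~{var_climatology:.0%}; persistence adds ~{extra_skill:.0%}")
```

The small differences from the quoted 44% and 38% are rounding in the original figures.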

Connolly notes that the maximum and minimum temperatures behave the same and concludes that the above graph basically tells us what to expect when something is strongly seasonal.

Connolly goes on to ask what happens when we plot the time-dependent correlation of Tavg-Taxn? He shows the results in the following graph.

The 1 day correlation is 0.19, this tells us that approximately 4% of today’s correction factor between Tavg and Taxn can be predicted if we know yesterday’s correction factor. The seasonality is even worse, the 6 month correlation coefficient is -0.02 and the 1 year correlation coefficient is +0.07.

He points out that this answers the question of whether this is seasonal variability and concludes that the ±1°C spread is not seasonal variability.  The important point of this work is that if we only know the daily average temperature based on the average of the maximum and minimum temperatures, then that estimate and the average measured using a data acquisition system could differ by anywhere up to ±1°C.

He provides another graph to illustrate this.

The x-axis is Tavg and the y-axis is Taxn. Now obviously when the average daily temperature is higher, the average of the minimum and maximum temperatures is also higher and so we get a straight line of slope 1, but the thickness of the line represents the uncertainty of the relationship, so if we know Taxn is say 15°C then from this graph we can say that Tavg is probably between 13.5°C and 16.5°C.

Here is the important point:

Now because most weather stations were not recording hourly until recently, most of our historical temperature data is the Taxn form and not the Tavg. That means that if Valentia is representative then the past temperature records are only good to ±1°C. If somebody tells you that the average temperature in Valentia on the 31st of May 1872 was 11.7°C, the reality is that we just do not know. It’s 95% likely to have been somewhere between 10.6ºC and 12.7ºC.
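That interval follows directly from the summary statistics given earlier (standard deviation 0.53°C for the daily Tavg minus Taxn differences).  A quick sketch of the arithmetic, assuming a roughly normal spread:

```python
# Reproducing the +/-1 C spread and the 1872 example from Connolly's
# summary statistics (std deviation of daily Tavg - Taxn differences: 0.53 C).
std_dev = 0.53
half_width = 1.96 * std_dev   # 95% band for a roughly normal spread, ~1.04 C

taxn_1872 = 11.7              # the Taxn-based estimate for 31 May 1872
low, high = taxn_1872 - half_width, taxn_1872 + half_width

print(f"95% band: +/-{half_width:.2f} C")
print(f"Tavg on 31 May 1872 was 95% likely between {low:.2f} C and {high:.2f} C")
```

This matches the quoted 10.6°C to 12.7°C range to within rounding; adding the 0.17°C mean offset shifts the interval only slightly.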

He ends his analysis with another graph

In this last graph the blue points show the average Taxn of each year at Valentia since 1873 with vertical error bars showing the 95% confidence interval. The red points show the average Tavg for each year starting from 1944 with error bars showing the annual variation. The blue poking out from under the red shows the difference, even on the scale of a yearly average, between the Meteorologist’s estimate of average temperature and the actual average temperature.

Discussion

Connolly explains:

Valentia Observatory is one of the best weather stations globally. With the switch to automated stations in the 1990s, we can now get precise average temperatures.  Thanks to the meticulous efforts of past and present staff of Valentia Observatory and Met Éireann, we have 80 years of data which allows comparison of the old estimation methods with actual averages.

The takeaway from Connolly’s evaluation of these data is that our “historical temperature records are far less accurate than we once believed.”

I second Sanders’ acknowledgement of the work done by Connolly:

I would like to thank Stephen for allowing me to refer to his excellent research. Whatever one’s views are on the validity of the historic temperature record of the UK, this evaluation has again highlighted one area of many where there are significant questions to be asked regarding long term accuracy.

Conclusion

I would like to thank Stephen for allowing the posting of this excellent research.  One fundamental truth I have divined in my long career is that observed data are always more trustworthy than any model projection.  However, there are always limitations to the observed data that become important when trying to estimate a trend. 

I think these results are important because they highlight an uncertainty that climate catastrophists ignore.  I will concede that average temperatures are likely warming, but the amount of warming is within the observational uncertainty.  In other words, the magnitude of the observed warming is not well known.  The science is not settled on the amount of warming observed.

Climate Science New Year Rant

As I age, I am becoming less willing to play along with the Climate Leadership & Community Protection Act (Climate Act) narrative that there is an existential threat to mankind from man-made climate change and that an energy system that relies on wind, solar, and energy storage can solve that threat.  One aspect of playing along is to appease supporters by accepting that there is a reason to reduce GHG emissions and agreeing that solar and wind resources should be part of the future electric energy system.  Ron Clutz’s recent article “Lacking data, climate models rely on guesses” included information that spurred this article.

I am convinced that implementation of the New York Climate Act net-zero mandates will do more harm than good if the future electric system relies only on wind, solar, and energy storage because of reliability and affordability risks.  I have followed the Climate Act since it was first proposed, submitted comments on the Climate Act implementation plan, and have written over 480 articles about New York’s net-zero transition.  The opinions expressed in this article do not reflect the position of any of my previous employers or any other organization I have been associated with, these comments are mine alone.

The Climate Act established a New York “Net Zero” target (85% reduction in GHG emissions and 15% offset of emissions) by 2050.  The authors of the Climate Act believed that “our State could rapidly move away from fossil fuels and instead be fueled completely by the power of the wind, the sun, and hydro” and that it could be done completely with technologies available at that time (a decade ago).  In my opinion we need a feasibility analysis to determine if this presumption is correct.  This article addresses two questions: should we be trying to reduce GHG emissions in hopes of affecting the climate, and, even if we accept that decarbonization is a worthy goal, should we try to rely on wind and solar?

Is There an Existential Threat?

Keep in mind that climate models provide all the evidence that there is an existential threat.  Despite the constant claims in the mainstream media, attributing extreme weather events to man-made climate change is a claim no one without a vested interest in that answer is willing to make.  Ron Clutz’s recent article “Lacking data, climate models rely on guesses” described the response to a question about climate model accuracy by Dr. Keith Minor.  The following are parts of the summary from Ron’s post.

A recent question was posed on  Quora: Say there are merely 15 variables involved in predicting global climate change. Assume climatologists have mastered each variable to a near perfect accuracy of 95%. How accurate would a climate model built on this simplified system be?  Keith Minor has a PhD in organic chemistry, PhD in Geology, and PhD in Geology & Paleontology from The University of Texas at Austin. 

Minor responded with bolds by Clutz:

I like the answers to this question, and Matthew stole my thunder on the climate models not being statistical models. If we take the question and its assumptions at face value, one unsolvable overriding problem, and a limit to developing an accurate climate model that is rarely ever addressed, is the sampling issue. Knowing 15 parameters to 99+% accuracy won’t solve this problem.

The modeling of the atmosphere is a boundary condition problem. No, I’m not talking about frontal boundaries. Thermodynamic systems are boundary condition problems, meaning that the evolution of a thermodynamic system is dependent not only on the conditions at t > 0 (is the system under adiabatic conditions, isothermal conditions, do these conditions change during the process, etc.?), but also on the initial conditions at t = 0 (sec, whatever). Knowing almost nothing about what even a fraction of a fraction of the molecules in the atmosphere are doing at t = 0 or at t > 0 is a huge problem to accurately predicting what the atmosphere will do in the near or far future.

These problems boil down to the challenge of measuring the meteorological parameters necessary to initiate weather and climate models.   The reference to t = 0 relates to the start time of the model. Minor explains that there are many sources of variability within the models themselves too including:

  • The inability of the models to handle water (the most important greenhouse gas in the atmosphere, not CO2) and processes related to it;  e.g., models still can’t handle the formation and non-formation of clouds;
  • The non-linearity of thermodynamic properties of matter (which seem to be an afterthought, especially in popular discussions regarding the roles that CO2 plays in the atmosphere and biosphere), and
  • The always-present sampling problem.
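Minor’s initial-condition point can be illustrated with a toy chaotic system.  The logistic map below is a standard textbook example of sensitive dependence on initial conditions, not a climate model:

```python
# Two trajectories of the logistic map (a classic chaotic toy system)
# that start 1e-10 apart become completely different within ~40 steps.
def logistic_trajectory(x0, steps, r=4.0):
    xs = [x0]
    for _ in range(steps):
        xs.append(r * xs[-1] * (1.0 - xs[-1]))
    return xs

a = logistic_trajectory(0.200000000000, 60)
b = logistic_trajectory(0.200000000100, 60)   # differs in the 10th decimal place

divergence = [abs(x - y) for x, y in zip(a, b)]
print(f"difference at step 0: {divergence[0]:.1e}")
print(f"largest difference over steps 40-60: {max(divergence[40:]):.3f}")
```

An error invisible at the tenth decimal place grows until the two runs share nothing, which is why undersampling the state at t = 0 matters so much for any forward model of a chaotic system.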

Minor goes on to describe how these issues affect weather forecasting and how more sampling could improve certain forecasts.  He concludes:

So back to the Quora question, with regard to a cost-effective (cost-effect is the operational term) climate model or models (say an ensemble model) that would “verify” say 50 years from now, the sampling issue is ever present, and likely cost-prohibitive at the level needed to make the sampling statistically significant. And will the climatologist be around in 50 years to be “hoisted with their own petard” when the climate model is proven to be wrong? The absence of accountability is the other problem with these long-range models into which many put so much faith.

Clutz also references a quote by esteemed climate scientist Richard Lindzen that I think sums up whether we should rely on climate models to make the policy decision to transition away from fossil fuels.   In a presentation (here) Lindzen states:

I haven’t spent much time on the details of the science, but there is one thing that should spark skepticism in any intelligent reader. The system we are looking at consists of two turbulent fluids interacting with each other. They are on a rotating planet that is differentially heated by the sun. A vital constituent of the atmospheric component is water in the liquid, solid and vapor phases, and the changes in phase have vast energetic ramifications. The energy budget of this system involves the absorption and re-emission of about 200 watts per square meter. Doubling CO2 involves a 2% perturbation to this budget. So do minor changes in clouds and other features, and such changes are common. In this complex multifactor system, what is the likelihood of the climate (which, itself, consists in many variables and not just globally averaged temperature anomaly) is controlled by this 2% perturbation in a single variable? Believing this is pretty close to believing in magic. Instead, you are told that it is believing in ‘science.’ Such a claim should be a tip-off that something is amiss. After all, science is a mode of inquiry rather than a belief structure.

Can We Transition Away from Fossil Fuels

A recurrent theme at this blog is that the electric energy system absolutely needs new technology to achieve decarbonization.  Responsible New York agencies all agree that new Dispatchable Emissions-Free Resource (DEFR) technologies are needed to make a solar and wind-reliant electric energy system work reliably.  Because DEFR is needed and because we don’t know what should be used, I think that the Climate Act schedule needs to be reconsidered or at least paused.

I believe the only likely viable DEFR backup technology is nuclear generation because it is the only candidate resource that is technologically ready, can be expanded as needed, and does not suffer from limitations of the Second Law of Thermodynamics. I do concede that there are financial issues that need to be addressed.  The bigger issue is that DEFR is needed as a backup during extended periods of low wind and solar resource availability, but nuclear power is best used for baseload energy.  I estimate that 24 GW of nuclear could replace 178 GW of wind, water, battery storage.  Developing nuclear eliminates the need for a huge DEFR backup resource and massive buildout of wind turbines and solar panels sprawling over the state’s lands and water.  Until the New York Energy Plan settles on a DEFR solution the only rational thing to do is to pause the implementation process.

Lest you think that I am the only skeptical voice about the viability of an electrical energy transition relying on wind and solar resources I list some recent articles below.

Thomas Shepstone describes a fact sheet from the Empowerment Alliance that outlines why the electric grid is headed to a crisis:

America’s electrical grid is on the brink of a crisis that no one is talking about. Government mandates and pledges from utilities to achieve “net zero” emissions by 2050 or sooner have led to the closure of traditional power plants fueled by coal, natural gas and nuclear energy.

However, the wind and solar energy that is supposed to replace these sources is intermittent, unreliable and artificially supported by government subsidies. “Net zero” policies may sound nice on paper but they are not ready for practice in the real world.

In fact, the crisis may have already begun. A recent capacity auction by the largest U.S. electrical grid operator resulted in an over 800% price increase for these very reasons. And, everyday Americans are going to pay the price through higher bills for less reliable electricity.

  • One study of electricity plans in the Midwest found that, “Of the 38 major investor-owned utilities spanning the Great Lakes region, 32 are pledged to net zero by 2050 or sooner. Of the seven states analyzed in this report, three have net zero mandates by law, one has net zero mandates through regulation and the other three have no net zero mandates at the state level.”
  • “The Midcontinent Independent Systems Operator, the grid operator for much of the Midwest, projects that by 2032, none of the five Great Lakes states in its territory will have enough electricity capacity to meet even the most conservative projection of demand load.”
  • “Wind and solar cannot be relied on as a one-for-one replacement of existing generation sources, like coal, natural gas and nuclear. If the grid relies on forms of generation that are uncontrollable and unreliable, it must also maintain backup sources that are controllable and reliable. Because wind and solar production can fall to near zero at times, utilities may need to maintain up to another grid’s worth of generation capacity.”

Source:

Joshua Antonini and Jason Hayes, “Shorting the Great Lakes Grid: How Net Zero Plans Risk Energy Reliability,” Mackinac Center for Public Policy, 2024

Thomas Shepstone describes a report by the Fraser Institute regarding the real costs of electricity produced from solar and wind facilities, compared to other energy sources.  Tom highlights the money paragraphs with his emphasis added:

Often, when proponents claim that wind and solar sources are cheaper than fossil fuels, they ignore [backup energy] costs. A recent study published in Energy, a peer-reviewed energy and engineering journal, found that—after accounting for backup, energy storage and associated indirect costs—solar power costs skyrocket from US$36 per megawatt hour (MWh) to as high as US$1,548 and wind generation costs increase from US$40 to up to US$504 per MWh.

Which is why, when governments phase out fossil fuels to expand the role of renewable sources in the electricity grid, electricity becomes more expensive. In fact, a study by University of Chicago economists showed that between 1990 and 2015, U.S. states that mandated minimum renewable power sources experienced significant electricity price increases after accounting for backup infrastructure and other costs. Specifically, in those states electricity prices increased by an average of 11 per cent, costing consumers an additional $30 billion annually. The study also found that electricity prices grew more expensive over time, and by the twelfth year, electricity prices were 17 per cent higher (on average).

Finally, Chris Martz compares the impacts of wind and solar vs. nuclear power. I should note that he is not including DEFR support in his estimates. He concludes:

In order to power the same number of homes that a 1,000 MW nuclear power plant can, it would require either:

• For solar PV: Approximately 4,000 MW of installed power (equivalent to four nuclear facilities) and 24,000 acres of land (some 37.5× as much land area as a nuclear plant).

• For onshore wind: Approximately 2,800 MW of installed power (equivalent to 2.8 nuclear facilities) and 89,600 acres of land (some 140× as much land area as a nuclear power generation station).

But, I should caution you that these estimates are in fact conservative. Why? Because they do not take into consideration the land area required for battery storage due to intermittency in overcast sky conditions, low wind speeds and/or overnight.
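Martz’s installed-capacity figures are consistent with simple capacity-factor arithmetic; the specific factors below are my assumptions for illustration, not his:

```python
# Rough check of the installed-capacity figures using assumed capacity
# factors (my assumptions: nuclear ~90%, solar PV ~22.5%, onshore wind ~32%).
nuclear_mw, cf_nuclear = 1000, 0.90
cf_solar, cf_wind = 0.225, 0.32

avg_output_mw = nuclear_mw * cf_nuclear   # ~900 MW delivered on average

solar_needed = avg_output_mw / cf_solar   # installed MW of solar PV required
wind_needed = avg_output_mw / cf_wind     # installed MW of onshore wind required

print(f"solar PV: ~{solar_needed:,.0f} MW installed")
print(f"onshore wind: ~{wind_needed:,.0f} MW installed")
```

With these assumed factors the arithmetic reproduces the roughly 4,000 MW (solar) and 2,800 MW (wind) figures quoted above.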

Conclusion

It is terrifying that the rationale and proposed solution for a New York policy that could cost hundreds of billions are based on fantasy.  Richard Lindzen describes the made-up rationale: “In this complex multifactor system, what is the likelihood of the climate (which, itself, consists in many variables and not just globally averaged temperature anomaly) is controlled by this 2% perturbation in a single variable? Believing this is pretty close to believing in magic.”  Keith Minor explains that even if this perturbation were the climate change driver, we can never provide enough data to ensure that a model could accurately project the impacts.  The myth that wind and solar can replace fossil fuels on the schedule mandated by the Climate Act depends upon the fantastical notion that a resource that does not exist can be developed, tested, permitted, and deployed by 2040.

I can only conclude that allowing politicians to set energy policy will turn out to be an unmitigated disaster.

“Powerless in the storm” Climate Industry Misdirection

A version of this post appeared at Watts Up With That

I came across a paper that concludes “The US power grid is proven to be highly reliable in general; however, the resilient and reliable grid operation is increasingly challenged by severe weather events–events that are increasing in frequency and magnitude due to climate change.”  I have many issues with this paper, but I am only going to discuss one.  Apparently peer-reviewed papers today require only marginal support for claims of increasing severity, because everyone knows that climate change affects the frequency and magnitude of severe weather events.

This analysis is typical of the reports used to justify the Climate Leadership & Community Protection Act (Climate Act).  I have followed the Climate Act since it was first proposed, submitted comments on the Climate Act implementation plan, and have written over 400 articles about New York’s net-zero transition. This post explains why the connection between this work and climate change is tenuous.  The opinions expressed in this post do not reflect the position of any of my previous employers or any other organization I have been associated with, these comments are mine alone.

Powerless in the Storm

The paper in question is  “Powerless in the storm: Severe weather driven power outages in New York State, 2017–2020” (Flores NM, Northrop AJ, Do V, Gordon M, Jiang Y, Rudolph KE, et al. (2024) Powerless in the storm: Severe weather-driven power outages in New York State, 2017–2020. PLOS Clim 3(5): e0000364. https://doi.org/10.1371/journal.pclm.0000364)

The only proof cited to support the claim that climate change is increasing weather variability is the reference attached to this sentence: “The power grid’s vulnerability to severe weather events becomes even more critical in the context of climate change, which is expected to increase weather variability and prevalence of extreme events (e.g., storms, wildfires, heatwaves, floods)”.  The reference cites the latest Intergovernmental Panel on Climate Change report: IPCC, 2022: Climate Change 2022: Impacts, Adaptation, and Vulnerability. Contribution of Working Group II to the Sixth Assessment Report of the Intergovernmental Panel on Climate Change [Pörtner H.-O., Roberts D.C., Tignor M., Poloczanska E.S., Mintenbeck K., Alegría A., Craig M., Langsdorf S., Löschke S., Möller V., Okem A., Rama B. (eds.)]. Cambridge University Press, Cambridge, UK and New York, NY, USA, 3056 pp., https://doi.org/10.1017/9781009325844

The Other Side of the Story

However, if the authors were to look at the actual IPCC report rather than what they assumed it would say about the example weather events (storms, wildfires, heatwaves, floods) the narrative falls apart.

The CO2 Coalition published a paper prepared by Richard Lindzen, William Happer, and Steven Koonin on April 16, 2024 titled Fossil Fuels and Greenhouse Gases Climate Science.  The authors are Richard Lindzen, Professor of Earth, Atmospheric, and Planetary Sciences, Emeritus, Massachusetts Institute of Technology; William Happer, Professor of Physics, Emeritus, Princeton University; and Steven Koonin, University Professor, New York University, and Senior Fellow at the Hoover Institution.  They reviewed what the IPCC documents actually said about these extreme weather events.  The paper explains:

Hurricanes. A deep analysis of the facts reveals that “the data and research literature are starkly at odds with this message” — “hurricanes and tornadoes show no changes attributable to human influences.” Id. pp. 111-12. Further, “There has been no significant trend in the global number of tropical cyclones nor has any trend been identified in the number of U.S. land-falling hurricanes.” U.S. Global Climate Research Program, 3rd National Climate Assessment, Appendix 3, p. 769 (footnotes omitted).

Wildfires. There is a powerful new source of data on wildfires, “Sophisticated satellite sensors first began monitoring wildfires globally in 1993.” Id. p. 142.

The result of this new source of data is totally contrary to what is in the news. Unsettled cites NASA data and others that show the global area burned by fires declined each year from 1998 to 2015:

“Unexpectedly, this analysis of the images shows that the area burned annually declined by about 25% from 1998 to 2015.” Further, “Despite the very destructive wildfires in 2020, that year was among the least active globally since 2003.” Id. p. 142.

Heat Waves. On extreme temperatures in the U.S., we all agree: “The annual number of high temperature records set shows no significant trend over the past century, nor over the past 40 years.” Koonin, supra, p. 110.

Flooding: US data shows “modest changes in US rainfall during the past century haven’t changed the average incidence of floods.” Globally, data from the IPCC shows that there is “low confidence regarding the sign of trend in the magnitude and/or frequency of floods on a global scale.”  We all agree with the summary in Unsettled: “we don’t know whether floods globally are increasing, decreasing, or doing nothing at all.” Id. p. 137.

Discussion

I have nothing to add to the main point that the authors of this paper just assumed that the IPCC found that extreme weather events were increasing despite evidence in the latest report to the contrary.  The peer review process did not call them out on this. 

For the record the authors, their roles and affiliations follow:

Nina M. Flores

ROLES Conceptualization, Data curation, Formal analysis, Writing – original draft, Writing – review & editing

AFFILIATION Department of Environmental Health Sciences, Mailman School of Public Health, Columbia University, New York, New York, United States of America

Alexander J. Northrop

ROLES Conceptualization, Data curation, Writing – review & editing

AFFILIATIONS Department of Environmental Health Sciences, Mailman School of Public Health, Columbia University, New York, New York, United States of America, Vagelos College of Physicians and Surgeons, Columbia University, New York, New York, United States of America

Vivian Do

ROLES Conceptualization, Writing – review & editing

AFFILIATION Department of Environmental Health Sciences, Mailman School of Public Health, Columbia University, New York, New York, United States of America

Milo Gordon

ROLES Conceptualization, Writing – review & editing

AFFILIATION Department of Environmental Health Sciences, Mailman School of Public Health, Columbia University, New York, New York, United States of America

Yazhou Jiang

ROLES Conceptualization, Writing – review & editing

AFFILIATION Department of Electrical and Computer Engineering, Clarkson University, Potsdam, New York, United States of America

Kara E. Rudolph

ROLES Conceptualization, Methodology, Writing – review & editing

AFFILIATION Department of Epidemiology, Mailman School of Public Health, Columbia University, New York, New York, United States of America

Diana Hernández

ROLES Conceptualization, Writing – review & editing

AFFILIATION Department of Sociomedical Sciences, Mailman School of Public Health, Columbia University, New York, New York, United States of America

Joan A. Casey

ROLES Conceptualization, Funding acquisition, Methodology, Writing – review & editing

AFFILIATIONS Department of Environmental Health Sciences, Mailman School of Public Health, Columbia University, New York, New York, United States of America, Department of Environmental and Occupational Health Sciences, University of Washington, Seattle, Washington, United States of America

One final point is that my impression of the analysis is that the authors had pre-conceived conclusions in mind and tortured the data to get the results desired.

Conclusion

I have to assume that this is an example of the Climate Industry’s Misdirection Campaign described recently by Kip Hansen.  All of the authors are associated in some way with public health departments at universities.  I doubt that any of them has any background in climatology or meteorology beyond possibly a class or two in introduction to Climate Change – The Existential Threat.  Today it is sufficient to just note that extreme weather is getting worse due to climate change to hype the results claimed, because the peer reviewers know that is “true”.

Natural Climate Variability

A recent Associated Press story noted that “For the 10th consecutive month, Earth in March set a new monthly record for global heat — with both air temperatures and the world’s oceans hitting an all-time high for the month, the European Union climate agency Copernicus said.”  It went on to state that “Climate scientists attribute most of the record heat to human-caused climate change from carbon dioxide and methane emissions produced by the burning of coal, oil and natural gas.”  This post provides evidence that human-caused climate change was not the primary cause for the records.

The rationale used for New York’s Climate Leadership & Community Protection Act (Climate Act) that reducing GHG emissions will affect climate is of special interest to me.  However, I question whether we know enough about natural climate variability to legitimately make that claim.  I have followed the Climate Act since it was first proposed, submitted comments on the Climate Act implementation plan, and have written over 400 articles about New York’s net-zero transition. The opinions expressed in this post do not reflect the position of any of my previous employers or any other organization I have been associated with, these comments are mine alone.

Observed Climate Variability

The video Climate the Movie: The Cold Truth includes a very good description of historical temperatures and CO2 trends.  It provides examples why claims that today’s observations indicate unprecedented heat in earth’s history are wrong.  In geologic time scales temperatures today are not at all unusual and because we are in an ice age all previous non-ice age geologic epochs were warmer.  Over the last 2,000 years there also have been periods of warmer temperatures.  The video goes on to compare CO2 trends over those periods to show that there is no link. 

In a recent post I addressed the basic tenet of anthropogenic global warming catastrophists, like the authors of the Climate Act, that the correlation between CO2 and global warming evident since 1976 proves that CO2 is the control knob for climate.   Andy May prepared an Annotated Bibliography for Climate the Movie that includes a section titled “From 1945 to 1976 the world cooled”.  It includes the following plot of global temperatures and carbon dioxide.  Climate Act proponents believe that increasing temperatures since the end of the Little Ice Age are caused by increases in CO2.  This graph does not support that claim.  From 1850 to 1910 temperatures trend slightly down and CO2 trends slightly up.  From 1910 to 1944 there is little change in the CO2 trend but the temperature trends up markedly.  CO2 emissions don’t start to rise significantly until the end of World War II in 1945 but from 1944 to 1976 the global temperature trends down.  For the remaining two periods shown in the graph temperature and CO2 correlate well.

The caption highlights the key point.  There is good correlation between atmospheric CO2 concentrations and global temperature after 1980, but the correlation is poor before that.  I believe this shows that natural climate variation caused the 1910 to 1944 warming.  I do not believe that anyone has proven that the same natural climate drivers are not affecting the recent warming.

The National Oceanic and Atmospheric Administration (NOAA) recently posted a comment that contradicts the existential threat narrative and supports those who argue that natural climate variability is the main driver of climate change.  It states that “The amount of CO2 in the atmosphere today is comparable to around 4.3 million years ago, when sea level was about 75 ft higher than today, the average temp was 7 degrees F higher than in pre-industrial times, & large forests occupied areas of the Arctic that are now tundra.”  Climate the Movie shows that, going back further in time, CO2 levels were much higher than today.  It is not clear to me why there is supposed to be an existential threat to society when temperature and CO2 concentrations were higher in the past and ecosystems survived.

Recent Warming

The claims for recent global temperature records reference NASA satellite data.  This data set only goes back to 1979, but it provides the most representative coverage of the globe because it does not depend on irregularly spaced surface measuring stations.  In the following graph, note the large spike in recent months.

Note that the spikiness in these measurements is not reflected in measurements of atmospheric CO2 concentrations.  According to NOAA’s CO2 measurements:

The global surface concentration of CO2, averaged across all 12 months of 2023, was 419.3 parts per million (ppm), an increase of 2.8 ppm during the year. This was the 12th consecutive year CO2 increased by more than 2 ppm, extending the highest sustained rate of CO2 increases during the 65-year monitoring record. Three consecutive years of CO2  growth of 2 ppm or more had not been seen in NOAA’s monitoring records prior to 2014. Atmospheric CO2 is now more than 50% higher than pre-industrial levels.
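The figures in the NOAA quote can be cross-checked with a few lines of arithmetic.  Note that the pre-industrial baseline of roughly 278 ppm used below is an assumption on my part (a commonly cited value); it does not appear in the quote itself.

```python
# Cross-check of the NOAA figures quoted above.  The pre-industrial
# baseline of ~278 ppm is an assumed value, not stated in the quote.

co2_2023_ppm = 419.3        # global annual mean for 2023 (from the quote)
annual_increase_ppm = 2.8   # increase during 2023 (from the quote)
preindustrial_ppm = 278.0   # assumed pre-industrial baseline

# How far today's concentration sits above the assumed baseline:
rise_vs_preindustrial = co2_2023_ppm / preindustrial_ppm - 1
print(f"CO2 is {rise_vs_preindustrial:.1%} above the assumed pre-industrial level")

# The 2023 increase as a fraction of the total concentration:
annual_growth_rate = annual_increase_ppm / co2_2023_ppm
print(f"The 2023 increase was {annual_growth_rate:.2%} of the concentration")
```

The first result, about 51% above the assumed baseline, is consistent with the quote’s statement that atmospheric CO2 is now “more than 50% higher than pre-industrial levels.”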

If CO2 really is the control knob, then why is there so much inter-annual variation in temperature while there is so little variation in the CO2 trend?  The only explanation available to activists is that natural processes cause the variation.  Picking CO2 as the cause of the increasing trend while simultaneously acknowledging that natural processes also affect the observed temperatures does not seem to be a particularly strong position to me.

Most Recent Warming

An Associated Press article claimed that “Climate scientists attribute most of the record heat to human-caused climate change from carbon dioxide and methane emissions produced by the burning of coal, oil and natural gas.”  The reality is that not all climate scientists support the claim that most of the record-breaking heat was caused by anthropogenic greenhouse gases.

Javier Vinós described the recent warming, explaining that this spike in temperatures marked the warmest period recorded by instruments and that the recent change was exceptional.  He found that “the temperature increase from the previous record was the largest in 153 years, at +0.17°C. This level of increase from previous records is remarkable, even for a year that has been recorded as the warmest on record.”  If a spike in GHG emissions had preceded this warming spike, then I would be more supportive of the CO2-is-the-control-knob theory.  It turns out that there was no spike in human emissions, but there was a natural spike.  The Tonga-Hunga underwater volcanic eruption blasted unprecedented amounts of water vapor into high levels of the atmosphere.  Water vapor is a more effective greenhouse gas than CO2, so this could be part of the reason for the recent warming spike.

There is another natural phenomenon likely responsible for some of the warming.  Surface water temperatures in the Pacific Ocean oscillate between the warm (El Niño) and cold (La Niña) phases of the El Niño-Southern Oscillation or “ENSO”.  The winter of 2023 occurred during an El Niño, when the ocean releases heat into the atmosphere; El Niño conditions have been associated with marked increases in global temperatures.  However, the 2023 El Niño was weak, so its contribution to the observed warming was minimal.

In an article entitled “State of the climate – summer 2023”, Judith Curry examined the top-of-the-atmosphere radiation balance.  As of June 2023, her analysis suggests that the long-wave warming from the water vapor injected by the Tonga-Hunga underwater volcanic eruption was offset by short-wave cooling from the eruption’s aerosol particles.  She gave other reasons for the observed warming records:

The exceptionally warm global temperature in 2023 is part of a trend of warming since 2015 that is associated primarily with greater absorption of solar radiation in the earth-atmosphere system.  This increase in absorbed solar radiation is driven by a slow decline in springtime snow extent, but primarily by a reduction in reflection from the atmosphere driven by reduced cloudiness and to a lesser extent a reduction in atmospheric aerosol.  Any increase in the greenhouse effect from increasing CO2 (which impacts the longwave radiation budget) is lost in the noise.

She lists three reasons for the warming.  The slow decline in springtime snow extent has been linked to the warming trend as we come out of the Little Ice Age.  Clouds also affect global temperatures: more low clouds reduce temperatures by reflecting more sunlight, while more high clouds increase temperatures by trapping outgoing radiation.  Particles, or aerosols, scatter light and can affect temperatures by blocking sunlight.  She attributes the observed warming to the reduction in reflection from the atmosphere driven by reduced low-level cloudiness and, to a lesser extent, a reduction in atmospheric aerosol particles.  Low-level cloudiness trends are not well understood and are poorly represented in climate models.  The aerosol changes are attributed to changes in the sulfur content of ship fuel.  Most importantly, she points out that increasing CO2 effects are “lost in the noise”, which directly contradicts the Associated Press article.

Conclusion

The rationale for the multi-billion-dollar Climate Act net-zero transition is the alleged link between climate change and greenhouse gas emissions.  Undoubtedly the emissions increases have some greenhouse effect on global temperatures.  However, natural climate variability not only must have been responsible for all of the historical variations in global temperatures, it also appears to be the primary driver even during the most recent period when carbon dioxide emissions and global temperatures are well-correlated.  The rationale for the Climate Act transition is weak at best.

Burn, Hollywood, burn

This article was originally published at Watts Up With That.

The Irina Slav on energy Substack is described as “All things energy. Challenging the dominant narrative because facts matter”.  Her latest article, “Burn, Hollywood, burn”, calls out the blatant indoctrination and propaganda associated with Hollywood today.  As always, when you dig deeper, it is all about money for the shills.

I have followed issues related to climate change and the net-zero energy transition for many years. The opinions expressed in this post do not reflect the position of any of my previous employers or any other organization I have been associated with, these comments are mine alone.

In her introduction, Slav expressed a concern that is common to many readers of Watts Up With That:

A couple of days ago, in a conversation with David Blackmon on X, I unthinkingly commented that we’ve reached peak idiocy in the transition narrative. David wisely reminded me that we keep getting proven wrong in this by the narrative constantly discovering new peaks to strive for and conquer. Alas, I couldn’t disagree.

In my work here I’ve mostly focused on calling out the climate indoctrinators in the media, in politics, and, occasionally, in schools. But there is an indoctrination channel I have so far steered clear of, for reasons of mental self-preservation. I get angry about things, you see, and I don’t really like being angry. When I saw this article on Rolling Stone a while ago, however, I got too angry to bother about disliking being angry.

The article is a symphony of climate propaganda done absolutely openly and eagerly, with an unshakeable conviction that amplifying climate catastrophism is the right thing to do. Through all means necessary.

She explains how this article is evidence of the incessant indoctrination of the masses regarding climate change.  Earlier the emphasis was on social justice but now there is a shift:

That was the social justice stage of the indoctrination drive. Now, we seem to have reached the next stage, which is all about climate change, a distillate of social justice issues, if you will, since every single problem we have today can be traced back to climate change by the eager narrative pushers. Why so eager, you might ask? Well, because there’s money and fame in it.               

The most revealing part of her article for me was her description of the organization called Good Energy.  She describes it thusly:

Said organisation exists with the sole purpose of making climate change a central topic in movies and TV shows. Because it’s important, of course. The most important topic ever. And these gracious people are there to guide film folk on the journey to internalising this so they can make more climate change-centric movies and TV shows.

Here’s an excerpt: “We aim to make it as easy as possible to weave climate into any aspect of a story. Applying the Climate Lens™ to your narrative can reveal complexities in character and setting, add conflict, and unlock touching, funny, and surprising storylines — all of them backed by climate science, psychology, and lived experiences.”

Incidentally, while helping writers, directors and producers “weave climate into any aspect of a story” and why not every single aspect of a story, they’d make some money from this because these consulting services are not free. Indoctrination is a mission but that doesn’t mean it can’t be a business at the same time, and how cool is that!

The Good Energy “Library of Experts” is interesting for a couple of reasons: the wide range of disciplines whose experts claim a link between their work and climate change, and the number of individuals whom loyal readers here might recognize, like Dr. Peter Kalmus.  Slav goes on to expose a potential driver for their concern about climate change:

Speaking of money, the Daily Sceptic has done a great job in exposing the financial backing of Good Energy and similar organisations or shall I say formations because it certainly sounds more appropriate. You won’t be surprised to learn that this backing comes from climate obsessed billionaires. Bloomberg Philanthropies and the Sierra Club pop out among the list of backers, along with the Walton Family Foundation and One Earth.

She takes an optimistic view of this:

Sad as all this may be there is a silver lining and that silver lining lies in the fact that propaganda has never, ever produced quality art of any form or quality entertainment. Good art and good entertainment tell stories, invoke various emotions, and, if done really well, result in some form of catharsis.

Climate propaganda does not tell stories. It only aims to invoke one emotion and that’s fear. It hammers in a message disguised as a story that is so solid and unwieldy it defies interpretation. You can only swallow it whole. Or ridicule it, of course, because it is ridiculous.

Since climate propaganda in film – and in literature, too – is so rigid, it’s doomed to failure, just like the identity politics trend in literature. The reason for this is that while there may be many people with a mental age of four when it comes to discriminating between art and propaganda, there are many more who instinctively sense the difference and sooner or later shun the latter.

I hope she is correct.  I tend to be a bit more pessimistic because I think the inevitable reality slap of these insane transition policies may come only after irreparable harm has been done.  I encourage you to read all of her article and consider subscribing to her Substack.

Articles of Note October 15, 2023

Sometimes I just don’t have time to put together full articles about specific posts on the net-zero transition and climate change that I have read and think are relevant.  This is a summary of posts that I think would be of interest to my readers.

I have been following the Climate Leadership & Community Protection Act (Climate Act) since it was first proposed, and most of the articles described here are related to it.  I have devoted a lot of time to the Climate Act because I believe the ambitions for a zero-emissions economy embodied in the Climate Act outstrip available renewable technology such that the net-zero transition will do more harm than good.  The opinions expressed in this article do not reflect the position of any of my previous employers or any other company I have been associated with; these comments are mine alone.


Climate Act Background

The Climate Act established a New York “Net Zero” target (85% reduction and 15% offset of emissions) by 2050.  It includes an interim 2030 reduction target of a 40% reduction by 2030 and a requirement that all electricity generated be “zero-emissions” by 2040. The Climate Action Council is responsible for preparing the Scoping Plan that outlines how to “achieve the State’s bold clean energy and climate agenda.”  In brief, that plan is to electrify everything possible and power the electric grid with zero-emissions generating resources.  The Integration Analysis prepared by the New York State Energy Research and Development Authority (NYSERDA) and its consultants quantifies the impact of the electrification strategies.  That material was used to write a Draft Scoping Plan.  After a year-long review the Scoping Plan recommendations were finalized at the end of 2022.  In 2023 the Scoping Plan recommendations are supposed to be implemented through regulation and legislation. 

Videos of Note

For those of you who would rather watch a video than read about a topic, I list a few interesting videos.  This video describes historic global temperatures and how ancient temperatures are estimated.  I think it does a good job describing a complicated subject.

This interview of Judith Curry by John Stossel is a good overview of the climate science hype.

Offshore Wind Costs

Renewable developments are struggling due to soaring interest rates and rising equipment and labor costs.  Reuters describes two “procured” projects that have been cancelled:

On Monday, Avangrid (AGR.N), a U.S. subsidiary of Spanish energy firm Iberdrola (IBE.MC), said it filed agreements with power companies in Connecticut to cancel power purchase agreements for Avangrid’s proposed Park City offshore wind project.

“One year ago, Avangrid was the first offshore wind developer in the United States to make public the unprecedented economic headwinds facing the industry,” Avangrid said in a release.

Those headwinds include “record inflation, supply chain disruptions, and sharp interest rate hikes, the aggregate impact of which rendered the Park City Wind project unfinanceable under its existing contracts,” Avangrid said.

Avangrid has said it planned to rebid the Park City project in future offshore wind solicitations.

Also over the past week, utility regulators in Massachusetts approved a proposal by SouthCoast Wind, another offshore wind developer, to pay local power companies a total of around $60 million to terminate contracts to provide about 1,200 MW of power.

In an email, Rich Ellenbogen described how the offshore wind market is broken all over the world.  First he mentioned this Avangrid buyout of its contractual obligations.  He also pointed out that at a recent UK wind auction there were no bidders because the maximum selling price for the electricity was not high enough to justify the investment.  UK installation costs have risen by about 40% and the UK government did not factor that into the allowable costs.  He explains:

The article states that the wholesale price of electricity in the UK is £80/Megawatt hour (MWh).  With an exchange rate of $1.23 per pound sterling, that equates to $98.40 per MWh.  The article also states that they would need £60 per MWh to make the wind farms profitable, or $73.80 per MWh.  However, according to this link, “the wholesale price for electricity in NY State in calendar year 2023 has increased from $24.57/MWh to $42.97/MWh over the last year,” 56% lower than the wholesale cost in the UK and 42% lower than what the wind installers say that they need to be profitable.

If the Wind installers can get $73.80/MWh installing wind farms in the UK but they can only get $42.97/MWh installing Wind farms here, 42% less,  while also having no ships to do the installation because of the Jones Act, where do you think that they will install the wind farms?  This is a global market.

The other way to look at this is that the energy from offshore wind will cost 72% more than what the ratepayers of NY State are currently paying.  This is not a good economic model for NY State ratepayers; 72% increases are well outside what surveys have said the public will tolerate.  Coupled with a 15% increase in delivery costs from the utilities, the number of ratepayers in arrears (currently 1.2 million, owing $1.8 billion) will greatly increase, making NY State even less affordable than it already is.
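The currency conversions and percentage comparisons in the quoted email can be sanity-checked with a few lines of arithmetic, using only the figures the email cites (£80/MWh UK wholesale, £60/MWh break-even, $1.23 per pound, and $42.97/MWh in New York):

```python
# Sanity check of the UK vs. New York offshore wind price arithmetic
# in the quoted email (all figures taken from the quote).

GBP_TO_USD = 1.23

uk_wholesale_gbp = 80.0   # UK wholesale electricity price, GBP/MWh
uk_breakeven_gbp = 60.0   # price wind developers say they need, GBP/MWh
ny_wholesale_usd = 42.97  # NY wholesale price, USD/MWh (calendar year 2023)

uk_wholesale_usd = uk_wholesale_gbp * GBP_TO_USD   # 98.40
uk_breakeven_usd = uk_breakeven_gbp * GBP_TO_USD   # 73.80

# How much cheaper NY wholesale power is than the UK break-even price:
ny_discount = 1 - ny_wholesale_usd / uk_breakeven_usd        # ~0.42 (42% less)
# Equivalently, how much more offshore wind needs than NY pays now:
breakeven_premium = uk_breakeven_usd / ny_wholesale_usd - 1  # ~0.72 (72% more)

print(f"UK wholesale:  ${uk_wholesale_usd:.2f}/MWh")
print(f"UK break-even: ${uk_breakeven_usd:.2f}/MWh")
print(f"NY is {ny_discount:.0%} below break-even; "
      f"break-even is {breakeven_premium:.0%} above NY")
```

The 42% and 72% figures in the email check out; note that the two numbers differ because a percentage decrease and the corresponding percentage increase are computed against different bases.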

In New York, on October 12, 2023, the Public Service Commission turned down a request to address the same cost issues.  Times Union writer Rick Karlin summarizes:

At issue was a request in June by ACE NY, as well as Empire Offshore Wind LLC, Beacon Wind LLC, and Sunrise Wind LLC, which are putting up the offshore wind tower farms.

All told, the request, which was in the form of a filing before the PSC, represented four offshore wind projects totaling 4.2 gigawatts of power, five land-based wind farms worth 7.5 gigawatts and 81 large solar arrays.

All of these projects are underway but not completed. They have already been selected and are under contract with the New York State Energy Research and Development Authority, or NYSERDA, to help New York transition to a clean power grid, as called for in the Climate Leadership and Community Protection Act, approved by the state Legislature and signed into law in 2019.

Developer response suggests that “a number of planned projects will now be canceled, and their developers will try to rebid for a higher price at a later date — which will lead to delays in ushering in an era of green energy in New York”.  Karlin also quotes Fred Zalcman, director of the New York Offshore Wind Alliance: “Today’s PSC decision denying relief to the portfolio of contracted offshore wind projects puts these projects in serious jeopardy.”

Francis Menton did an overview of the status of offshore wind projects that summarizes all the issues confronting offshore wind development.

Renewable Costs

Francis Menton also did an overview of renewable costs.

Another article in the Telegraph also addresses green energy costs.

Weather and Climate

The September edition of Climate Fact Check debunks ten bogus climate claims from last month.  There is a description of the analysis here.

Electric Vehicles

Electric van maker on verge of bankruptcy

EV owners facing soaring insurance costs

How to Publish a High-Profile Climate Change Research Paper

Regular readers of this blog have noticed that there aren’t many articles in high-profile journals that suggest there are any issues with the narrative that climate change impacts are pervasive and catastrophic. Patrick T. Brown explains that “There is a formula for publishing climate change impacts research in the most prestigious and widely-read scientific journals. Following it brings professional success, but it comes at a cost to society.”  His formula explains part of the reason we see so little skeptical research in those journals.

The biggest topic on this blog is climate change and the proposed greenhouse gas emission reduction solutions.  From what I have seen, the pressure to conform to the narrative described here is immense, and my readers should keep that in mind.  The opinions expressed in this post do not reflect the position of any of my previous employers or any other company I have been associated with; these comments are mine alone.

Background

Patrick T. Brown is a Ph.D. climate scientist. He is a Co-Director of the Climate and Energy Team at The Breakthrough Institute and is an adjunct faculty member (lecturer) in the Energy Policy and Climate Program at Johns Hopkins University. 

This month, he published a lead-author research paper in Nature on changes in extreme wildfire behavior under climate change. This is his third publication in Nature to go along with another in Nature’s climate-focused journal Nature Climate Change. He notes that “because Nature is one of the world’s most prestigious and visible scientific journals, getting published there is highly competitive, and it can significantly advance a researcher’s career.” 

His article is based on this publication experience, as well as through various failures to get research published in these journals.  He explains:

I have learned that there is a formula for success which I enumerate below in a four-item checklist. Unfortunately, the formula is more about shaping your research in specific ways to support pre-approved narratives than it is about generating useful knowledge for society.

Formula for Publishing Climate Changes Impact Research

Before describing his approach to getting research published, he describes what is needed for useful scientific research.  He says:

It should prize curiosity, dispassionate objectivity, commitment to uncovering the truth, and practicality. However, scientific research is carried out by people, and people tend to subconsciously prioritize more immediate personal goals tied to meaning, status, and professional advancement. Aligning the personal incentives that researchers face with the production of the most valuable information for society is critical for the public to get what it deserves from the research that they largely fund, but the current reality falls far short of this ideal.

Brown explains that the “publish or perish” mentality in academic research is unavoidable.  In addition, it also matters “which journals you publish in”.  It turns out that a “researcher’s career depends on their work being widely known and perceived as important.”  Because there is so much competition, it has become even more important to publish in the most highly regarded journals: “while there has always been a tremendous premium placed on publishing in the most high-profile scientific journals – namely Nature and its rival Science – this has never been more true.”  As a result, “savvy researchers will tailor their studies to maximize their likelihood of being accepted.”  In his article he explains just how he did it.

First, he offers general advice:

My overarching advice for getting climate change impacts research published in a high-profile journal is to make sure that it supports the mainstream narrative that climate change impacts are pervasive and catastrophic, and the primary way to deal with them is not through practical adaptation measures but through policies that reduce greenhouse gas emissions. Specifically, the paper should try to check at least four boxes.

The first box to check is showing that climate change impacts something of value: merely demonstrating an impact “is usually sufficient, and it is not typically necessary to show that the impact is large compared to other relevant influences.”  Doing this entails tradeoffs:

In my recent Nature paper, we focused on the influence of climate change on extreme wildfire behavior but did not bother to quantify the influence of other obviously relevant factors like changes in human ignitions or the effect of poor forest management. I knew that considering these factors would make for a more realistic and useful analysis, but I also knew that it would muddy the waters and thus make the research more difficult to publish.

This type of framing, where the influence of climate change is unrealistically considered in isolation, is the norm for high-profile research papers. For example, in another recent influential Nature paper, they calculated that the two largest climate change impacts on society are deaths related to extreme heat and damage to agriculture. However, that paper does not mention that climate change is not the dominant driver for either one of these impacts: temperature-related deaths have been declining, and agricultural yields have been increasing for decades despite climate change.

The second box is to avoid discussion of anything that could reduce the impact of climate change:

This brings me to the second component of the formula, which is to ignore or at least downplay near-term practical actions that can negate the impact of climate change. If deaths related to outdoor temperatures are decreasing and agricultural yields are increasing, then it stands to reason that we can overcome some major negative effects of climate change. It is then valuable to study how we have been able to achieve success so that we can facilitate more of it. However, there is a strong taboo against studying or even mentioning successes since they are thought to undermine the motivation for emissions reductions. Identifying and focusing on problems rather than studying the effectiveness of solutions makes for more compelling abstracts that can be turned into headlines, but it is a major reason why high-profile research is not as useful to society as it could be.

His third component is to focus the presentation on alarm:

A third element of a high-profile climate change research paper is to focus on metrics that are not necessarily the most illuminating or relevant but rather are specifically designed to generate impressive numbers. In the case of our paper, we followed the common convention of focusing on changes in the risk of extreme wildfire events rather than simpler and more intuitive metrics like changes in the amount of acres burned. The sacrifice of clarity for the sake of more impressive numbers was probably necessary for it to get into Nature.

Another related convention, which we also followed in our paper, is to report results corresponding to time periods that are not necessarily relevant to society but, again, get you the large numbers that justify the importance of your research. For example, it is standard practice to report societal climate change impacts associated with how much warming has occurred since the industrial revolution but to ignore or “hold constant” societal changes over that time. This makes little sense from a practical standpoint since societal changes have been much larger than climate changes since the 1800s. Similarly, it is conventional to report projections associated with distant future warming scenarios now thought to be implausible while ignoring potential changes in technology and resilience.

The good news is that Brown has transitioned out of a tenure-track academic position to one that does not require high-impact publications.  He describes an approach more useful than the one required to publish in those journals:

A much more useful analysis for informing adaptation decisions would focus on changes in climate from the recent past that living people have actually experienced to the foreseeable future – the next several decades – while accounting for changes in technology and resilience. In the case of my recent Nature paper, this would mean considering the impact of climate change in conjunction with proposed reforms to forest management practices over the next several decades (research we are conducting now). This more practical kind of analysis is discouraged, however, because looking at changes in impacts over shorter time periods and in the context of other relevant factors reduces the calculated magnitude of the impact of climate change, and thus it appears to weaken the case for greenhouse gas emissions reductions. 

The final key to publication is presentation:

The final and perhaps most insidious element of producing a high-profile scientific research paper has to do with the clean, concise format of the presentation. These papers are required to be short, with only a few graphics, and thus there is little room for discussion of complicating factors or contradictory evidence. Furthermore, such discussions will weaken the argument that the findings deserve the high-profile venue. This incentivizes researchers to assemble and promote only the strongest evidence in favor of the case they are making. The data may be messy and contradictory, but that messiness has to be downplayed and the data shoehorned into a neat compelling story. This encouragement of confirmation bias is, of course, completely contradictory to the spirit of objective truth-seeking that many imagine animates the scientific enterprise.

Brown explains that despite the allowances he had to make to get his work published there still is value in it:

All this is not to say that I think my recent Nature paper is useless. On the contrary, I do think it advances our understanding of climate change’s role in day-to-day wildfire behavior. It’s just that the process of customizing the research for a high-profile journal caused it to be less useful than it could have been. I am now conducting the version of this research that I believe adds much more practical value for real-world decisions. This entails using more straightforward metrics over more relevant timeframes to quantify the impact of climate change on wildfire behavior in the context of other important influences like changes in human ignition patterns and changes in forest management practices.

Brown explains his motivations for this post and his new plans:

But why did I follow the formula for producing a high-profile scientific research paper if I don’t believe it creates the most useful knowledge for society? I did it because I began this research as a new assistant professor facing pressure to establish myself in a new field and to maximize my prospects of securing respect from my peers, future funding, tenure, and ultimately a successful career. When I had previously attempted to deviate from the formula I outlined here, my papers were promptly rejected out of hand by the editors of high-profile journals without even going to peer review. Thus, I sacrificed value added for society in order for the research to be compatible with the preferred narratives of the editors.

I have now transitioned out of a tenure-track academic position, and I feel liberated to direct my research toward questions that I think are more useful for society, even if they won’t make for clean stories that are published in high-profile venues. Stepping outside of the academy also removes the reservations I had to call out the perverse incentives facing scientific researchers because I no longer have to worry about the possibility of burning bridges and ruining my chances of ever publishing in a Nature journal again.

Brown concludes:

So what can shift the research landscape towards a more honest and useful treatment of climate change impacts? A good place to start would be for the editors of high-profile scientific journals to widen the scope of what is eligible for their stamp of approval and embrace their ostensible policies that encourage out-of-the-box thinking that challenges conventional wisdom. If they can open the door to research that places the impacts of climate change in the appropriate context, uses the most relevant metrics, gives serious treatment to societal changes in resilience, and is more honest about contradictory evidence, a wider array of valuable research will be published, and the career goals of researchers will be better aligned with the production of the most useful decision support for society.

My Conclusion

It is no wonder that all we hear from greenhouse gas emission reduction advocates is that climate change is an existential threat because the “science” says so.  Peeking around the curtain shows that the “science” has been perverted to reinforce and maintain this narrative.  I applaud Brown for giving insight into the way this is done.

This sums up a primary motivator for my work on this blog. New York’s planned transition to a net-zero economy is a solution to a non-existent problem.  I have shown that New York GHG emissions are less than one half of one percent of global emissions, and global emissions have been increasing on average by more than one half of one percent per year since 1990, so even if there were a problem our actions could not make a difference.  Worse, the so-called solution has enormous reliability risks, eye-watering costs, and under-evaluated environmental impacts.  There are no redeeming virtues to New York’s net-zero transition plan.
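The back-of-envelope arithmetic behind that claim can be sketched in a few lines of Python.  The figures below are illustrative round numbers, not official inventory values (the 50 GtCO2e global total is my assumption); the point is only that average annual global growth swallows New York’s entire emissions in well under a year.

```python
# Illustrative round numbers (assumptions, not official inventory figures):
GLOBAL_GT = 50.0   # global GHG emissions, GtCO2e per year (assumed)
NY_SHARE = 0.004   # New York: less than one half of one percent of global
GROWTH = 0.006     # global growth: more than one half of one percent per year

ny_gt = GLOBAL_GT * NY_SHARE            # New York's annual emissions
annual_growth_gt = GLOBAL_GT * GROWTH   # one year of global emissions growth

# How long does it take global growth to replace a total NY elimination?
years_to_offset = ny_gt / annual_growth_gt
print(f"NY emissions: {ny_gt:.2f} Gt, "
      f"offset by global growth in {years_to_offset:.2f} years")
```

Because the state’s share is smaller than the annual global growth rate, the ratio comes out below one year regardless of the assumed global total, which cancels out of the division.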

Articles of Note Relevant to the Climate Act

I have a “to-do” list of posts and analyses that I want to do.  Some items on the list are over a month old.  Rather than adding to that backlog with full posts about every relevant article I read, this post describes several articles that caught my attention.

I have been following the Climate Leadership & Community Protection Act (Climate Act) since it was first proposed, and most of this blog’s articles are related to it.  I have devoted a lot of time to the Climate Act because I believe the ambitions for a zero-emissions economy embodied in the Climate Act outstrip available renewable technology such that the net-zero transition will do more harm than good.  The opinions expressed in this article do not reflect the position of any of my previous employers or any other company I have been associated with; these comments are mine alone.

Weather and Climate

It seems that every day there is at least one article claiming that a recent extreme weather event is related to climate change caused by humans.  Roger Pielke, Jr. was prompted to write a post after the Lahaina, Maui fire disaster was linked to climate change.  His post Signal and Noise addresses this issue and includes the following lessons:

Just because the signal of climate change for particular variables cannot (yet) be detected in the context of historical variability does not mean that climate change is not real or important, and in many, if not most cases, a lack of signal is to be expected.

Natural variability is real and significant. It does not mean that climate change is not real or important, but that detecting signals is often difficult even when climate is changing and there is always a risk of erroneously detecting signals where none is present.

He concludes:

The challenges of detection and attribution should tell us that both adaptation and mitigation policies must be built upon a foundation that involves justifications for action that are much broader than climate change alone.

So far, climate advocates have sought to shape perceptions of science to support a climate-change-is-everything agenda. We will have a lot more success if we instead shape policy to align with what science actually says.

My pragmatic take concerns the tradeoff between the resources devoted to climate change mitigation and those devoted to extreme weather adaptation.  I was involved with emergency planning at a nuclear plant, so I have experience with this kind of planning.  From what I have seen, the emergency planning for Lahaina relative to an identified wildfire threat was criminally negligent.  Blaming climate change deflects blame from the guilty parties.  Trying to mitigate climate change displaces resources that would be better employed addressing existing weather problems.

I wrote the preceding paragraph before I read that Bjorn Lomborg said that politicians were blaming climate change for disasters like the wildfires on Maui to duck “responsibility” for “failures” in addressing them.  It is reassuring that my thoughts agree with him.

One final note about the Maui fires.  Professor Cliff Mass did an excellent job explaining what really happened: a high amplitude atmospheric wave forced by strong winds interacting with the mountains of northwest Maui.  He explains that:

It did not matter whether the grass or light vegetation were wet or dry the days or weeks before:  this extraordinary atmospheric animal would ensure they were dry enough to burn.   Prior dry conditions during the weeks before were immaterial.

With respect to the current state of the climate, Judith Curry, Jim Johnstone, and Mark Jelinek presented a “deep dive into the causes of the unusual weather/climate during 2023.  People are blaming fossil-fueled warming and El Nino, and now the Hunga-Tonga eruption and the change in ship fuels.  But the real story is more complicated.”  They conclude, among other things:

The exceptionally warm global temperature in 2023 is part of a trend of warming since 2015 that is associated primarily with greater absorption of solar radiation in the earth-atmosphere system.  This increase in absorbed solar radiation is driven by a slow decline in springtime snow extent, but primarily by a reduction in reflection from the atmosphere driven by reduced cloudiness and to a lesser extent a reduction in atmospheric aerosol.  Any increase in the greenhouse effect from increasing CO2 (which impacts the longwave radiation budget) is lost in the noise.

Climate Emergency

Alex Epstein writes that it would be inappropriate for the Biden Administration to declare a Climate Emergency.  He argues that there is no emergency because rising CO2 levels are:

  1. Not dire: Humans are safer from climate than ever.
  2. Not temporary: They will rise for decades.
  3. Not in our control: We emit 1/7 of CO2—and falling.

He makes the point that a government “emergency” declaration is a temporary increase in power that should only be used if a problem meets three criteria:

  • Dire: Unusually deadly
  • Temporary: Of limited duration
  • In our control: Actually solvable by our government

His conclusion is that none of the conditions necessary to declare a climate emergency have been met, and he goes on to support his arguments in detail.

Climate Act and Electric Vehicles

The Climate Act is a political animal.  While I focus primarily on the environmental and energy-related issues associated with GHG emission reductions, there is a social justice aka “Green New Deal” component that is a primary interest of many of the Act’s proponents.  The contradiction between advocating for zero GHG emissions, which will markedly increase energy prices and risk electric reliability in ways that fall hardest on those least able to afford the consequences, while at the same time demanding investments in disadvantaged communities has always seemed incongruous to me.  Nowhere is this tradeoff more stark than in the push for electric vehicles.

The CalMatters post Will California’s push on electric vehicles reduce inequality — or deepen it? touched on the issues that concern me.  It described a CalMatters panel discussion that addressed the question of whether California can make sure the electric vehicle revolution isn’t just for the wealthy few.

The post noted:

While bringing down the cost of EVs is crucial, so is the availability of chargers. And that is something of a chicken-and-egg proposition.

Some on the panel — moderated by CalMatters’ climate reporter Alejandro Lazo — called for building out the charging infrastructure in disadvantaged communities in advance, especially residential chargers.

  • Steve Douglas, vice president of energy and environment for the Alliance for Automotive Innovation: “You can’t ask low-income residents to spend an hour, three hours, six hours away from their families, every week, just to charge their car, while affluent people pull in, plug in and wake up to a full car.”

But others said without enough EV owners in a neighborhood, it’s a recipe for vandalism and disuse. 

  • Ted Lamm, senior research fellow in the climate program at UC Berkeley’s Center for Law, Energy, & the Environment: “When charging is installed in an area where there is no demand for the vehicles and no local desire to use them, it’s this sort of dead infrastructure. It has no use to the local population and local community, and so it is more likely to be subjected to vandalism, or just disuse and disrepair.”

Montana Court Climate Decision

A group of young people in Montana won a landmark lawsuit on August 14, when a judge ruled as unconstitutional the state’s failure to consider climate change when approving fossil fuel projects.  While this has been hailed as a turning point by the usual suspects, the reality is different.

David Wojick explained why the court decision was not a big deal.  He writes:

Much ado is being made from the supposed win of a kid’s climate lawsuit in Montana. The alarmists call it a victory, the skeptics a tragedy, but it is neither. What was won is almost funny, while the big ask was in fact denied. The climate kids won a little, but lost a lot.

On the win side the judge merely ruled that the Montana law forbidding consideration of GHG emissions in permitting was unconstitutional. How it is considered is up to the agency or legislature. This need not slow down or stop any project.

The Montana constitution says there is a right to a healthful environment. Alarmism says emissions are harmful which all Courts to date have bought, including this one. So given the possible harm, one cannot simply ignore emissions which the law said to do. Hence the decision to kill the law.

Gregory Wrightstone debunked the claim that Montana is a “major emitter of greenhouse gas emissions in the world” and the state’s emissions “have been proven to be a substantial factor” in affecting the climate.   He explains:

Montana’s CO2 emissions are 0.6% of the total U.S. emissions. If Montana had gone to zero emissions of CO2 in 2010, it would only avert 0.0004 degree Fahrenheit of greenhouse warming by 2050 and 0.001 degree by 2100, according to the MAGICC simulator, a tool created by a consortium of climate research institutes including the National Center for Atmospheric Research. These numbers are far below our ability to even measure and certainly not the “substantial factor” as claimed.

New York’s emissions are a greater proportion of total U.S. emissions, but I have found that their climate impact is also too small to measure and is likewise not a “substantial factor”.
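That comparison can be made concrete with a simple linear scaling of the MAGICC figures quoted above.  The assumption that averted warming scales linearly with a state’s share of U.S. emissions is my back-of-envelope simplification, not a MAGICC calculation, and the 3% share in the example is a hypothetical placeholder for a larger state, not New York’s actual inventory number.

```python
# Back-of-envelope scaling from the MAGICC figures quoted for Montana.
# Assumption (mine, not MAGICC's): averted warming scales linearly with a
# state's share of U.S. emissions.
MT_SHARE_OF_US = 0.006     # Montana: 0.6% of U.S. emissions (quoted above)
MT_AVERTED_2100_F = 0.001  # deg F averted by 2100 had Montana zeroed out in 2010

def averted_warming_2100_f(state_share_of_us: float) -> float:
    """Scale Montana's quoted MAGICC result to another state's U.S. share."""
    return MT_AVERTED_2100_F * (state_share_of_us / MT_SHARE_OF_US)

# Hypothetical example: a state emitting ~3% of the U.S. total (placeholder)
print(f"{averted_warming_2100_f(0.03):.3f} deg F")  # prints 0.005 deg F
```

Even at five times Montana’s share, the averted warming stays in the thousandths of a degree, far below measurement precision.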

The Montana Attorney General’s office considered arguing against the plaintiffs’ witnesses about the alleged harms of climate change.  They retained Dr. Judith Curry to prepare evidence but ended up not using it.  She explained the inside story, her written expert report, and why she was not asked to testify at the trial.  I found it fascinating, and it includes plenty of ammunition to debunk many of the arguments used by proponents of the net-zero transition.  This will be useful when the inevitable lawsuit is filed in New York.