There is No Existential Threat from Climate Change

Anthony Watts has summed up my problems with claims that climate change is an existential threat in a post entitled “Is Climate Change Real? Short Answer: Yes — But It’s Complicated.”  This post reproduces the article with my annotated comments.

I am convinced that implementation of the New York Climate Leadership & Community Protection Act (Climate Act) net-zero mandates will do more harm than good if the future electric system relies only on wind, solar, and energy storage because of reliability and affordability risks.  I have followed the Climate Act since it was first proposed, submitted comments on the Climate Act implementation plan, and have written over 500 articles about New York’s net-zero transition.  I am also an air pollution meteorologist with Bachelor and Master of Science degrees in meteorology and was a Certified Consulting Meteorologist before I retired with nearly 50 years of experience. The opinions expressed in this article do not reflect the position of any of my previous employers or any other organization I have been associated with; these comments are mine alone.

Overview

The Climate Act established a New York “Net Zero” target (85% reduction in GHG emissions and 15% offset of emissions) by 2050.  The Climate Leadership & Community Protection Act Section 1. Legislative findings and declaration, subsection 3 defines the alleged threat and goal: “Action undertaken by New York to reduce greenhouse emissions will have an impact on global greenhouse gas emissions and the rate of climate change.”  I have tried to argue against this point many times, but I think Watts has provided a concise, well-documented case that the basic premise that New York can have an effect on the rate of climate change is misplaced.

Is Climate Change Real?

Anthony Watts prepared the post addressing this question because he gets asked this a lot.  His response to the question shows that New York does not need to rush to comply with the aspirational Climate Act schedule and targets set by politicians during the always rushed and hectic New York budget process.  Watts provides a simple primer that makes five key points.  Note that all the bold passages in the following quotes were highlighted by Watts.

1. The Basics: Climate Does Change

His first point cuts to the nub of the problem.  Climate change is real and is always occurring.  That makes it easy for everyone to have an impression that the climate isn’t what it used to be.

First, let’s be clear — climate change is real in the literal sense. The Earth’s climate has been changing for billions of years. We have geological records showing periods that were much warmer (like the Eocene, with crocodiles in the Arctic), and much colder (like the Ice Ages that covered North America in glaciers).

Even more recently, we have the Holocene Climate Optimum, significantly warmer than present day:

Watts explains that there is a nuance to the fact that the climate is changing.  Those nuances are being ignored as he notes:

So, yes — the climate changes, and it always has. The debate isn’t about whether it changes, but why, how fast, and how much humans are influencing it today. The debate is also about how accurately we are able to detect temperature change, plus the overreliance on climate models to predict the future rather than actual data.

2. What the “Consensus” Says — and Where It Falls Short

Folks like me who publicly decry the claim of an existential threat must confront the consensus argument he describes. 

The mainstream position (IPCC, NOAA, NASA, etc.) holds that recent warming — roughly 1.1°C since the late 1800s — is largely due to increased CO₂ from human activity, mainly fossil fuels.

But here’s the rub: this view is heavily dependent on climate models, which are notoriously uncertain.

The fact that the extreme risks claimed are based on models is frustrating because I know the limitations of model projections and they never get mentioned in the mainstream coverage of climate change. The only thing I would add to his remarks is that he could have included many more issues.

As someone with a meteorology background, I can tell you models struggle with cloud feedbacks, ocean cycles, solar variability, and regional forecasts — all of which are crucial to understanding climate.

When models are run backward, they often fail to replicate past climate variability accurately — like the Medieval Warm Period or the Little Ice Age — unless they’re tuned heavily. That calls into question their reliability for long-term projections.

3. Natural Variability: The Elephant in the Room

As Watts explains, natural variability is not understood well.  I think the thing to keep in mind is that this variability is driven in large part by the patterns of the upper air steering currents like the jet stream.  The massive flooding due to Hurricane Helene in western North Carolina was caused by a rare weather pattern that stalled the storm in one place.  A similar pattern occurred in 1916, so today’s level of CO2 and warming were not the cause.  Unfortunately, we don’t know what caused that pattern or if it was just normal variability.  Watts describes the variability of observed warming:

A lot of warming in the 20th century happened before CO₂ rose sharply post-WWII. For example:

  • The warming from 1910 to 1940 occurred with much lower CO₂ levels.
  • Then there was a cooling trend from the 1940s to 1970s, despite rising CO₂ emissions during that time period.

Clearly, natural factors — like solar cycles, ocean oscillations (PDO, AMO), volcanic activity, and cloud dynamics — are still in play and possibly underestimated in mainstream assessments.

Keep in mind that the consensus says that the recent warming was caused by GHG emissions, but I don’t see any big difference between that warming and the previous one that was “natural”.  We know there are natural factors in play but we don’t understand them well enough to be able to discern what the impact of the greenhouse effect is relative to them.

4. The CO₂ Connection: Overstated?

The second complicating factor is that the greenhouse effect is real and increased CO2 in the atmosphere should also increase warming.  However, as Watts explains, even that fact is conditional on at least one factor rarely mentioned.

CO₂ is a greenhouse gas, no question. But its effect on temperature is logarithmic — meaning, the more CO₂ you add, the less warming you get per unit. The first 100 ppm has the biggest impact, and we’re well past that as seen in the figure below.

Moreover, satellite data from UAH and RSS shows a slower warming trend than surface datasets like HadCRUT or GISS. That discrepancy raises questions about data adjustments, urban heat island effects, and instrument biases.

I addressed a couple of warming trend issues in two recent articles about measuring temperature trends here and here.  This primer just touches the surface of the issues.
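To make the logarithmic point concrete, here is a minimal sketch using the widely cited simplified forcing expression ΔF ≈ 5.35·ln(C/C₀) W/m² (Myhre et al., 1998).  The example is my own, not from Watts’s post, and it only illustrates the diminishing-returns shape of the curve, not a temperature prediction:

```python
import math

# Simplified CO2 radiative forcing (Myhre et al. 1998): dF = 5.35 * ln(C/C0) W/m^2.
# Illustration only -- this is not a temperature projection.
def co2_forcing(c_new_ppm: float, c_ref_ppm: float) -> float:
    """Incremental radiative forcing (W/m^2) for a CO2 change from c_ref to c_new."""
    return 5.35 * math.log(c_new_ppm / c_ref_ppm)

# The forcing added by each successive 100 ppm increment shrinks as concentration rises.
levels = [100, 200, 300, 400, 500, 600]
for low, high in zip(levels, levels[1:]):
    print(f"{low:>3} -> {high:>3} ppm: +{co2_forcing(high, low):.2f} W/m^2")
# 100 -> 200 ppm: +3.71 W/m^2
# ...
# 500 -> 600 ppm: +0.98 W/m^2
```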

5. Are We in a Crisis?

Ultimately the only reason we are being forced to endure the insane transition policies that defy physics, math, and economics is the existential threat.  Watts points out problems with that claim. 

Even if we accept that humans are influencing climate, the notion that we’re in an “existential crisis” is unproven. Extreme weather trends (hurricanes, tornadoes, droughts) don’t show clear worsening patterns once you account for improved detection and population growth in vulnerable areas such as coastal developments.

The Intergovernmental Panel on Climate Change (IPCC) agrees, suggesting a “low confidence” in many current and future weather events being affected by climate change. The “existential crisis” view is heavily dependent on climate model projections, which are notoriously uncertain and refuted by data.

Sea level is rising — but at a slow, linear pace of about 3 mm/year. That’s about 12 inches per century, similar to what’s been observed since before industrial CO₂ emissions.

Away from the bluster and hype, the real-world evidence is clear that even if there is a potential for massive impacts due to climate change, the pace observed is slow and not accelerating.  That means that we have time to consider and modify the politically motivated schedule of the New York Climate Act.

Bottom Line

I cannot conclude this post any better than Anthony Watts did in his bottom line.

Yes, the climate is changing. It always has. The idea that global climate must be unchanging is simply wrongheaded. The real issue is how much of today’s change is due to human activity, how reliable our predictions are, and whether proposed policy responses are justified — or likely to do more harm than good.

At Watts Up With That, we’ve been pointing out for years that this issue is riddled with confirmation bias, model overconfidence, and selective reporting. There is no justification for shutting down economies or reshaping civilization based on the incomplete science of climate change.

So yes, climate change is real, but no, it’s not a crisis.

Temperature Trend Measurement Uncertainty

Late last year I published an article that described the difficulties involved with a fundamental aspect of the climate change debate – measuring global temperature trends.   This article describes an analysis of a data set that compares two different ways to calculate the daily temperatures used to determine global temperature trends.  Ray Sanders reproduced Stephen Connolly’s description of an analysis that shows how temperature measurement techniques affect trend interpretation.

The opinions expressed in this article do not reflect the position of any of my previous employers or any other organization I have been associated with; these comments are mine alone.

Background

My fifty-odd year career as an air pollution meteorologist in the electric utility sector has always focused on meteorological and pollution measurements.  Common measurement challenges are properly characterizing the parameter in question, measuring it in such a way that the location of the sensor does not affect the results, and, when operating a monitoring system, verifying the data and checking for trends.  On the face of it, that is easy.  In reality, it is much more difficult than commonly supposed.

I prepared the previous article to highlight recognized instrumental and observational biases in the temperature measurements.  One problem is measurement methodology.  The longest running instrumental temperature record is the Central England Temperature (CET) series that extends back to 1659.  In the United States temperature data are available back to the mid-1800s. In both cases the equipment and observing methodology changed and that can affect the trend.  Too frequently, when observing methods change there is no period of coincident measurements that would enable an analysis of potential effects on the trends.

Only recently have computerized data acquisition systems been employed that do not rely on manual observations, and even now many locations still rely on an observer.  For locations where temperature records are still manually collected, observers note the maximum and minimum temperature recorded on an instrument that measures both values daily.  A bias can be introduced if the time of observation changes.  If observations are taken and the max-min thermometers are reset near the time of daily highs or lows, then an extreme event can affect two days and the resulting long-term averages.  Connolly’s work addresses another bias of this methodology.
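Before moving on, the time-of-observation bias just described is easier to see with a toy example.  The sketch below is my own construction, not from any of the cited work: it resets an idealized max-min thermometer at two different hours and shows how a single hot afternoon can inflate two consecutive daily maxima when the reset falls near the daily high.

```python
import numpy as np

# Toy illustration (mine, not from the post): an idealized diurnal cycle with one
# unusually hot afternoon on day 2, sampled hourly over three days.
hours = np.arange(72)
temps = 15 + 8 * np.sin(2 * np.pi * (hours - 9) / 24)  # peaks mid-afternoon
temps[38:42] += 6  # heat spike late on day 2 (hours 38-41)

def daily_maxima(temps, reset_hour):
    """Max recorded in each 24 h window ending at the observer's reset hour."""
    resets = [reset_hour + 24 * d for d in range(1, 3)]
    return [temps[r - 24:r].max() for r in resets]

print("reset at midnight:", daily_maxima(temps, 24))   # spike counted once
print("reset at 5 PM:    ", daily_maxima(temps, 17))   # spike inflates two 'days'
```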

Uncertainty Caused by Averaging Methodology

The issue that Stephen Connolly addressed in his work was the bias introduced when a station converts from manual measurements of maximum and minimum temperatures to a system with a data acquisition system. Typically, those data acquisition systems make observations every second, compute and save minute averages, and then calculate and report hourly and daily averages.

Ray Sanders explained that he came across Stephen Connolly’s analysis of temperature averages based on data from the Valentia weather station on the southwest coast of the Republic of Ireland. He asked Stephen if he could refer to his work, to which he agreed on the condition he duly credited him. So by way of a second-hand proxy “guest post” I have reproduced Stephen’s unadulterated X post at the end of this article.  I offer my observations on key parts of his work in the following.

I highly recommend Connolly’s article because he does a very good job explaining how sampling affects averages.  He describes the history of temperature measurements in a more comprehensive way than I did in my earlier post.  He explains that the daily average temperature reported from a manual observation station calculated as the average of the maximum and minimum temperature (Taxn) is not the same as an average of equally spaced observations over a 24-hour period (Tavg).  Using a couple of examples, he illustrates the uncertainties introduced because of the sampling differences. 
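For concreteness, here is a minimal sketch of the two daily-average definitions Connolly compares, using made-up hourly data (the numbers are mine, not his):

```python
import numpy as np

# Hypothetical 24 hourly readings for one day; the asymmetric afternoon warm
# spell makes the two "daily average" definitions disagree.
hourly = np.array([8, 7, 7, 6, 6, 6, 7, 9, 11, 13, 15, 17,
                   18, 19, 19, 18, 16, 14, 12, 11, 10, 10, 9, 9], dtype=float)

taxn = (hourly.max() + hourly.min()) / 2   # manual max-min method
tavg = hourly.mean()                        # data-acquisition-system method

print(f"Taxn = {taxn:.2f} C, Tavg = {tavg:.2f} C, difference = {tavg - taxn:+.2f} C")
# For this made-up day the two definitions differ by about 1 C, the same order
# as the spread Connolly finds at Valentia.
```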

Connolly goes on to explain that:

In 1944 Met Éireann did something a bit unusual, they started measuring the temperature every hour. Rain or shine, sleet or snow, the diligent staff of Met Éireann would go out to the weather station and record the temperature. Between January 1944 and April 2012 when the station was replaced with an automated station only 2 hours were missed.

The data enabled Connolly to compare the two techniques to calculate the daily average temperature.  In his first graph he plots the difference between the two techniques as blue points. Overlaid is the 1 year rolling average as a red line. He states that Tavg is greater than Taxn in Valentia on average by 0.17°C (std deviation 0.53, N=29339, min=-2.20, max=3.20).

Connolly plots the difference between the two averaging approaches and notes that:

If we just look at the rolling average, you can see that the relationship is not constant, for example in the 1970’s the average temperature was on average 0.35ºC warmer than the Meteorological estimate, while in the late 1940’s, 1990’s and 2000’s there were occasions where the Meteorological estimate was slightly higher than the actual average daily temperature.

He goes on:

It’s important to highlight that this multi-year variability is both unexpected and intriguing, particularly for those examining temperature anomalies. However, putting aside the multi-year variability, by squeezing nearly 30,000 data points onto the x-axis we may have hidden a potential explanation why the blue points typically show a spread of about ±1ºC… Is the ±1°C spread seasonal variability?

The shortest day of the year in Valentia is December 21st when the day lasts for approximately 7h55m. The longest day of the year is June 21st when the day lasts for 16h57m. On the shortest day of the year there is little time for the sun to heat up and most of the time it is dark and we expect heat to be lost. So we expect the average temperature to be closer to the minimum temperature during the winter than during the summer.

I found this line of reasoning interesting:

We can check the seasonal effects in the difference between Tavg and Taxn by looking at a time dependent correlation. As not everyone will be familiar with this kind of analysis, I will start by showing you the time dependent correlation of Tavg with itself in the following graph.

The x-axis is how many days there are between measurements and the y-axis is the Pearson correlation coefficient, known as r, which measures how similar the measurements are, averaged across all the data. A Pearson correlation coefficient of +1 means that the changes in one are exactly matched by changes in the other, a coefficient of -1 means that the changes are exactly opposite, and a correlation coefficient of 0 means that the two variables have no relationship to each other.  The first point on the x-axis is for 1 day separation between the average temperature measurements.

When I was in graduate school a half century ago, weather forecasting performance was judged relative to two no-skill approaches we called persistence and climatology.  Connolly explains that persistence is assuming that “Tomorrow’s weather will be basically the same as today’s”.  This graph shows that the approach is approximately 82% accurate.

The graph also illustrates the accuracy of the second no-skill forecast – climatology.  In other words, the climatology forecast for the average temperature is simply the average for the date.  At a year separation the r value of 0.67 says that 44% of today’s average temperature can be explained as seasonal for this time of year.  What this means is that the persistence forecast only explains 38% more of the variance than the climatological forecast.
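Here is a sketch of the lagged (time-dependent) correlation and the skill arithmetic in the two paragraphs above; the synthetic series is my own stand-in, not Connolly’s Valentia data:

```python
import numpy as np

def lag_corr(x: np.ndarray, lag: int) -> float:
    """Pearson correlation between a series and itself shifted by `lag` days."""
    return np.corrcoef(x[:-lag], x[lag:])[0, 1]

# Synthetic daily temperatures: a seasonal cycle plus persistent "weather" noise.
rng = np.random.default_rng(0)
days = np.arange(365 * 30)
seasonal = 10 + 7 * np.sin(2 * np.pi * days / 365.25)
noise = np.zeros(days.size)
for i in range(1, days.size):              # AR(1) day-to-day persistence
    noise[i] = 0.8 * noise[i - 1] + rng.normal(0, 1.5)
temps = seasonal + noise

r1, r365 = lag_corr(temps, 1), lag_corr(temps, 365)
# r^2 is the fraction of variance "explained": Connolly's 82% persistence and
# ~44% climatology figures come from squaring r (e.g., 0.67^2 ~ 0.45).
print(f"1-day lag:  r = {r1:.2f}, r^2 = {r1**2:.0%}")
print(f"1-year lag: r = {r365:.2f}, r^2 = {r365**2:.0%}")
```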

Connolly notes that the maximum and minimum temperatures behave the same and concludes that the above graph basically tells us what to expect when something is strongly seasonal.

Connolly goes on to ask what happens when we plot the time-dependent correlation of Tavg-Taxn? He shows the results in the following graph.

The 1 day correlation is 0.19, this tells us that approximately 4% of today’s correction factor between Tavg and Taxn can be predicted if we know yesterday’s correction factor. The seasonality is even worse, the 6 month correlation coefficient is -0.02 and the 1 year correlation coefficient is +0.07.

He points out that this answers the question of whether this is seasonal variability and concludes that the ±1°C spread is not seasonal variability.  The important point of this work is that if we only know the daily average temperature as the average of the maximum and minimum temperatures, then compared to the average measured with a data acquisition system, the two methodologies could differ by up to ±1°C.

He provides another graph to illustrate this.

The x-axis is Tavg and the y-axis is Taxn. Now obviously when the average daily temperature is higher, the average of the minimum and maximum temperatures is also higher and so we get a straight line of slope 1, but the thickness of the line represents the uncertainty of the relationship, so if we know Taxn is say 15°C then from this graph we can say that Tavg is probably between 13.5°C and 16.5°C.

Here is the important point:

Now because most weather stations were not recording hourly until recently, most of our historical temperature data is the Taxn form and not the Tavg. That means that if Valentia is representative then the past temperature records are only good to ±1°C. If somebody tells you that the average temperature in Valentia on the 31st of May 1872 was 11.7°C, the reality is that we just do not know. It’s 95% likely to have been somewhere between 10.6ºC and 12.7ºC.

He ends his analysis with another graph.

In this last graph the blue points show the average Taxn of each year at Valentia since 1873 with vertical error bars showing the 95% confidence interval. The red points show the average Tavg for each year starting from 1944 with error bars showing the annual variation. The blue poking out from under the red shows the difference, even on the scale of a yearly average between the Meteorologist’s estimate of average temperature and the actual average temperature.

Discussion

Connolly explains:

Valentia Observatory is one of the best weather stations globally. With the switch to automated stations in the 1990s, we can now get precise average temperatures.  Thanks to the meticulous efforts of past and present staff of Valentia Observatory and Met Éireann, we have 80 years of data which allows comparison of the old estimation methods with actual averages.

The takeaway from Connolly’s evaluation of these data is that our “historical temperature records are far less accurate than we once believed.”

I second Sanders’ acknowledgement of the work done by Connolly:

I would like to thank Stephen for allowing me to refer to his excellent research. Whatever one’s views are on the validity of the historic temperature record of the UK, this evaluation has again highlighted one area of many where there are significant questions to be asked regarding long term accuracy.

Conclusion

I would like to thank Stephen for allowing the posting of this excellent research.  One fundamental truth I have divined in my long career is that observed data are always more trustworthy than any model projection.  However, there are always limitations to the observed data that become important when trying to estimate a trend. 

I think these results are important because they highlight an uncertainty that climate catastrophists ignore.  I will concede that average temperatures are likely warming, but the uncertainty around how much is within the observational uncertainty.  In other words, the magnitude of the observed warming is not well known.  The science is not settled on the amount of warming observed.

Climate Science New Year Rant

As I age, I am becoming less willing to play along with the Climate Leadership & Community Protection Act (Climate Act) narrative that there is an existential threat to mankind from man-made climate change and that an energy system that relies on wind, solar, and energy storage can solve that threat.  One aspect of playing along is to appease supporters by accepting that there is a reason to reduce GHG emissions and agreeing that solar and wind resources should be part of the future electric energy system.  Ron Clutz’s recent article “Lacking data, climate models rely on guesses” included information that spurred this article.

I am convinced that implementation of the New York Climate Act net-zero mandates will do more harm than good if the future electric system relies only on wind, solar, and energy storage because of reliability and affordability risks.  I have followed the Climate Act since it was first proposed, submitted comments on the Climate Act implementation plan, and have written over 480 articles about New York’s net-zero transition.  The opinions expressed in this article do not reflect the position of any of my previous employers or any other organization I have been associated with; these comments are mine alone.

The Climate Act established a New York “Net Zero” target (85% reduction in GHG emissions and 15% offset of emissions) by 2050.  The authors of the Climate Act believed that “our State could rapidly move away from fossil fuels and instead be fueled completely by the power of the wind, the sun, and hydro” and “that it could be done completely with technologies available at that time (a decade ago)”.  In my opinion we need a feasibility analysis to determine if this presumption is correct.  This article addresses two questions: should we be trying to reduce GHG emissions in hopes of affecting the climate, and, even if we accept that decarbonization is a worthy goal, should we try to rely on wind and solar?

Is There an Existential Threat?

Keep in mind that climate models provide all the evidence that there is an existential threat.  Despite the constant claims in the mainstream media, attributing extreme weather events to man-made climate change is a claim no one without a vested interest in that answer is willing to make.  Ron Clutz’s recent article “Lacking data, climate models rely on guesses” described the response to a question about climate model accuracy by Dr. Keith Minor.  The following reproduces parts of the summary from Ron’s post.

A recent question was posed on  Quora: Say there are merely 15 variables involved in predicting global climate change. Assume climatologists have mastered each variable to a near perfect accuracy of 95%. How accurate would a climate model built on this simplified system be?  Keith Minor has a PhD in organic chemistry, PhD in Geology, and PhD in Geology & Paleontology from The University of Texas at Austin. 

Minor responded with bolds by Clutz:

I like the answers to this question, and Matthew stole my thunder on the climate models not being statistical models. If we take the question and it’s assumptions at face value, one unsolvable overriding problem, and a limit to developing an accurate climate model that is rarely ever addressed, is the sampling issue. Knowing 15 parameters to 99+% accuracy won’t solve this problem.

The modeling of the atmosphere is a boundary condition problem. No, I’m not talking about frontal boundaries. Thermodynamic systems are boundary condition problems, meaning that the evolution of a thermodynamic system is dependent not only on the conditions at t > 0 (is the system under adiabatic conditions, isothermal conditions, do these conditions change during the process, etc.?), but also on the initial conditions at t = 0 (sec, whatever). Knowing almost nothing about what even a fraction of a fraction of the molecules in the atmosphere are doing at t = 0 or at t > 0 is a huge problem to accurately predicting what the atmosphere will do in the near or far future.

These problems boil down to the challenge of measuring the meteorological parameters necessary to initialize weather and climate models.  The reference to t = 0 relates to the start time of the model.  Minor explains that there are many sources of variability within the models themselves too, including (a toy illustration of the initial-condition sensitivity follows this list):

  • The inability of the models to handle water (the most important greenhouse gas in the atmosphere, not CO2) and processes related to it;  e.g., models still can’t handle the formation and non-formation of clouds;
  • The non-linearity of thermodynamic properties of matter (which seem to be an afterthought, especially in popular discussions regarding the roles that CO2 plays in the atmosphere and biosphere), and
  • The always-present sampling problem.
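Minor’s initial-condition point is the classic sensitivity at the heart of atmospheric modeling.  As a toy illustration (my own construction, not anything from Minor’s answer), the Lorenz-63 system shows two runs that start a millionth apart in one variable diverging completely:

```python
import numpy as np

def lorenz_step(state, dt=0.01, sigma=10.0, rho=28.0, beta=8.0 / 3.0):
    """One forward-Euler step of the Lorenz-63 system, a toy model of convection."""
    x, y, z = state
    return np.array([x + dt * sigma * (y - x),
                     y + dt * (x * (rho - z) - y),
                     z + dt * (x * y - beta * z)])

a = np.array([1.0, 1.0, 1.0])
b = a + np.array([1e-6, 0.0, 0.0])   # perturb the initial condition at t = 0

for step in range(1, 3001):
    a, b = lorenz_step(a), lorenz_step(b)
    if step % 1000 == 0:
        print(f"t = {step * 0.01:5.1f}: separation = {np.linalg.norm(a - b):.4f}")

# The separation grows from 1e-6 to the size of the attractor itself: tiny
# errors in the t = 0 state swamp the forecast, which is the sampling problem
# Minor describes.
```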

Minor goes on to describe how these issues affect weather forecasting and how more sampling could improve certain forecasts.  He concludes:

So back to the Quora question, with regard to a cost-effective (cost-effect is the operational term) climate model or models (say an ensemble model) that would “verify” say 50 years from now, the sampling issue is ever present, and likely cost-prohibitive at the level needed to make the sampling statistically significant. And will the climatologist be around in 50 years to be “hoisted with their own petard” when the climate model is proven to be wrong? The absence of accountability is the other problem with these long-range models into which many put so much faith.

Clutz also references a quote by esteemed climate scientist Richard Lindzen that I think sums up whether we should rely on climate models to make the policy decision to transition away from fossil fuels.   In a presentation (here) Lindzen states:

I haven’t spent much time on the details of the science, but there is one thing that should spark skepticism in any intelligent reader. The system we are looking at consists of two turbulent fluids interacting with each other. They are on a rotating planet that is differentially heated by the sun. A vital constituent of the atmospheric component is water in the liquid, solid and vapor phases, and the changes in phase have vast energetic ramifications. The energy budget of this system involves the absorption and re-emission of about 200 watts per square meter. Doubling CO2 involves a 2% perturbation to this budget. So do minor changes in clouds and other features, and such changes are common. In this complex multifactor system, what is the likelihood of the climate (which, itself, consists in many variables and not just globally averaged temperature anomaly) is controlled by this 2% perturbation in a single variable? Believing this is pretty close to believing in magic. Instead, you are told that it is believing in ‘science.’ Such a claim should be a tip-off that something is amiss. After all, science is a mode of inquiry rather than a belief structure.

Can We Transition Away from Fossil Fuels?

A recurrent theme at this blog is that the electric energy system absolutely needs new technology to achieve decarbonization.  Responsible New York agencies all agree that new Dispatchable Emissions-Free Resource (DEFR) technologies are needed to make a solar and wind-reliant electric energy system work reliably.  Because DEFR is needed and because we don’t know what should be used, I think that the Climate Act schedule needs to be reconsidered or at least paused.

I believe the only likely viable DEFR backup technology is nuclear generation because it is the only candidate resource that is technologically ready, can be expanded as needed, and does not suffer from limitations of the Second Law of Thermodynamics.  I do concede that there are financial issues that need to be addressed.  The bigger issue is that DEFR is needed as a backup during extended periods of low wind and solar resource availability, but nuclear power is best used for baseload energy.  I estimate that 24 GW of nuclear could replace 178 GW of wind, water, and battery storage.  Developing nuclear eliminates the need for a huge DEFR backup resource and a massive buildout of wind turbines and solar panels sprawling over the state’s lands and waters.  Until the New York Energy Plan settles on a DEFR solution the only rational thing to do is to pause the implementation process.
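To show where a comparison like 24 GW of nuclear versus 178 GW of wind, water, and storage can come from, here is a back-of-envelope annual-energy equivalence.  The capacity factors are round numbers I assume for illustration, not figures from any New York planning document:

```python
# Back-of-envelope annual-energy equivalence (capacity factors are assumed
# round numbers for illustration, not official New York planning values).
HOURS_PER_YEAR = 8760

nuclear_gw, nuclear_cf = 24, 0.90   # nuclear runs near flat-out
blend_cf = 0.12                     # assumed blended CF for a wind/solar/storage
                                    # portfolio after storage losses

nuclear_twh = nuclear_gw * nuclear_cf * HOURS_PER_YEAR / 1000
equivalent_gw = nuclear_gw * nuclear_cf / blend_cf

print(f"24 GW nuclear ~ {nuclear_twh:.0f} TWh/yr")
print(f"Nameplate renewables for the same energy: ~{equivalent_gw:.0f} GW")
# With these assumptions the nameplate requirement lands near 180 GW; the exact
# ratio depends entirely on the capacity factors chosen.
```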

Lest you think that I am the only skeptical voice about the viability of an electrical energy transition relying on wind and solar resources I list some recent articles below.

Thomas Shepstone describes a fact sheet from the Empowerment Alliance that outlines why the electric grid is headed to a crisis:

America’s electrical grid is on the brink of a crisis that no one is talking about. Government mandates and pledges from utilities to achieve “net zero” emissions by 2050 or sooner have led to the closure of traditional power plants fueled by coal, natural gas and nuclear energy.

However, the wind and solar energy that is supposed to replace these sources is intermittent, unreliable and artificially supported by government subsidies. “Net zero” policies may sound nice on paper but they are not ready for practice in the real world.

In fact, the crisis may have already begun. A recent capacity auction by the largest U.S. electrical grid operator resulted in an over 800% price increase for these very reasons. And, everyday Americans are going to pay the price through higher bills for less reliable electricity.

  • One study of electricity plans in the Midwest found that, “Of the 38 major investor-owned utilities spanning the Great Lakes region, 32 are pledged to net zero by 2050 or sooner. Of the seven states analyzed in this report, three have net zero mandates by law, one has net zero mandates through regulation and the other three have no net zero mandates at the state level.”
  • “The Midcontinent Independent Systems Operator, the grid operator for much of the Midwest, projects that by 2032, none of the five Great Lakes states in its territory will have enough electricity capacity to meet even the most conservative projection of demand load.”
  • “Wind and solar cannot be relied on as a one-for-one replacement of existing generation sources, like coal, natural gas and nuclear. If the grid relies on forms of generation that are uncontrollable and unreliable, it must also maintain backup sources that are controllable and reliable. Because wind and solar production can fall to near zero at times, utilities may need to maintain up to another grid’s worth of generation capacity.”

Source:

Joshua Antonini and Jason Hayes, “Shorting the Great Lakes Grid: How Net Zero Plans Risk Energy Reliability,” Mackinac Center for Public Policy, 2024

Thomas Shepstone describes a report by the Fraser Institute regarding the real costs of electricity produced from solar and wind facilities, compared to other energy sources.  Tom highlights the money paragraphs with his emphasis added:

Often, when proponents claim that wind and solar sources are cheaper than fossil fuels, they ignore [backup energy] costs. A recent study published in Energy, a peer-reviewed energy and engineering journal, found that—after accounting for backup, energy storage and associated indirect costs—solar power costs skyrocket from US$36 per megawatt hour (MWh) to as high as US$1,548 and wind generation costs increase from US$40 to up to US$504 per MWh.

Which is why when governments phase out fossil fuels to expand the role of renewable sources in the electricity grid, electricity become more expensive. In fact, a study by University of Chicago economists showed that between 1990 and 2015, U.S. states that mandated minimum renewable power sources experienced significant electricity price increases after accounting for backup infrastructure and other costs. Specifically, in those states electricity prices increased by an average of 11 per cent, costing consumers an additional $30 billion annually. The study also found that electricity prices grew more expensive over time, and by the twelfth year, electricity prices were 17 per cent higher (on average).

Finally, Chris Martz compares the impacts of wind and solar vs. nuclear power. I should note that he is not including DEFR support in his estimates. He concludes:

In order to power the same number of homes that a 1,000 MW nuclear power plant can, it would require either:

• For solar PV: Approximately 4,000 MW of installed power (equivalent to four nuclear facilities) and 24,000 acres of land (some 37.5× as much land area as a nuclear plant).

• For onshore wind: Approximately 2,800 MW of installed power (equivalent to 2.8 nuclear facilities) and 89,600 acres of land (some 140× as much land area as a nuclear power generation station).

But, I should caution you that these estimates are in fact conservative. Why? Because they do not take into consideration the land area required for battery storage due to their intermittency in overcast sky conditions, low wind speed and/or overnight.
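Martz’s nameplate figures can be roughly reproduced from capacity-factor arithmetic.  The sketch below uses capacity factors I assume for illustration (nuclear ~0.93, solar PV ~0.24, onshore wind ~0.33); they are not stated in the quote:

```python
# Reproducing the rough shape of the comparison from assumed capacity factors
# (nuclear ~0.93, solar PV ~0.24, onshore wind ~0.33 -- my assumptions, not
# values stated in the quoted post).
nuclear_mw, cf_nuclear = 1000, 0.93
avg_output = nuclear_mw * cf_nuclear        # MW of average delivered power

for name, cf in [("solar PV", 0.24), ("onshore wind", 0.33)]:
    nameplate = avg_output / cf
    print(f"{name}: ~{nameplate:,.0f} MW nameplate to match 1,000 MW nuclear")
# solar PV: ~3,875 MW       (the quote's "approximately 4,000 MW")
# onshore wind: ~2,818 MW   (the quote's "approximately 2,800 MW")
```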

Conclusion

It is terrifying that the rationale and proposed solution to a New York policy that could cost hundreds of billions is based on fantasy.  Richard Lindzen describes the made-up rationale: “In this complex multifactor system, what is the likelihood of the climate (which, itself, consists in many variables and not just globally averaged temperature anomaly) is controlled by this 2% perturbation in a single variable? Believing this is pretty close to believing in magic.”  Keith Minor explains that even if this perturbation were the climate change driver, we could never provide enough data to ensure that a model could accurately project the impacts.  The myth that wind and solar can replace fossil fuels on the schedule mandated by the Climate Act is dependent upon the fantastical notion that a resource that does not exist can be developed, tested, permitted, and deployed by 2040.

I can only conclude that allowing politicians to set energy policy will turn out to be an unmitigated disaster.

Measuring Global Temperature Trends

The subject of global warming has been a primary focus of this blog since the beginning.  I think it is obvious that I am skeptical of the narrative that there is an existential threat of climate change.  This post describes one of the reasons for my skepticism – the unrecognized difficulty of measuring long-term temperature trends. 

The opinions expressed in this article do not reflect the position of any of my previous employers or any other organization I have been associated with; these comments are mine alone.

Background

My fifty-odd year career as an air pollution meteorologist in the electric utility sector has always focused on meteorological and pollution measurements.  Common measurement challenges are properly characterizing the parameter in question, measuring it in such a way that the location of the sensor does not affect the results,  and, when operating a monitoring system, verifying the data and checking for trends.  On the face of it, that is easy.  In reality, it is much more difficult than commonly supposed.

According to the Britannica website, global warming is “the phenomenon of increasing average air temperatures near the surface of Earth over the past one to two centuries”, and the site states that “the best estimate of the increase in global average surface temperature between 1850 and 2019 was 1.07 °C (1.9 °F).” This post will only address how it is warming and not why it is warming.  However, keep in mind that the interest in global temperature trends is related to the supposition that mankind has added greenhouse gases to the atmosphere that impact temperature trends everywhere.

Temperature Trend Measurement Issues

It has been my experience that anything associated with climate change issues is more complicated than it appears at first.  Britannica claims global warming has been the change in the surface temperature since 1850.  When I was responsible for setting up a meteorological monitoring network my first concern was the general location of the monitoring sites relative to the goal of the program.  I wanted to site monitors evenly across the area of concern to represent what was happening.  In this case, where and how should we sample for a global average?

The first global warming measurement challenge is representativeness.  Consider that 70% of the earth’s surface is covered by water and that long-term measurements are only available where people have been living.  Long-term measurements in the oceans are on islands and human settlements are not evenly distributed across the globe.  The Argo program addresses the ocean temperature representativeness issue with a system of 3,000 instrumented floats but it only has data since November 2007.    The Britannica global average temperature is not based on a representative global sample since 1850.
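One concrete aspect of “where and how should we sample” is that a global mean must weight observations by the area they represent.  Here is a minimal sketch (my own, with a made-up zonal-mean profile) of cosine-latitude weighting for a gridded field:

```python
import numpy as np

# Grid cells shrink toward the poles, so a global mean must weight each
# latitude band by cos(latitude). Sketch with a hypothetical zonal profile.
lats = np.arange(-87.5, 90, 5)                # band centers, degrees
zonal_temp = 28 - 0.6 * np.abs(lats)          # made-up temperatures, deg C

weights = np.cos(np.radians(lats))
weighted = np.average(zonal_temp, weights=weights)
unweighted = zonal_temp.mean()

print(f"area-weighted mean: {weighted:.1f} C, naive mean: {unweighted:.1f} C")
# The naive mean over-counts the cold high latitudes; uneven station coverage
# biases a "global" average in exactly the same way.
```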

There is another representativeness issue that is even more of a concern.  The location of the monitor is critical if we are to compare measurements at one location to another.  Sensors should not be unduly affected by their surroundings.  For example, it is inappropriate to put a temperature sensor next to an external source of heat like an air conditioning system.  Another issue is that building and paved areas retain heat more than rural areas in what is called the urban heat island.  Temperature sensors should also be a minimum distance away from trees. 

The National Weather Service and the World Meteorological Organization (WMO) both have standards and guides for siting instruments that address these concerns.  Finally note that the WMO has a classification system for measuring stations.  Ideally, the only sites used for the global average would be those that meet the most stringent WMO acceptability criteria.  Using sites that do not meet those criteria in a trend analysis means local factors other than greenhouse warming could be influencing the observed trend.

A final representativeness concern for trends is that siting standards should be constant over the period of record.  Consider that the longest-running measuring site in New York City is in Central Park.  The surroundings for that sensor have changed over time so there should not be high confidence that the warming trend observed there is caused solely by global greenhouse gas warming.

There is another long-term trend concern – measurement methodology.  The longest running instrumental temperature record is the Central England Temperature (CET) series.  The United Kingdom’s Met Office notes that “By collating and combining early instrumental records, the series charts monthly temperature statistics from 1659.”  Suffice to say that the temperature data collected for most of the record were observations of a thermometer, so this introduces human eyeball error.   

For locations where temperature records are still manually collected, observers note the maximum and minimum temperature recorded on an instrument that measures both values daily.  The first reliable max-min thermometer was invented in 1780 by James Six.  I do not know when the measurements used for the CET switched to this technology but the change in technique affects interpretation of the trend.  A bias can be introduced if the time of observation changes.  If observations are taken and the max-min thermometers are reset near the time of daily highs or lows, then an extreme event can affect two days and the resulting long-term averages.

Today many locations report temperature measured at locations with data acquisition computers.  Typically, those instrumental systems make observations every second, compute and save minute averages that are used to calculate and report hourly and daily averages.  Locations that have been measuring temperature for a long time may have started with manual observations and now use electronic observations.  This shift in methodology will affect the trend.

Trend Reporting

My focus in this article is the measurement of long-term temperature trends.  In the case of a daily average the issues described are small but cumulatively I believe are on the order of the observed trend.  However, unscrupulous advocates have been known to breathlessly report a new record temperature that they use to incite action.  For example, if a temperature sensor is improperly located so close to an airport runway that jet exhaust affects the temperature, and the maximum temperature reported is a one-minute average value, then the soundbite record temperature likely only represents the effect of jet exhaust.

I want to mention one final aspect of measuring programs that epitomizes an acceptable monitoring system.  There must be a quality assurance and quality control system in place.  Those programs include routine checks on the instruments and a verification process for the data itself.  For example, data verification was one of my responsibilities and I developed a program to evaluate data for potential problems.  If the observed wind direction data was constant for hours, the temperature was below freezing and there was precipitation, that indicated that freezing rain had frozen the wind vane in place.  I believe that climatological temperature reporting protocols include this step.  It is only when someone with a mission goes for the headline and unscrupulously reports data out of context that this can be a problem.
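The frozen wind vane check mentioned above can be expressed as a simple screening rule.  Here is a minimal sketch of that kind of automated check; the record field names and thresholds are hypothetical, and any real network would tune them:

```python
def flag_frozen_vane(records, direction_tolerance=1.0, min_hours=3):
    """Flag spans where a frozen wind vane is likely: near-constant direction
    while temperatures are below freezing and precipitation is falling.

    `records` is a list of hourly dicts with hypothetical keys:
    'wind_dir' (degrees), 'temp_c', and 'precip' (bool)."""
    runs, start = [], None
    for i in range(1, len(records)):
        prev, cur = records[i - 1], records[i]
        stuck = (abs(cur["wind_dir"] - prev["wind_dir"]) <= direction_tolerance
                 and cur["temp_c"] < 0.0 and cur["precip"])
        if stuck and start is None:
            start = i - 1
        elif not stuck and start is not None:
            if i - start >= min_hours:
                runs.append((start, i - 1))
            start = None
    if start is not None and len(records) - start >= min_hours:
        runs.append((start, len(records) - 1))
    return runs  # (first, last) hourly indices flagged for manual review
```

A screening pass like this only flags candidates; a human reviewer makes the final call, which is the verification step I refer to above.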

Conclusion

I prepared this article to highlight recognized instrumental and observational biases in the temperature measurements.  Individually the instrumental effects are small but cumulatively they can be on the same order as the trend.  The siting representativeness issues are a much bigger concern.  I have no doubts that the trends observed in many locations are primarily caused by increased urbanization and other local infrastructure changes affecting the measurements.

The Britannica website states, “the best estimate of the increase in global average surface temperature between 1850 and 2019 was 1.07 °C (1.9 °F).”  I believe that it is absurd to claim that level of precision given the issues I described.  Saying 1 °C (2 °F) is all you should say with any confidence but even that is low confidence in my opinion.  There is no question that there has been warming since the end of the Little Ice Age in 1850 but the amount and reason for the warming is debatable. 

Cherry Picking

I recently watched two videos related to climate change.  In Climate the Movie: The Cold Truth there is a very good description of historical temperature and CO2 trends.  In a Debate on Climate Alarmism Dr. Jordan Peterson and Steven Bonnell II also addressed the link between temperature and CO2.  This article explains why Bonnell’s rationale that we must reduce CO2 emissions to avert catastrophe includes an example of cherry picking: “when people choose data that supports their position and ignore evidence that they dislike.”

The rationale used for New York’s Climate Leadership & Community Protection Act (Climate Act) that reducing GHG emissions will affect climate is of special interest to me.  This example is a key component of that rationale.  I have followed the Climate Act since it was first proposed, submitted comments on the Climate Act implementation plan, and have written over 400 articles about New York’s net-zero transition. The opinions expressed in this post do not reflect the position of any of my previous employers or any other organization I have been associated with; these comments are mine alone.

Debate on Climate Alarmism

The video clip is only a portion of a longer discussion.  There were a couple of issues discussed that piqued my interest.  Peterson and Bonnell argued about the ethics of subjecting the world’s poor to hardships now in the hopes of preventing worse impacts in the future.  The subject of this post is their debate about climate model limitations and the historical record of temperature and CO2 emissions.

Bonnell supports the narrative that because the recent hottest years on record coincide with increases in GHG emissions, this correlation proves causation.  He argued that because we just had another one of the hottest years that must mean something.  Peterson responded that the hottest period depends on the time frame.  They argued about which time frames should be used.  I am going to address Bonnell’s claim that from the start of the industrial age temperatures have risen faster than in the past.  This is cherry picking because the start of the industrial age is just about the same as the end of the Little Ice Age.  I recently watched “Climate the Movie” and recalled that it included descriptions of temperature trends that contradicted this claim.

Climate the Movie: The Cold Truth

If you haven’t seen this video, then I strongly recommend that you do so before the thought police force it underground.  It does a superb job explaining the manufactured climate crisis, the biased science, and the implications of this misplaced allocation of resources to “solve” it.  In addition, it is a great resource of pragmatic responses to the mainstream narrative.  Andy May has provided a great addition to the documentary with his Annotated Bibliography for it.  He provides references and supporting information for the material that I found very useful when putting this together.

Global Warming Trends

The rationale for changing the world’s energy system away from fossil fuels is the alleged link between global temperature trends and CO2 and other greenhouse gas emissions.  Climate the Movie confronts the mainstream narrative in this segment of the video.  Historical temperature trends over the last 50 million years show that we are at the end of an ice age and activists are “saying it is too hot”.  The 5-million-year record shows a trend to lower temperatures accompanied by greater fluctuations.  Another graph covers the current ice age with lows during periods when the globe is covered in ice and slightly warmer periods when the glaciers are minimal.  The temperatures over the last 2,000 years are shown with the Roman Warm Period, the Dark Ages cold period, the Medieval Warm Period, and the Little Ice Age leading up to today.

In my opinion, the variations over the last 2,000 years are compelling evidence that natural climate variation is so large that any tweak from a change in the greenhouse gas effect is minimal.  If I thought that we understood this natural climate variation, then I would be more receptive to claims that climate model projections for the future are credible.

The documentary discusses the evidence that CO2 is a driver of climate change, which is the ultimate rationale for New York’s Climate Act and any other plan to transform the energy system.  Recall that cherry picking “ignores evidence” inconsistent with advocacy arguments.  Historically CO2 and temperature are correlated, but temperature increases before CO2 increases, completely contrary to the premise.  This inconsistency is surely the kind of ignored evidence that characterizes cherry picking.

The documentary also addresses Bonnell’s claim that the correlation of CO2 and temperature from the start of the industrial age is evidence that we can control the climate by limiting CO2 emissions.  This video segment compares recent CO2 emissions and temperature changes, but to rebut this claim Andy May’s Annotated Bibliography provides more persuasive documentation.

The Annotated Bibliography includes a section titled “From 1945 to 1976 the world cooled”.  It includes the following plot of global temperatures and carbon dioxide.  Bonnell believes that increasing temperatures since the end of the Little Ice Age are caused by increases in CO2.  This graph does not support that claim.  From 1850 to 1910 temperatures trend slightly down and CO2 trends slightly up.  From 1910 to 1944 there is little change in the CO2 trend but the temperature trends up markedly.  CO2 emissions don’t start to rise significantly until the end of World War II in 1945 but from 1944 to 1976 the global temperature trends down.  For the remaining two periods shown in the graph temperature and CO2 correlate well.

The following table lists the temperature trends (degrees C per century) for all five periods shown in the graph.  Bonnell’s claim that the correlation of CO2 and temperature from the start of the industrial age is proof that we can control the climate by limiting CO2 emissions is clearly contradicted by this information.  In the first place, CO2 cannot be a driver until emissions increase post-1944.  There is a good correlation between 1976 and the present, but two things have to be ignored for the rationale to be valid: temperature did not trend upwards until 32 years after the CO2 emissions increased significantly, and there was a similar increase in temperature from 1910 to 1944 as that observed since 1976.  I believe this shows that natural climate variation caused the 1910 to 1944 warming and I do not believe that anyone has proven that the same natural climate drivers are not affecting the recent warming.  I think you could even argue that the observed natural climate variation that caused the first warming of 1.4 deg C per century should be subtracted from the late 20th century warming of 1.8 deg C per century to put an upper bound on anthropogenic effects.  That means that CO2-induced warming could not be more than 0.4 deg C per century.  I do not think that represents catastrophic warming because it is much less than the interannual temperature variation observed.
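The bounding argument is simple arithmetic, sketched below with the trends quoted in the paragraph above; the subtraction assumes the early-century natural warming rate would repeat, which is the premise of the argument:

```python
# Bounding the anthropogenic contribution using the trends quoted above
# (deg C per century); the subtraction assumes the early-century natural
# warming would recur at the same rate, which is the argument's premise.
natural_1910_1944 = 1.4   # warming before CO2 emissions rose significantly
recent_1976_now = 1.8     # late-20th-century warming

co2_bound = recent_1976_now - natural_1910_1944
print(f"Upper bound on CO2-driven warming: {co2_bound:.1f} deg C per century")
```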

Discussion

Watts Up With That recently re-published an article by Francis Menton that addressed New York’s desperate attempts to cover up the inescapable fact that using currently available wind, solar, and energy storage technologies will not work.   At his blog and Watts Up With That there are many more comments than I see here.  One comment caught my eye.

Warren Beaton claimed that deniers have no credibility left:

They can cite no evidence or peer reviewed scientific sources that contradict anthropogenic global warming. They have no consistent scientific theory of the behavior of the climate system. It’s ‘every man for himself’ in the Denial Community.  They cherry pick data and argue illogically.

I replied to that comment “I think that the new video Climate the Movie – The Cold Truth contradicts just about everything that you say here.”  His comment is a great example of my Pragmatic Environmentalist of New York Principle 5: The more vociferous/louder the criticisms made by a stakeholder the more likely that the stakeholder is guilty of the same thing.

Consider his comments relative to this issue.  Andy May has provided extensive evidence including peer reviewed articles documenting the observed temperature and CO2 emissions trends.  The basic tenet of anthropogenic global warming believers like Beaton is that the correlation between CO2 and global warming evident since 1976 proves that CO2 is the control knob for climate.  Simple analysis shows that there is no correlation between 1850 and 1976 and there was a similar period of warming from 1910 to 1944 so that claim cannot be true.

The issue of no consistent scientific theory describes the unfortunate fact that we do not understand natural climate variability.  The warming since 1850 has been inconsistent and must include significant natural inputs but there is no agreement about those effects.  Until we understand natural drivers I cannot see any reason to place any faith in projections of climate out for hundreds of years.

Finally, the claim that deniers cherry pick data is ripe (sorry I could not resist the pun) for comment.  Bonnell simply repeats the mantra that since 1850 temperatures have gone up and GHG emissions have gone up so there must be a link.  I showed that to make that argument he had to cherry pick the data to support the claim.

Conclusion

Climate the Movie: The Cold Truth is a valuable resource to address the over-simplified theory of anthropogenic climate change due to greenhouse gas emissions.  In this example, the claim that recent record warmth has to be related to those emissions is not supported by the trends of warming and emissions since 1850.  That claim can only be justified by cherry picking data.

Articles of Note October 15, 2023

Sometimes I just don’t have time to put together an article about each of the relevant posts on the net-zero transition and climate change that I have read.  This is a summary of recent posts that I think would be of interest to my readers.

I have been following the Climate Leadership & Community Protection Act (Climate Act) since it was first proposed, and most of the articles described are related to it.  I have devoted a lot of time to the Climate Act because I believe the ambitions for a zero-emissions economy embodied in the Climate Act outstrip available renewable technology such that the net-zero transition will do more harm than good.  The opinions expressed in this article do not reflect the position of any of my previous employers or any other company I have been associated with; these comments are mine alone.

Climate Act Background

The Climate Act established a New York “Net Zero” target (85% reduction and 15% offset of emissions) by 2050.  It includes an interim 2030 reduction target of a 40% reduction by 2030 and a requirement that all electricity generated be “zero-emissions” by 2040. The Climate Action Council is responsible for preparing the Scoping Plan that outlines how to “achieve the State’s bold clean energy and climate agenda.”  In brief, that plan is to electrify everything possible and power the electric grid with zero-emissions generating resources.  The Integration Analysis prepared by the New York State Energy Research and Development Authority (NYSERDA) and its consultants quantifies the impact of the electrification strategies.  That material was used to write a Draft Scoping Plan.  After a year-long review the Scoping Plan recommendations were finalized at the end of 2022.  In 2023 the Scoping Plan recommendations are supposed to be implemented through regulation and legislation. 

Videos of Note

For those of you who would rather watch a video than read about a topic I list a few interesting videos.  This video describes historic global temperatures and how ancient temperatures are estimated.  I think it does a good job describing a complicated subject.

This interview of Judith Curry by John Stossel is a good overview of the climate science hype.

Offshore Wind Costs

Renewable developments are struggling due to soaring interest rates and rising equipment and labor costs.  Reuters describes two “procured” projects that have been cancelled:

On Monday, Avangrid (AGR.N), a U.S. subsidiary of Spanish energy firm Iberdrola (IBE.MC), said it filed agreements with power companies in Connecticut to cancel power purchase agreements for Avangrid’s proposed Park City offshore wind project.

“One year ago, Avangrid was the first offshore wind developer in the United States to make public the unprecedented economic headwinds facing the industry,” Avangrid said in a release.

Those headwinds include “record inflation, supply chain disruptions, and sharp interest rate hikes, the aggregate impact of which rendered the Park City Wind project unfinanceable under its existing contracts,” Avangrid said.

Avangrid has said it planned to rebid the Park City project in future offshore wind solicitations.

Also over the past week, utility regulators in Massachusetts approved a proposal by SouthCoast Wind, another offshore wind developer, to pay local power companies a total of around $60 million to terminate contracts to provide about 1,200 MW of power.

Rich Ellenbogen described in an email how the offshore wind market is broken all over the world.  First he mentioned the Avangrid buyout of its contractual obligations.  He also pointed out that at a recent UK wind auction there were no bidders because the maximum selling price for the electricity was not high enough to justify the investment.  UK installation costs have risen by about 40% and the UK government did not factor that into the allowable costs. He explains:

The article states that the wholesale price of electricity in the UK is £80/Megawatt hour (MWh).  With an exchange rate of $1.23 per pound-sterling, that equates to $98.40 per MWh.  The article also states that they would need £60 per MWh to make the wind farms profitable, or $73.80 per MWh.  However, according to this link, “the wholesale price for electricity in NY State in calendar year 2023 has increased from $24.57/MWh to $42.97/MWh over the last year,” 47% lower than the wholesale cost in the UK and 72% lower than what the wind installers say that they need to be profitable.

If the Wind installers can get $73.80/MWh installing wind farms in the UK but they can only get $42.97/MWh installing Wind farms here, 42% less,  while also having no ships to do the installation because of the Jones Act, where do you think that they will install the wind farms?  This is a global market.

The other way to look at this is that the energy from Offshore Wind will cost 72% more than what the ratepayers of NY State are currently paying. This is not a good economic model for the NY State rate payers.  72% increases are well outside of what surveys have said the public will tolerate.  Coupled with a 15% increase in delivery costs from the utilities, the number of ratepayers in arrears (currently 1.2 million ratepayers owing $1.8 billion) will greatly increase, making NY State even less affordable than it already is.

In New York, on October 12, 2023, the Public Service Commission turned down a request to address the same cost issues.  Times Union writer Rick Karlin summarizes:

At issue was a request in June by ACE NY, as well as Empire Offshore Wind LLC, Beacon Wind LLC, and Sunrise Wind LLC, which are putting up the offshore wind tower farms.

All told, the request, which was in the form of a filing before the PSC, represented four offshore wind projects totaling 4.2 gigawatts of power, five land-based wind farms worth 7.5 gigawatts and 81 large solar arrays.

All of these projects are underway but not completed. They have already been selected and are under contract with the New York State Energy Research and Development Authority, or NYSERDA, to help New York transition to a clean power grid, as called for in the Climate Leadership and Community Protection Act, approved by the state Legislature and signed into law in 2019.

Developer response suggests that “a number of planned projects will now be canceled, and their developers will try to rebid for a higher price at a later date — which will lead to delays in ushering in an era of green energy in New York.”  Karlin also quotes Fred Zalcman, director of the New York Offshore Wind Alliance: “Today’s PSC decision denying relief to the portfolio of contracted offshore wind projects puts these projects in serious jeopardy.”

Francis Menton did an overview of the status of offshore wind projects that summarizes all the issues confronting offshore wind development.

Renewable Costs

Francis Menton also did an overview of renewable costs.

Another article in the Telegraph also addresses green energy costs.

Weather and Climate

The September edition of Climate Fact Check debunks ten bogus climate claims from last month.  There is a description of the analysis here.

Electric Vehicles

Electric van maker on verge of bankruptcy

EV owners facing soaring insurance costs

How to Publish a High-Profile Climate Change Research Paper

Regular readers of this blog have noticed that there aren’t many articles in high-profile journals that suggest there are any issues with the narrative that climate change impacts are pervasive and catastrophic. Patrick T. Brown explains that “There is a formula for publishing climate change impacts research in the most prestigious and widely-read scientific journals. Following it brings professional success, but it comes at a cost to society.”  His formula explains part of the reason we see so little skeptical research in those journals.

The biggest topic on this blog is climate change and the proposed greenhouse gas emission reduction solutions.  From what I have seen, the pressure to conform to the narrative described here is immense, and my readers should keep it in mind.  The opinions expressed in this post do not reflect the position of any of my previous employers or any other company I have been associated with, these comments are mine alone.

Background

Patrick T. Brown is a Ph.D. climate scientist. He is a Co-Director of the Climate and Energy Team at The Breakthrough Institute and is an adjunct faculty member (lecturer) in the Energy Policy and Climate Program at Johns Hopkins University. 

This month, he published a lead-author research paper in Nature on changes in extreme wildfire behavior under climate change. This is his third publication in Nature to go along with another in Nature’s climate-focused journal Nature Climate Change. He notes that “because Nature is one of the world’s most prestigious and visible scientific journals, getting published there is highly competitive, and it can significantly advance a researcher’s career.” 

His article is based on this publication experience, as well as various failures to get research published in these journals.  He explains:

I have learned that there is a formula for success which I enumerate below in a four-item checklist. Unfortunately, the formula is more about shaping your research in specific ways to support pre-approved narratives than it is about generating useful knowledge for society.

Formula for Publishing Climate Change Impacts Research

Before describing his approach to getting research published, he describes what is needed for useful scientific research.  He says:

It should prize curiosity, dispassionate objectivity, commitment to uncovering the truth, and practicality. However, scientific research is carried out by people, and people tend to subconsciously prioritize more immediate personal goals tied to meaning, status, and professional advancement. Aligning the personal incentives that researchers face with the production of the most valuable information for society is critical for the public to get what it deserves from the research that they largely fund, but the current reality falls far short of this ideal.

Brown explains that a “publish or perish” mentality governs academic research.  In addition, it also matters “which journals you publish in.”  It turns out a “researcher’s career depends on their work being widely known and perceived as important.”  Because there is so much competition, it has become even more important to publish in the most highly regarded journals: “while there has always been a tremendous premium placed on publishing in the most high-profile scientific journals – namely Nature and its rival Science – this has never been more true.”  As a result, “savvy researchers will tailor their studies to maximize their likelihood of being accepted.”  In his article he explains just how he did it.

First, he offers general advice:

My overarching advice for getting climate change impacts research published in a high-profile journal is to make sure that it supports the mainstream narrative that climate change impacts are pervasive and catastrophic, and the primary way to deal with them is not through practical adaptation measures but through policies that reduce greenhouse gas emissions. Specifically, the paper should try to check at least four boxes.

The first box to check is that showing “climate change impacts something of value is usually sufficient, and it is not typically necessary to show that the impact is large compared to other relevant influences.”  In order to do this there are tradeoffs:

In my recent Nature paper, we focused on the influence of climate change on extreme wildfire behavior but did not bother to quantify the influence of other obviously relevant factors like changes in human ignitions or the effect of poor forest management. I knew that considering these factors would make for a more realistic and useful analysis, but I also knew that it would muddy the waters and thus make the research more difficult to publish.

This type of framing, where the influence of climate change is unrealistically considered in isolation, is the norm for high-profile research papers. For example, in another recent influential Nature paper, they calculated that the two largest climate change impacts on society are deaths related to extreme heat and damage to agriculture. However, that paper does not mention that climate change is not the dominant driver for either one of these impacts: temperature-related deaths have been declining, and agricultural yields have been increasing for decades despite climate change.

The second box is to avoid discussion of anything that could reduce the impact of climate change:

This brings me to the second component of the formula, which is to ignore or at least downplay near-term practical actions that can negate the impact of climate change. If deaths related to outdoor temperatures are decreasing and agricultural yields are increasing, then it stands to reason that we can overcome some major negative effects of climate change. It is then valuable to study how we have been able to achieve success so that we can facilitate more of it. However, there is a strong taboo against studying or even mentioning successes since they are thought to undermine the motivation for emissions reductions. Identifying and focusing on problems rather than studying the effectiveness of solutions makes for more compelling abstracts that can be turned into headlines, but it is a major reason why high-profile research is not as useful to society as it could be.

His third component is to focus the presentation on alarm:

A third element of a high-profile climate change research paper is to focus on metrics that are not necessarily the most illuminating or relevant but rather are specifically designed to generate impressive numbers. In the case of our paper, we followed the common convention of focusing on changes in the risk of extreme wildfire events rather than simpler and more intuitive metrics like changes in the amount of acres burned. The sacrifice of clarity for the sake of more impressive numbers was probably necessary for it to get into Nature.

Another related convention, which we also followed in our paper, is to report results corresponding to time periods that are not necessarily relevant to society but, again, get you the large numbers that justify the importance of your research. For example, it is standard practice to report societal climate change impacts associated with how much warming has occurred since the industrial revolution but to ignore or “hold constant” societal changes over that time. This makes little sense from a practical standpoint since societal changes have been much larger than climate changes since the 1800s. Similarly, it is conventional to report projections associated with distant future warming scenarios now thought to be implausible while ignoring potential changes in technology and resilience.

The good news is that Brown has transitioned out of a tenure-track academic position to one that does not require high-impact publications.  He explains a more useful approach than the one required to publish there:

A much more useful analysis for informing adaptation decisions would focus on changes in climate from the recent past that living people have actually experienced to the foreseeable future – the next several decades – while accounting for changes in technology and resilience. In the case of my recent Nature paper, this would mean considering the impact of climate change in conjunction with proposed reforms to forest management practices over the next several decades (research we are conducting now). This more practical kind of analysis is discouraged, however, because looking at changes in impacts over shorter time periods and in the context of other relevant factors reduces the calculated magnitude of the impact of climate change, and thus it appears to weaken the case for greenhouse gas emissions reductions. 

The final key to publication is presentation:

The final and perhaps most insidious element of producing a high-profile scientific research paper has to do with the clean, concise format of the presentation. These papers are required to be short, with only a few graphics, and thus there is little room for discussion of complicating factors or contradictory evidence. Furthermore, such discussions will weaken the argument that the findings deserve the high-profile venue. This incentivizes researchers to assemble and promote only the strongest evidence in favor of the case they are making. The data may be messy and contradictory, but that messiness has to be downplayed and the data shoehorned into a neat compelling story. This encouragement of confirmation bias is, of course, completely contradictory to the spirit of objective truth-seeking that many imagine animates the scientific enterprise.

Brown explains that despite the allowances he had to make to get his work published there still is value in it:

All this is not to say that I think my recent Nature paper is useless. On the contrary, I do think it advances our understanding of climate change’s role in day-to-day wildfire behavior. It’s just that the process of customizing the research for a high-profile journal caused it to be less useful than it could have been. I am now conducting the version of this research that I believe adds much more practical value for real-world decisions. This entails using more straightforward metrics over more relevant timeframes to quantify the impact of climate change on wildfire behavior in the context of other important influences like changes in human ignition patterns and changes in forest management practices.

Brown explains his motivations for this post and his new plans:

But why did I follow the formula for producing a high-profile scientific research paper if I don’t believe it creates the most useful knowledge for society? I did it because I began this research as a new assistant professor facing pressure to establish myself in a new field and to maximize my prospects of securing respect from my peers, future funding, tenure, and ultimately a successful career. When I had previously attempted to deviate from the formula I outlined here, my papers were promptly rejected out of hand by the editors of high-profile journals without even going to peer review. Thus, I sacrificed value added for society in order for the research to be compatible with the preferred narratives of the editors.

I have now transitioned out of a tenure-track academic position, and I feel liberated to direct my research toward questions that I think are more useful for society, even if they won’t make for clean stories that are published in high-profile venues. Stepping outside of the academy also removes the reservations I had to call out the perverse incentives facing scientific researchers because I no longer have to worry about the possibility of burning bridges and ruining my chances of ever publishing in a Nature journal again.

Brown concludes:

So what can shift the research landscape towards a more honest and useful treatment of climate change impacts? A good place to start would be for the editors of high-profile scientific journals to widen the scope of what is eligible for their stamp of approval and embrace their ostensible policies that encourage out-of-the-box thinking that challenges conventional wisdom. If they can open the door to research that places the impacts of climate change in the appropriate context, uses the most relevant metrics, gives serious treatment to societal changes in resilience, and is more honest about contradictory evidence, a wider array of valuable research will be published, and the career goals of researchers will be better aligned with the production of the most useful decision support for society.

My Conclusion

It is no wonder that all we hear from greenhouse gas emission reduction advocates is that climate change is an existential threat because the “science” says so.  Peeking around the curtain shows that the “science” has been perverted to reinforce and maintain this narrative.  I applaud Brown for giving insight into the way this is done.

This sums up a primary motivator for my work on this blog. New York’s planned transition to a net-zero economy is a solution to a non-existent problem.  I have shown that New York GHG emissions are less than one half of one percent of global emissions and that global emissions have been increasing on average by more than one half of one percent per year since 1990, so even if there were a problem our actions could not make a difference.  Worse, the so-called solution has enormous reliability risks, eye-watering costs, and inadequately evaluated environmental impacts.  There are no redeeming virtues to New York’s net-zero transition plan.

Articles of Note Relevant to the Climate Act

I have a “to-do” list of posts and analyses that I want to do.  Some items on the list are over a month old.  Rather than adding to that list, this post briefly describes articles that caught my attention.

I have been following the Climate Leadership & Community Protection Act (Climate Act) since it was first proposed and most of this blog’s articles are related to it. I have devoted a lot of time to the Climate Act because I believe the ambitions for a zero-emissions economy embodied in the Climate Act outstrip available renewable technology such that the net-zero transition will do more harm than good.  The opinions expressed in this article do not reflect the position of any of my previous employers or any other company I have been associated with, these comments are mine alone.

Weather and Climate

It seems that every day there is at least one article claiming that a recent extreme weather event is related to climate change caused by humans.  Roger Pielke, Jr. was prompted to write a post after the Lahaina, Maui fire disaster was linked to climate change.  His post Signal and Noise addresses this issue and includes the following lessons:

Just because the signal of climate change for particular variables cannot (yet) be detected in the context of historical variability does not mean that climate change is not real or important, and in many, if not most cases, a lack of signal is to be expected.

Natural variability is real and significant. It does not mean that climate change is not real or important, but that detecting signals is often difficult even when climate is changing and there is always a risk of erroneously detecting signals where none is present.

He concludes:

The challenges of detection and attribution should tell us that both adaptation and mitigation policies must be built upon a foundation that involves justifications for action that are much broader than climate change alone.

So far, climate advocates have sought to shape perceptions of science to support a climate-change-is-everything agenda. We will have a lot more success if we instead shape policy to align with what science actually says.

My pragmatic take concerns the tradeoff between the resources devoted to climate change mitigation relative to extreme weather adaptation.  I was involved with emergency planning at a nuclear plant, so I have experience with this kind of planning.  From what I have seen, the emergency planning for Lahaina relative to an identified wildfire threat was criminally negligent.  Blaming climate change deflects blame from the guilty parties.  Trying to mitigate climate change displaces resources that would be better employed to address existing weather problems.

I wrote the preceding paragraph before I read that Bjorn Lomborg said that politicians were blaming climate change for disasters like the wildfires on Maui to duck “responsibility” for “failures” in addressing them.  It is reassuring that my thoughts agree with his.

One final note about the Maui fires.  Professor Cliff Mass did an excellent job explaining what really happened: a high-amplitude atmospheric wave forced by strong winds interacting with the mountains of northwest Maui.  He explains that:

It did not matter whether the grass or light vegetation were wet or dry the days or weeks before:  this extraordinary atmospheric animal would ensure they were dry enough to burn.   Prior dry conditions during the weeks before were immaterial.

With respect to the current state of the climate, Judith Curry, Jim Johnstone, and Mark Jelinek presented a “deep dive into the causes of the unusual weather/climate during 2023.  People are blaming fossil-fueled warming and El Nino, and now the Hunga-Tonga eruption and the change in ship fuels.  But the real story is more complicated.”  They conclude, among other things:

The exceptionally warm global temperature in 2023 is part of a trend of warming since 2015 that is associated primarily with greater absorption of solar radiation in the earth-atmosphere system.  This increase in absorbed solar radiation is driven by a slow decline in springtime snow extent, but primarily by a reduction in reflection from the atmosphere driven by reduced cloudiness and to a lesser extent a reduction in atmospheric aerosol.  Any increase in the greenhouse effect from increasing CO2 (which impacts the longwave radiation budget) is lost in the noise.

Climate Emergency

Alex Epstein writes that it would be inappropriate for the Biden Administration to declare a Climate Emergency.  He argues that there is no emergency because rising CO2 levels are:

  1. Not dire: Humans are safer from climate than ever.
  2. Not temporary: They will rise for decades.
  3. Not in our control: We emit 1/7 of CO2—and falling.

He makes the point that a government “emergency” declaration is a temporary increase in power that should only be used if a problem meets three criteria:

  • Dire: Unusually deadly
  • Temporary: Of limited duration
  • In our control: Actually solvable by our government

He concludes that none of the conditions necessary to declare a climate emergency have been met and goes on to support his arguments in detail.

Climate Act and Electric Vehicles

The Climate Act is a political animal.  While I focus primarily on the environmental and energy-related issues associated with GHG emission reductions, there is a social justice aka “Green New Deal” component that is a primary interest of many of the Act’s proponents.  Advocates demand zero GHG emissions, which will markedly increase energy prices and put electric reliability at risk, with the impacts falling hardest on those least able to afford them, while at the same time demanding investments in disadvantaged communities.  That contradiction has always seemed incongruous to me.  Nowhere is this tradeoff more stark than in the push for electric vehicles.

The CalMatters post Will California’s push on electric vehicles reduce inequality — or deepen it? touched on the issues that concern me.  The post described a CalMatters panel discussion that addressed the question whether California can make sure the electric vehicle revolution isn’t just for the wealthy few. 

The post noted:

While bringing down the cost of EVs is crucial, so is the availability of chargers. And that is something of a chicken-and-egg proposition.

Some on the panel — moderated by CalMatters’ climate reporter Alejandro Lazo — called for building out the charging infrastructure in disadvantaged communities in advance, especially residential chargers.

  • Steve Douglas, vice president of energy and environment for the Alliance for Automotive Innovation: “You can’t ask low-income residents to spend an hour, three hours, six hours away from their families, every week, just to charge their car, while affluent people pull in, plug in and wake up to a full car.”

But others said without enough EV owners in a neighborhood, it’s a recipe for vandalism and disuse. 

  • Ted Lamm, senior research fellow in the climate program at UC Berkeley’s Center for Law, Energy, & the Environment: “When charging is installed in an area where there is no demand for the vehicles and no local desire to use them, it’s this sort of dead infrastructure. It has no use to the local population and local community, and so it is more likely to be subjected to vandalism, or just disuse and disrepair.”

Montana Court Climate Decision

A group of young people in Montana won a landmark lawsuit on August 14, when a judge ruled as unconstitutional the state’s failure to consider climate change when approving fossil fuel projects. While this has been hailed as a turning point by the usual suspects, the reality is different.

David Wojick explained why the court decision was not a big deal.  He writes:

Much ado is being made from the supposed win of a kid’s climate lawsuit in Montana. The alarmists call it a victory, the skeptics a tragedy, but it is neither. What was won is almost funny, while the big ask was in fact denied. The climate kids won a little, but lost a lot.

On the win side the judge merely ruled that the Montana law forbidding consideration of GHG emissions in permitting was unconstitutional. How it is considered is up to the agency or legislature. This need not slow down or stop any project.

The Montana constitution says there is a right to a healthful environment. Alarmism says emissions are harmful which all Courts to date have bought, including this one. So given the possible harm, one cannot simply ignore emissions which the law said to do. Hence the decision to kill the law.

Gregory Wrightstone debunked the claim that Montana is a “major emitter of greenhouse gas emissions in the world” and the state’s emissions “have been proven to be a substantial factor” in affecting the climate.   He explains:

Montana’s CO2 emissions are 0.6% of the total U.S. emissions. If Montana had gone to zero emissions of CO2 in 2010, it would only avert 0.0004 degree Fahrenheit of greenhouse warming by 2050 and 0.001 degree by 2100, according to the MAGICC simulator, a tool created by a consortium of climate research institutes including the National Center for Atmospheric Research. These numbers are far below our ability to even measure and certainly not the “substantial factor” as claimed.

New York’s emissions are a greater proportion of total U.S. emissions, but I have found that their effect is also too small to measure and not a “substantial factor”.

The Montana Attorney General’s office considered arguing against the plaintiff’s witnesses about the alleged harms of climate change.  They retained Dr. Judith Curry to prepare evidence but ended up not using it.  She explained the inside story, her written expert report, and why she was not asked to testify at the trial.  I found it fascinating and there is plenty of ammunition included to debunk many of the arguments used by proponents of the net-zero transition.  This will be useful when the inevitable lawsuit is filed in New York.

July Climate Alarmism

It seems that every day we are faced with another claim that we are facing an existential threat from climate change and that the proof is right in front of us.  So simple, so obvious, and so wrong.  I do not have time to do my own analysis so I am going to use the work of others to rebut the fear-mongering stories about events tied to climate change in July.

July was the Hottest Month Ever

The story that July was the hottest month in 120,000 years is the best example of the media glomming on to a story that does not stand up to scrutiny.  A post at Watts Up With That explains:

From CLIMATE DEPOT

Via The Australian: Cliff Mass, professor of Atmospheric Sciences at University of Washington, said the public was being “misinformed on a massive scale”: “It‘s terrible. I think it’s a disaster. There’s a stunning amount of exaggeration and hype of extreme weather and heatwaves, and it’s very counter-productive,” he told The Australian in an interview. “I’m not a contrarian. I‘m pretty mainstream in a very large [academic] department, and I think most of these claims are unfounded and problematic”. …

Professor Mass said the climate was “radically warmer” around 1000 years ago during what’s known as the Medieval Warm Period, when agriculture thrived in parts of now ice-covered Greenland. “If you really go back far enough there were swamps near the North Pole, and the other thing to keep in mind is that we‘re coming out of a cold period, a Little Ice Age from roughly 1600 to 1850”.


John Christy, a professor of Atmospheric Sciences at the University of Alabama at Huntsville, said heatwaves in the first half of the 20th century were at least as intense as those of more recent decades based on consistent, long-term weather stations going back over a century. “I haven‘t seen anything yet this summer that’s an all-time record for these long-term stations, 1936 still holds by far the record for the most number of stations with the hottest-ever temperatures,” he told The Australian, referring to the year of a great heatwave in North America that killed thousands. 

Professor Christy said an explosion of the number of weather stations in the US and around the world had made historical comparisons difficult because some stations only went back a few years; meanwhile, creeping urbanization had subjected existing weather stations to additional heat. “In Houston, for example, in the centre it is now between 6 and 9 degrees Fahrenheit warmer than the surrounding countryside,” he explained in an interview with The Australian.

Professor Christy, conceding a slight warming trend over the last 45 years, said July could be the warmest month on record based on global temperatures measured by satellites – “just edging out 1998” – but such measures only went back to 1979.

Phoenix Heat Wave

Phoenix, Arizona had a streak of 31 days when the high temperature was 110 degrees or higher.  The article “Explaining The Heat Wave: Separating Weather From Climate Change” claims that recent warming trends in Phoenix are due primarily to increasing atmospheric carbon dioxide levels. However, this is false: data show that the high levels of warming, especially at night and as measured at an airport, are primarily due to urbanization over time, with the modest warming of the past hundred-plus years playing a very small part in comparison.  Another rebuttal notes:

Deadly Summer in the Southwest

Kip Hansen addresses the story “A Deadly Summer for Hikers in the Southwest,” which reports that “at least seven heat-related deaths are suspected in state and national parks during a record-breaking heat wave.”

He explains:

“But, it must be climate change, look how hot it was!”  My dear readers, that’s why they named it Death Valley.  The Monthly Report from the U.S. National Weather Service for the Death Valley station shows that every day during July this year, the average daily temperature (Daily Maximum + Daily Minimum divided by 2) was in excess of 100 °F (37.7 °C).  That’s the average!  The daily highs were above 110 °F (43 °C) every single day, and above 120 °F (49 °C) on twenty of the days.

Is this unusual?  Is this “extreme”? No, the U.S. National Park Service reports on the typical weather in Death Valley: “Death Valley is famous as the hottest place on earth and driest place in North America. The world record highest air temperature of 134°F (57°C) was recorded at Furnace Creek on July 10, 1913. [ emphasis mine – kh ] Summer temperatures often top 120°F (49°C) in the shade with overnight lows dipping into the 90s°F (mid-30s°C.) Average rainfall is less than 2 inches (5 cm), a fraction of what most deserts receive. Occasional thunderstorms, especially in late summer, can cause flash floods.”  All of those conditions, except the record high temperature of 1913, occurred this summer in Death Valley, just as the National Park Service advised visitors to expect.  There was not any extreme weather, it was usual weather for Death Valley.

Climate Fact Check

If you want short rebuttal summaries to these and other false climate change stories for July check out this fact check report.  It covers the following stories: monthly average temperature is the hottest, the UN proclamation that we are in an era of global boiling, the hottest day in 125,000 years, Atlantic current to collapse by 2025, record for hot days in Phoenix, hottest day in Death Valley, emissions causing hot oceans, hottest seawater ever, and more. 

Heat Health Impacts

The rationale for alarm in the excessive heat stories is the argument that heat results in more deaths than any other weather-related event.  Five years ago I explained why there are analyses that find “most of the temperature-related mortality burden was attributable to the contribution of cold”.  Studies that show that extreme heat results in more deaths than any other weather-related event use a database that only includes direct deaths.  An epidemiological study that does include indirect deaths concludes most deaths are associated with moderate cold weather.  Roger Pielke Jr. reports how this information can be presented to support the alarmist version:

The Lancet was caught red-handed publishing a figure that, to be as fair as possible, lent itself to misinterpretation (it was first called to my attention by Bjorn Lomborg and is in a paper by Masselot et al. 2023).

Take a look and decide for yourself. Here is the original figure comparing mortality from cold (blue) and heat (orange) in Europe from 2000-2019.

And here is how it looks when the data is graphed using a consistent scale.

Another Example of Propaganda

We have all seen the graphs that show inexorable global warming.  However, this article describes how “alarmist scientists have scared the bejesus out of people by turning a very small temperature change into a monster.”  Jim Steele writes:

Dr. Lindzen graphed the average seasonal anomalies for each weather station in the BEST temperature data base from 1900 to the present. A station’s anomaly is defined as any deviation from its 30-year mean. The results are not very scary. On any given day about half the weather stations experience warm anomalies while half experience cooling anomalies.

Most anomalies cluster between ± 4°C (+/- 7.2°F) causing each data point to merge into the thick black band of the graph. Still, larger anomalies are not uncommon, so the y-axis of the above graph scales between ± 12°C (+/- 21.6°F). The yellow dots represent the average for those anomalies on any given day. We see a small trend that is relatively tiny compared to the variation in actual temperatures. Not very scary either.

So, the showtime graphs isolate the average anomalies from reality, as done in the bottom graph. Now the scale on the y-axis only spans from -0.8°C (-1.4°F) to 1.2°C (2.2°F), turning a small 1°C (1.8°F) rise over 120 years into the illusion of a monster increase. That allows click-bait media, alarmist scientists and politicians to claim that climate change could lead to mass extinctions.

Reporting Issues Influence Results

Roger Pielke Jr. is an expert on the topic of global disaster accounting.  He recently posted an article that makes two points relevant to this post:

Below is the updated time series of global hydrological, climatological and meteorological disasters in the EM-DAT database, along with the linear trend, over the period 2000 to 2022.

You can see that there is no upwards trend. This lack of trend has not been reported by anyone in the legacy media (and I would be happy to be corrected). However, the completely false notion that global weather and climate disasters have increased and will continue to increase is commonly reported in the legacy media, buoyed by the promotion of false information by organizations that include the United Nations. In 2020 the U.N. claimed falsely of a “staggering rise in climate-related disasters over the last twenty years.”

The second point he makes is that careful examination of the disaster data clearly shows that “the increase in disasters in its database to 2000 is due to better reporting, and not changes in underlying counts of actual disasters.”  He concludes: “Regardless what happens with trends in disaster counts, it is absolutely essential to remember that if you are looking for a signal of changes in climate — always look directly at weather and climate data, not data on economic or human impacts.”

Conclusion

There is a constant barrage of doom-and-gloom articles connecting any extreme weather event or disaster to the existential threat of climate change.  In my opinion they all are more propaganda than unbiased reporting.  Every time I have checked a claim attributing a weather event to climate change on my own, I have found that the issue is more complex and less threatening than portrayed.  Don’t get scared by these stories!

Syracuse Post-Standard Climate Change Opinions

On July 2, 2023 the Syracuse Post-Standard published my letter to the editor, Expert’s view of solar energy’s potential in NY is far too sunny, which responded to an earlier commentary, Five Reasons New Yorkers Should Embrace a Solar Energy Future, by Richard Perez, Ph.D.  I appreciated the fact that they published my rebuttal, but I found it interesting that the following week there were three guest opinions that also deserve rebuttals.  Given that there are limitations on how often I can get letters published, I will have to settle for commentary here.

New York’s response to climate change is the Climate Leadership & Community Protection Act (Climate Act).  I have been following the Climate Act since it was first proposed, submitted comments on the Climate Act implementation plan, and have written over 300 articles about New York’s net-zero transition.  I have devoted a lot of time to the Climate Act because I believe the ambitions for a zero-emissions economy embodied in the Climate Act outstrip available renewable technology such that the net-zero transition will do more harm than good.  The opinions expressed in this post do not reflect the position of any of my previous employers or any other company I have been associated with, these comments are mine alone.

Climate Act Background

The Climate Act established a New York “Net Zero” target (85% reduction and 15% offset of emissions) by 2050 and an interim target of a 40% reduction by 2030. The Climate Action Council is responsible for preparing the Scoping Plan that outlines how to “achieve the State’s bold clean energy and climate agenda.”  In brief, that plan is to electrify everything possible and power the electric grid with zero-emissions generating resources by 2040.  The Integration Analysis prepared by the New York State Energy Research and Development Authority (NYSERDA) and its consultants quantifies the impact of the electrification strategies.  That material was used to write a Draft Scoping Plan.  After a year-long review the Scoping Plan recommendations were finalized at the end of 2022.  In 2023 the Scoping Plan recommendations are supposed to be implemented through regulation and legislation.

The three commentaries described here all claim that more action is needed because of the climate crisis.  All three overestimate the impacts and underestimate the challenges.  All three authors have vested interests in their narratives that I believe go beyond environmental concerns.  I describe the commentaries below.

Climate Change is Here

The first page of the editorial section of the Sunday Post-Standard led with a guest opinion, Climate change is here in CNY – We can do something about it.  The author was Katelyn M. Kriesel, who is a socially responsible financial advisor, a town councilor for the town of Manlius, chair of Sustainable Manlius, and a candidate for Congress.  She opined that the wildfire smoke was an indicator of climate change:

The Canadian wildfires are not normal. More than 11 million acres have burned or are on fire, decimating forests, killing wildlife and threatening homes. This is due to record drought, shifting weather patterns, and a changing climate.

What’s to stop it from happening here? If you think the smoke was bad, wait until we have our own wildfires.

Her arguments that the weather is getting worse around here rely entirely on anecdotal evidence that does not stand up to examination.  For example, she ignores similar poor air quality events from wildfires during the Little Ice Age 200 years ago when she claims that the wildfire smoke is due to a changing climate.

She goes on to provide an oversimplified explanation of the greenhouse effect and claims that ignoring the emissions will lead to catastrophe: “As our planet gets warmer, weather patterns change, causing extreme temperatures, droughts and floods. As this continues, climate change worsens.”  I have no doubt that she believes that “The only solution is to decrease carbon and methane emissions” and that the personal actions she advocates are necessary. 

I also have no doubt that no one could convince her otherwise.  Not even Dr. Bjorn Lomborg, who was named one of TIME magazine’s 100 most influential people in the world.  His latest book is entitled “False Alarm: How Climate Change Panic Costs Us Trillions, Hurts the Poor, and Fails to Fix the Planet”.  In it he refutes all the points made in this commentary.  I recommended his book three years ago and reiterate that recommendation now.

EV Infrastructure

The other article featured on the front page of the Sunday Post-Standard was titled NY’s economic future requires robust, reliable EV infrastructure.  Mark Lichtenstein described his belief that electric vehicle (EV) infrastructure is necessary: “If we delay, we risk falling short during this critical time to strengthen our economy, attract a talented workforce, improve our environment, and lead New York’s advance into a clean energy future.”  He is executive operating and chief sustainability officer at the SUNY College of Environmental Science and Forestry, in Syracuse.

He gave an overview of EVs and argued that recent growth in EV registrations portends future success.  Notably, the numbers he presented lacked context.  For example, “New York is leading the way — as one of the top five states for EV registrations — with just over 139,100 EVs as of this April” sounds great, but not mentioned is that this is less than one half of one percent of total registrations.

The point of his commentary was that New York must do more to encourage the transition.  He listed “key pieces to the puzzle” that need to be addressed:

  1. Will our electric generation also be climate-friendly?
  2. Can our electric distribution infrastructure handle the increase in demand?
  3. How and where will we charge these new EVs?
  4. Can we improve the speed and convenience of chargers? and
  5. Will we effectively address any associated environmental concerns related to the materials needed to construct EVs, as well as the safe disposal of components?

He argued that these issues need to be resolved:

The demand for this enhanced effort is immediate, as Central New York is currently poised for a significant transformation. It must happen now. Consider that Micron is bringing nearly 50,000 jobs and a host of supplier businesses to the region over the next two decades. This requires an infrastructure that can support a massive new amount of electrified passenger vehicles, as well as the medium- and heavy-duty trucks expected to make up an increasingly large share of the EV fleet.

If we delay, we risk falling short during this critical time to strengthen our economy, attract a talented workforce, improve our environment, and lead New York’s advance into a clean energy future.

Personally, I don’t think that the EV transition will strengthen our local economy because the significant costs necessary to support it will divert money away from our economy.  No one is claiming that the vehicles, batteries, and charging infrastructure will be constructed here, so all that money will go elsewhere.  I also doubt that EV infrastructure will be a significant factor in attracting a talented workforce.

Affordable Housing and Climate Crises

There was a third related commentary on page 4: Affordable housing & climate crises present opportunity for CNY to lead.  The author of this commentary was Dara Kovel, who is CEO of Beacon Communities.  That Boston-based organization claims to be an industry leader in affordable and mixed-income housing development.

She argued that: “The twin challenges of expanding access to affordable housing and combating climate change present a unique opportunity that New York can’t afford to let slip away.”  The commentary was little more than an advertisement for the New York State Energy Research and Development Authority’s (NYSERDA) Carbon Neutral Portfolio Support program that is “working with real estate owners, developers and manufacturers who are willing to take the lead in designing, building and operating low-carbon and carbon-neutral buildings through its Commercial New Construction Program”. 

She argues that this renovation should include existing public housing developments and describes the state program.  She explains:

My company, Beacon Communities, an industry leader in affordable and mixed-income housing development in the Northeast and MidAtlantic, is proud to be the first developer in New York to participate in this program.

Supported by up to $250,000 in state funding, we’re working with Syracuse-based Northeast Green Building Consulting and Ithaca’s Taitem Engineering to review our entire 2.5 million-square-foot New York housing portfolio and design a blueprint to make all existing buildings as clean and resilient as possible while meeting clean energy requirements in new projects.

She concludes:

This is an exciting and critical time for the state and specifically for Central New York. We’re at a tipping point when it comes to both housing needs and climate change, and we should use every tool at our disposal to build the new, green communities of the future. We can’t afford to waste this moment — or this opportunity — to make positive change.

Discussion

I think all three commentaries deserve rebuttals, but they don’t deserve much time.  As I noted, Kriesel’s characterization of the climate change issue was simplistic and shallow.  Her belief that individuals can make a difference is rebutted by Lomborg.  Lichtenstein claims that readers of the paper should be motivated to support EV infrastructure because it will support the Micron semiconductor plant proposal.  I find that a stretch.  Moreover, he did not really address the costs to implement the required infrastructure.  Kovel argued that expanding access to affordable housing is important and gloms on to New York’s Climate Act building electrification efforts as a rationale.

Cynic that I am, I note that all three authors have biases in their backgrounds that I think drive their opinions.  Kriesel is a politician and is catering to a particular constituency when she repeats the climate crisis narrative.  The only thing missing was a promise to pass legislation if elected.  Mark Lichtenstein is a professional environmentalist.  His entire career has been devoted to sustainability.  In addition to his role as the executive operating and chief sustainability officer at the SUNY College of Environmental Science and Forestry he is “the founder and principal of Embrace Impatience Associates, and the principal of Lichtenstein Consulting, providing training and consultation on board development, circular economy, communications, conflict management, environmental finance, facilitation, leadership, negotiation, recycling, resiliency, and sustainability.”  Kovel is CEO of Beacon Communities, a real-estate developer that is using state money to re-develop its holdings under the guise of disadvantaged community support.  It is entirely appropriate to upgrade affordable housing, but I worry that the administrative costs of a Boston-based developer will reduce the amount of money spent on housing needs.

Conclusion

I was encouraged that I got the opportunity to explain to the readers of the Syracuse Post-Standard why I believe the ambitions for solar technology will do more harm than good.  On the other hand, it was frustrating to read three flawed commentaries the following week.  Because there are restrictions on the frequency of guest opinions I could not respond to those flaws in print.  The biased opinions of a naïve politician, a professional environmentalist whose career depends on a crisis, and a rent-seeking crony capitalist are evident with a bit of research, but I doubt that many readers will take the time.

Expert’s view of solar energy’s potential in NY is far too sunny

I have not previously published my commentary on this blog.  It was based on the post Five Reasons New Yorkers Should Not Embrace a Solar Energy Future and is included here for your information.

The June 12, 2023, commentary “Five reasons New Yorkers should embrace a solar energy future” by Richard Perez, Ph.D., claims to “clarify common misunderstandings about solar energy and demonstrate its potential to provide an abundant, reliable, affordable and environmentally friendly energy future for New York.” I disagree with his reasons.

Perez claims the Earth receives more solar energy in a week than the total annual energy consumption of all economies combined, but ignores that availability when and where needed is a critical requirement. In New York, the winter solar resource is poor because the days are short, the irradiance is low because the sun is low in the sky, and clouds and snow-covered panels further reduce availability.

“Solar technology is improving” is another claimed reason but solar energy in New York is limited because of the latitude and weather so there are limits to the value of technological improvements. If it is so good, then why does deployment rely on direct subsidies?

While solar energy may not have environmental impacts in New York, that does not mean that there are no impacts. Instead, they are moved elsewhere, likely where environmental constraints and social justice concerns are not as strict. The massive amount of mining required for the rare earth metals necessary for solar, wind, and battery technology and the eventual disposal of all the solar panels are significant unconsidered environmental issues.

Perez dismisses land use issues because “a 100% renewable PV/wind future for New York would require less than 1% of the state’s total area.” There is no mandate that solar developments meet the Department of Agriculture and Markets prime farmland protection goal. Projects approved to date have converted 21% of the prime farmland within project areas to unusable land. There is no requirement for utility-scale solar projects to use tracking solar panels, so more panels are required than originally estimated.

Perez claims that “utility-scale solar electricity has become the least expensive form of electricity generation” but that only refers to power capacity (MW). When you consider the relative amount of energy that can be produced annually, the storage needed to provide energy when the sun isn’t shining, the shorter life expectancy of PV panels, transmission support service requirements, and the need for a new dispatchable, emissions-free resource, the cost of solar energy provided when and where needed is much higher than that of conventional sources of electricity.

The suggestion that a system depending on solar energy will be more dependable than the existing system would be laughable if it were not so dangerous. The reliability of the existing electric system has evolved over decades using dispatchable resources with inherent qualities that support the transmission of electric energy. The net-zero electric system will depend upon wind and solar resources in the hope that they will be available when needed, additional resources to support transmission requirements, and a new resource that is not commercially available. This is a recipe for disaster: if the resource adequacy planning does not correctly estimate the worst-case period of abnormally low wind and solar energy availability, then the energy needed to keep the lights on and homes heated will not be available when needed most. People will freeze to death in the dark.