Five Reasons Why the Catastrophic Anthropogenic Global Warming Story is Wrong

Scott Adams (of Dilbert comic fame) recently did a video about climate persuasion titled “Scott Adams solves the climate debate and saves the world (really)”, available here on Periscope or here on Twitter, that has prompted folks to distill their arguments for and against climate change. After hearing about this, and faced with the constant barrage of media stories about the latest inevitable climate doom, I thought it would be appropriate for me to summarize five reasons why I think that the political agenda to transform the energy system of the world is not supported by sufficient scientific evidence to proceed.

The majority of the claims that anthropogenic carbon dioxide emissions are the cause of the observed warming, and that we have only a short time to do something about it, are based on projections from global climate models (GCMs). As noted below, I have relevant experience, education and background to inform my opinion. However, I believe that anyone who does research based on their personal experience, background and education can reach their own informed opinion of these claims if they actively try to get both sides of the story.

First, a bit of background on these models. If you want details, Dr. Judith Curry did a detailed overview that includes a summary description. For my purposes, all you need to know is that these models are a variation upon the meteorological models that provide the predictions everyone uses when making decisions based on weather forecasts. There are differences, but they use the same physical relationships, such as conservation of momentum, heat and mass.

I have B.S. and M.S. degrees in meteorology, and in my fourth semester of Weather Analysis & Forecasting the laboratory assignment was to break into teams and write a simple weather forecast model. I have been inside this kind of model, but most readers also have some relevant mathematical background. In particular, you may remember from algebra that with three equations you can solve for three unknowns, but with only two equations you cannot. The problem in meteorological models is that there are more unknowns than equations, so model developers have to improvise. In particular, instead of using direct relationships for every single factor that affects weather or climate forecasts, meteorologists use parameters to simulate the effects of some atmospheric processes.
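The equations-versus-unknowns point can be seen numerically. This is my own illustration, not anything taken from a forecast model: a system of three equations in three unknowns has exactly one answer, while two equations in three unknowns leave a whole family of answers, and a numerical solver can only hand back one member of that family.

```python
import numpy as np

# Three equations, three unknowns: a unique solution exists.
A = np.array([[2.0, 1.0, 1.0],
              [1.0, 3.0, 2.0],
              [1.0, 0.0, 4.0]])
b = np.array([4.0, 5.0, 9.0])
x = np.linalg.solve(A, b)               # the one fully determined answer

# Two equations, three unknowns: infinitely many solutions.
# lstsq picks a single representative (the minimum-norm solution);
# adding any null-space vector of A2 gives another equally valid answer.
A2, b2 = A[:2], b[:2]
x_one_of_many, *_ = np.linalg.lstsq(A2, b2, rcond=None)

print(np.allclose(A @ x, b))                 # True
print(np.allclose(A2 @ x_one_of_many, b2))   # True, but not unique
```

Both systems are "solved", but only the first solution is pinned down by the equations themselves; the second required the solver to make an arbitrary choice, which is the role parameters play in a model.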

The first reason that I am skeptical of any GCM results is the use of parameters, which can be thought of as “fudge factors”. Model developers necessarily have to account for some things that cannot be modeled directly. John von Neumann allegedly summed up the problem, stating that “With four parameters I can fit an elephant, and with five I can make him wiggle his trunk”[1]. In other words, he could develop a mathematical model that described an elephant simply by fudging the parameters. Everyone who makes a decision based on a weather forecast has learned that you can trust a forecast for tomorrow more than one several days away. In the 43 years since I graduated, the forecasts have become more reliable further into the future because weather forecasters have constant feedback and have been able to adjust the parameters in the meteorological models to improve forecasts based on observations. There is only one global climate system, and forecasts made today for 100 years from now cannot be checked until 100 years have passed. One insurmountable problem is that the parameters and their use in GCMs cannot be verified as correct in our lifetimes.

Another issue with the parameters is that the popular narrative focuses on just one of them. Richard Lindzen commented on this:

“Here is the currently popular narrative concerning this system. The climate, a complex multifactor system, can be summarized in just one variable, the globally averaged temperature change, and is primarily controlled by the 1-2% perturbation in the energy budget due to a single variable – carbon dioxide – among many variables of comparable importance.  This is an extraordinary pair of claims based on reasoning that borders on magical thinking.”

My final difficulty with parameters in GCMs is that they are used to model clouds. Dr. Curry explains that in order to solve the physical equations in a global climate model, the world has to be divided up into a three-dimensional grid. The equations are calculated for each grid cell and repeated to generate a forecast. My particular problem is that the grid cells needed to do these calculations are on the order of 100 km horizontally, the vertical height is often 1 km, and the calculations are done every 30 minutes or so. As a result, the models cannot simulate clouds. That single parameterization is a big enough driver of climate that this model component alone could dominate the GCM projections. This uncertainty is well understood by those in climate science who have worked with these models. However, the problems with parameterization, and their ramifications for policy decisions, are poorly understood by most of those who advocate eliminating fossil fuel use.
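Some back-of-envelope arithmetic shows why grid cells that size cannot resolve clouds. The 100 km horizontal and ~1 km vertical figures come from the paragraph above; the 30-layer count is my own round-number assumption for an order-of-magnitude estimate.

```python
# Back-of-envelope arithmetic on GCM grid size, using the rough numbers
# above: 100 km horizontal cells and roughly 1 km thick vertical layers.
earth_surface_km2 = 5.1e8            # approximate surface area of Earth
cell_area_km2 = 100 * 100            # one 100 km x 100 km grid column
vertical_layers = 30                 # assumed layer count (my assumption)

columns = earth_surface_km2 / cell_area_km2
cells = columns * vertical_layers
print(f"{columns:,.0f} columns, {cells:,.0f} cells")   # 51,000 columns, 1,530,000 cells

# An individual cumulus cloud is on the order of 1 km across, so a single
# grid column covers roughly 10,000 cloud-sized patches; the model cannot
# resolve them individually and must parameterize their aggregate effect.
```

In other words, even a million-cell grid is far too coarse for clouds, which is why they enter the models only through parameters.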

My second reason for not trusting these models is related to my experience in air pollution meteorology and work I did as a consultant to EPA evaluating the performance of air quality dispersion models. Those models are orders of magnitude simpler than climate models, and they are simple enough to be directly verified. All air quality dispersion models are based on results from field studies that injected tracers into stack effluents, measured the downwind concentrations in a test array under a wide range of meteorological conditions, and then developed coefficients for pollution dispersion. I worked on a project where we compared the known emissions and observed ambient concentrations at power plants to model projections. The results showed that the models in use at the time adequately predicted maximum concentrations, so EPA was comfortable that they were working correctly. The frightening result, in my mind, is that it was not uncommon for a model to predict a maximum concentration close to the maximum observed value while the meteorological conditions for the predicted maximum differed from the meteorological conditions for the observed maximum.

Consider the differences between the GCMs and air quality models. Air quality models use parameters that are based directly on observations, incorporate known emissions, and have been extensively verified in field studies, yet those models could still get the right answers for the wrong reasons. GCMs use parameters that are based on model developer opinions, have to estimate future emissions, and cannot be tested against the climate system. We are supposed to believe those models are getting the right answers for the right reasons. I don’t think that is a reasonable assumption for modeling that is the basis for converting the entire economy away from fossil fuels.

My third reason for not accepting the commonly held belief that catastrophe is inevitable is that the projections making that claim are only one answer in a wide range of potential outcomes. Consider, for example, that the Intergovernmental Panel on Climate Change (IPCC) does not give a single value for the sensitivity of atmospheric temperature to carbon dioxide. Instead, they give a range of potential warming for a doubling of atmospheric concentrations of carbon dioxide of between 1.5 and 4.5 degrees C. Nic Lewis does a nice job discussing climate sensitivity here. The GCM projections cover a wide range of potential outcomes, from benign to catastrophic. Without going too deep, I want to point out that the damage claims for increased carbon dioxide depend on the shape of the distribution of this sensitivity: damage estimates are strongly affected by the probability of extreme outcomes. The first problematic aspect of this issue is that the projections for high impacts rely on a relatively high probability of extreme outcomes. Although recent research has shown that the likelihood of extreme outcomes is lower than previously thought, those results have not been incorporated into the damage estimates. If they were, the costs currently claimed would be reduced.
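The point that damage estimates hinge on the tail of the sensitivity distribution, not just its central value, can be sketched with a toy Monte Carlo. Everything here is my own illustrative assumption: a convex (quadratic) damage function and two lognormal sensitivity distributions with the same median but different tail weights. None of these numbers are IPCC values.

```python
import random

random.seed(42)

def expected_damage(sigma, n=100_000):
    # Mean of a convex (quadratic) damage function over a lognormal
    # sensitivity distribution with median near 3 C. Illustrative only;
    # neither the damage function nor the distribution is an IPCC value.
    total = 0.0
    for _ in range(n):
        sensitivity = random.lognormvariate(1.1, sigma)  # exp(1.1) ~ 3.0 C median
        total += sensitivity ** 2                        # convex damages
    return total / n

thin_tail = expected_damage(0.15)  # tight spread around ~3 C
fat_tail = expected_damage(0.50)   # same median, heavier upper tail
print(thin_tail < fat_tail)        # True: the tail, not the median, drives expected damage
```

Two distributions with the same "best estimate" of sensitivity produce very different expected damages, which is why trimming the probability of extreme outcomes would lower the claimed costs.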

More problematic to me than the range of possible model outcomes is the use of the worst-case representative concentration pathway (RCP) as the business-as-usual scenario. The IPCC developed a set of four pathways to concentrations representing the range of radiative forcing (i.e., greenhouse effect) in the literature for 2100. The range of possible future atmospheric forcing levels runs from relatively low levels of carbon dioxide to the highest forcing they thought possible. The problem is that these concentrations had to be related back to emission scenarios, and the worst-case pathway, with a forcing of 8.5 watts per square meter, is so high that the emission scenario necessary to reach that level is not credible. Many of the really scary projections that get the headlines and dominate the narrative for why we need to reduce carbon dioxide emissions use RCP 8.5 as business as usual. We already know that that future emission scenario is extremely unlikely. When coupled with models that give a range of outcomes from benign to problematic, I can only conclude that while a catastrophic impact is not impossible, the probability is so low that it should not drive policy decisions.

My fourth reason for not trusting the models that claim carbon dioxide is the primary driver of the recently observed warming is that even for the limited results that can be compared to the climate system, the models don’t do well. Dr. Curry has explained that inconsistency well:

Between 1910 and 1940, the planet warmed during a climatic episode that resembles our own, down to the degree. The warming can’t be blamed on industry, she argues, because back then, carbon-dioxide emissions from burning fossil fuels were small. In fact, Curry says, “almost half of the warming observed in the twentieth century came about in the first half of the century, before carbon-dioxide emissions became large.” Natural factors thus had to be the cause. None of the climate models used by scientists now working for the United Nations can explain this older trend. Nor can these models explain why the climate suddenly cooled between 1950 and 1970, giving rise to widespread warnings about the onset of a new ice age.

The final reason that I believe the political agenda to transform the energy system of the world is not supported by sufficient evidence to be credible is the suggestion that climate change is easy to solve because renewables are a viable solution. As convinced as I am that the climate science does not support this agenda, I believe the suggestion that wind and solar can solve our energy problems is even more of an exaggeration. In fact, that issue is the primary reason I blog and have written so much about New York’s climate change plans. However, don’t listen to me; listen to Bill Gates: “The idea that we have the current tools and it’s just because these utility people are evil people and if we could just beat on them and put (solar panels) on our rooftop—that is more of a block than climate denial,” Gates said. “The ‘climate is easy to solve’ group is our biggest problem.” Another problem is that the renewable “solution” very likely has significant environmental impacts that are generally ignored. Michael Shellenberger gave a TED talk, “Why renewables can’t save the planet”, that addresses this issue.

To sum up: the rationale used to justify the need to convert the energy system of the world is that carbon dioxide will cause inevitable catastrophe and that we can be saved only if we implement renewable wind and solar, which will be easy to do. I don’t believe the science supports inevitable catastrophe, because those projections are based on global climate models. Those models use too many fudge factors that can give too many results that can never be tested; much simpler models that are based entirely on observations can give the right answer for the wrong reason; and the model results to date do not adequately reproduce the one climate experiment we can test. Most of the catastrophic outcomes that dominate the political and media narrative depend on an emissions scenario that is not credible. And I do not believe that diffuse and intermittent solar and wind can replace reliable and affordable electric power, much less generate enough energy to convert transportation, heating, and industrial use of fossil fuel to electricity.

[1] Attributed to von Neumann by Enrico Fermi, as quoted by Freeman Dyson in “A meeting with Enrico Fermi” in Nature 427 (22 January 2004) p. 297


Recommended Read: Global Warming for the Two Cultures

I have the education, background and experience to independently evaluate the constant drum beat claiming imminent and inevitable climate catastrophe if we don’t immediately reduce our carbon footprint. I am a luke-warmer who believes that the sensitivity of climate to anthropogenic carbon dioxide emissions is at the bottom of the Intergovernmental Panel on Climate Change range. At that level, climate catastrophe is a very unlikely possibility and the effect is much more likely to be benign.

Unfortunately, it is very frustrating to hold my position because the media, politicians and advocacy groups have convinced many that we have to use renewables as a “solution” to what I think is a non-existent problem. As a result, I am always looking for a good summary of the issues I have with the imminent-climate-catastrophe narrative. The 2018 Global Warming Policy Foundation Annual Lecture, “Global Warming for the Two Cultures” by Dr. Richard Lindzen, is an excellent summary that I recommend to those who believe we need to transform the energy system to do “something” about climate change, so that they will at least have heard the other side of the story.

Lindzen begins his talk by describing two cultures in society and the implications for policy decisions. Basically, the two cultures are those who understand “science” in general and physics in particular, and those who don’t. He explains why this understanding gap is a problem:

While some might maintain that ignorance of physics does not impact political ability, it most certainly impacts the ability of non-scientific politicians to deal with nominally science-based issues. The gap in understanding is also an invitation to malicious exploitation. Given the democratic necessity for non-scientists to take positions on scientific problems, belief and faith inevitably replace understanding, though trivially oversimplified false narratives serve to reassure the non-scientists that they are not totally without scientific ‘understanding.’ The issue of global warming offers numerous examples of all of this.

One of my problems with the media climate change story is its portrayal of the greenhouse effect as simple. His lecture describes the complicated climate system in enough detail to support my contention that the inevitable-and-imminent climate catastrophe story is an exaggeration.

I particularly like his description of the popular narrative we hear from the media and politicians:

Now here is the currently popular narrative concerning this system. The climate, a complex multifactor system, can be summarized in just one variable, the globally averaged temperature change, and is primarily controlled by the 1-2% perturbation in the energy budget due to a single variable – carbon dioxide – among many variables of comparable importance.

This is an extraordinary pair of claims based on reasoning that borders on magical thinking. It is, however, the narrative that has been widely accepted, even among many sceptics.

He then goes on to describe how he believes the popular narrative originated and debunks the evidence we are constantly reminded supports the catastrophic narrative.

I encourage you to read the entire lecture. I believe it supports his concluding summary of the situation:

An implausible conjecture backed by false evidence and repeated incessantly has become politically correct ‘knowledge,’ and is used to promote the overturn of industrial civilization.

Temperature-Related Deaths

Environmental advocates claim heat results in more deaths than any other weather-related event, but I recently read a conflicting claim about weather-related deaths. The New York City Environmental Justice Alliance released a new report, NYC Climate Justice Agenda 2018 – Midway to 2030: Building Resiliency and Equity for a Just Transition, that claims “Extreme heat results in more deaths than any other weather-related event”. On the other hand, a study in the Lancet, “Mortality risk attributable to high and low ambient temperature: a multicountry observational study”, notes that “most of the temperature-related mortality burden was attributable to the contribution of cold”. I did some research and now I think I know what is behind these two differing claims.

The NYC Climate Justice Agenda bases its claim that extreme heat causes more deaths than cold on an EPA reference. The EPA extreme heat webpage uses data from the National Oceanic and Atmospheric Administration Natural Hazard Statistics: Weather Fatalities website. The home page for that site lists 508 fatalities from all weather events in 2017, including 107 from extreme heat, 26 from extreme cold, 10 from winter storms, 1 from ice, and 3 from avalanches. Those data show that more people died from extreme heat than from any other cause, narrowly beating out flash floods, and that more people die from heat than from cold-related events. The data for the website are compiled from information in the National Weather Service (NWS) storm events database.

The Global Warming Policy Foundation April 9, 2018 newsletter reported that 48,000 Britons died this winter due to cold weather. Those numbers are obviously far different from the NWS data. The Lancet paper by Gasparrini et al. notes that:

Although consensus exists among researchers that both extremely cold and extremely hot temperatures affect health, their relative importance is a matter of current debate and other details of the association remain unexplored. For example, little is known about the optimum temperatures that correspond to minimum effects for various health outcomes. Furthermore, most research has focused on extreme events and no studies have comparatively assessed the contribution of moderately high and low temperatures. The underlying physiopathological mechanisms that link exposure to non-optimum temperature and mortality risk have not been completely elucidated. Heat stroke on hot days and hypothermia on cold days only account for small proportions of excess deaths. High and low temperatures have been associated with increased risk for a wide range of cardiovascular, respiratory, and other causes, suggesting the existence of multiple biological pathways.

I believe that the reason for the difference in the two conclusions is explained by this statement by Gasparrini et al.: “The dose-response association, which is inherently non-linear, is also characterised by different lag periods for heat and cold—i.e., excess risk caused by heat is typically immediate and occurs within a few days, while the effects of cold have been reported to last up to 3 or 4 weeks.”

According to the NWS instructions for storm data preparation, the storm data report documents:

  • The occurrence of storms and other significant weather phenomena having sufficient intensity to cause loss of life, injuries, significant property damage, and/or disruption to commerce;
  • Rare, unusual, weather phenomena that generate media attention, such as snow flurries in South Florida or the San Diego coastal area; and
  • Other significant meteorological events, such as record maximum or minimum temperatures or precipitation that occur in connection with another event.

The key point is that the storm data report makes a distinction between direct and indirect deaths. Only direct deaths are tabulated when a local weather office prepares the storm report. For example, in winter storms, deaths from heart attacks while shoveling snow are indirect. If a person wanders outside and freezes to death, that is a direct death. Furthermore, while indirect deaths are included in the storm narratives, the numbers are not included in the tabulated data, and storm reports are prepared within days of the event, so any indirect deaths caused by excessive cold with impacts lagging by weeks would not be included. Details on the difference between direct and indirect deaths are found in the instruction document on pages 9 to 12.

In their study, Gasparrini et al. found that temperature is responsible for 7.7% of mortality, and cold was responsible for “most of the burden”. Although over 90% of that burden was attributed to cold, the paper noted that “This difference was mainly caused by the high minimum-mortality percentile, with most of the mean daily temperatures being lower than the optimum value”. I interpret that to mean that some of the difference was due to their classification methodology. In line with the indirect-death distinction, it is notable that over 85% of the mortality attributable to temperature was related to moderate cold. Offhand, I think there must be more causes of death associated with freezing weather than with hot weather. For example, auto accidents on icy roads have to cause more deaths than any hot-weather impact on travel.

In conclusion, there is a data base that does show that extreme heat results in more deaths than any other weather-related event. However, the database used to justify that claim only includes direct deaths. An epidemiological study that does include indirect deaths concludes the majority of deaths are associated with moderate cold weather.

Relative to climate change policy, the distinction between heat and cold is important. If the argument is that we must mitigate human impacts on climate to reduce temperature-related mortality, then, because a warming climate will result in less moderate cold, warming will have a beneficial effect. An unintended consequence of climate change mitigation through the implementation of renewable energy is the universal increase in cost. Given the impacts on indirect deaths, I believe that increased heating costs will adversely affect mortality if low-income people cannot afford to keep their homes warm enough to prevent the potential health impacts of cold weather. Finally, the fact that climate is a reason many more people move to Phoenix, AZ than to the “ice box of the nation”, International Falls, MN, suggests we are better able to adapt to warm than to cold.

Climate change soon to be main cause of heat waves in West, Great Lakes

A recent study entitled Early emergence of anthropogenically forced heat waves in the western United States and Great Lakes was publicized in the Syracuse, New York Post-Standard under the headline Upstate NY among first to have most heat waves due to climate change. Unfortunately, as Blair King writes, “it actually represents a quite excellent example of how science is misrepresented to the public in the climate change debate.”

According to a press release: “Lopez and colleagues used climate models along with historical climate data to project future heat wave patterns. They based their findings on the projection for greenhouse gas emissions this century, known as the RCP8.5 scenario. This assumes high population with modest rates of technological change and energy conservation improvements and is often called the “business as usual” scenario. Lopez said he based the research on this climate scenario because historical greenhouse gas emissions have to date aligned with this projection.”

My concern, and that of Blair King, is the use of the RCP8.5 scenario. This is a representative concentration pathway representing a forcing of 8.5 watts per square meter, used by climate modelers to represent the worst-case atmospheric effect of greenhouse gases by 2100. Essentially, this emissions scenario was developed to provide that forcing level.

Larry Kummer looked at the scenario in detail. He notes that “It assumes the fastest population growth (a doubling of Earth’s population to 12 billion), the lowest rate of technology development, slow GDP growth, a massive increase in world poverty, plus high energy use and emissions.” His post explains that RCP8.5 assumes population growth at the high end of the current UN forecasts, assumes that the centuries-long progress of technology will slow, and assumes no decarbonization of world power sources from new technology (e.g., solar, wind, fission, fusion) or regulations to reduce not just climate change but also air pollution and toxic waste.

Blair King explains that RCP8.5 has a storyline that describes the assumptions of the scenario in easy to understand language. He goes on to explain that the RCP8.5 scenario dates back to 2007 and is characterized by the following:

  • Lower trade flows, relatively slow capital stock turnover, and slower technological change;
  • Less international cooperation than the A1 or B1 worlds. People, ideas, and capital are less mobile so that technology diffuses more slowly than in the other scenario families;
  • International disparities in productivity, and hence income per capita, are largely maintained or increased in absolute terms;
  • Development of renewable energy technologies are delayed and are not shared widely between trade blocs;
  • Delayed land use improvements for agriculture resulting in increased pollution and increased negative land use emissions until very late in the scenario (close to 2100);
  • A rebound in human population demographics resulting in human population of 15 billion in 2100; and
  • A 10-fold increase in the use of coal as a power source and a move away from natural gas as an energy source.

Consider those assumptions against what has actually happened since 2007. I am not sure about the status of international disparities in productivity and land use improvements. However, I believe none of the other assumptions are being borne out. Global trade is at all-time highs, renewable technology is freely traded, renewable technology continues to mature and develop, and there is no sign of human population growth accelerating toward 15 billion. Most importantly, this scenario pre-dates the fracking revolution that has flipped the use of coal and natural gas in the United States by making natural gas so cheap and plentiful. There is no reason to believe that the technology won’t expand elsewhere and markedly reduce any potential increase in the use of coal as a power source.

Lopez states that “he based the research on this climate scenario because historical greenhouse gas emissions have to date aligned with this projection.” He is either ignorant of the substantial change in greenhouse gas emissions observed in the United States or willfully ignored those numbers to misrepresent the science to the public.

Smithsonian: Capture the Sun, Harness the Wind

I am so tired of the Smithsonian’s unquestioning devotion to renewable energy, in spite of obvious warning signs, that I wrote a letter to the editor. In the April 2018 Smithsonian there is an article entitled “The Future’s so Bright (He’s Gotta Wear Shades)” by Dan Solomon and a related graphic article, “Capture the Sun, Harness the Wind”, by 5W Infographics. These articles are essentially puff publicity pieces for the renewable energy industry that clearly show the Smithsonian’s bias on renewable energy. Nonetheless, they cannot escape inconvenient facts.

The most obvious problem with Solomon’s article on the bright future of renewable energy in Georgetown, Texas is that renewable energy looks great for the early adopters, but the reality of a 100% reliable electric system lies beneath that success. The situation is exactly like a pyramid scheme, where the first ones in reap the benefits. When Solomon’s article notes that “about 2% of the time the Georgetown utility draws electricity derived from fossil fuels”, an unbiased article would have followed up on the implications. The primary support for the fossil fuels necessary to keep Georgetown’s lights on comes from everybody else. As more rent-seekers pile onto the renewable energy bandwagon, the costs necessarily increase for those on the outside. As you dig deeper, it becomes apparent not only that price support for the rest of the electric system becomes more likely, but also, because solar and wind don’t support grid services, that another layer of support has to be added at some point beyond 30% renewable penetration. At this time Georgetown is not paying for any of that.

The first graph in the graphic article shows “The comparison to coal”, which charts the actual 2016 electricity generation from coal and renewable sources and projects changes in their use out to 2050. Comparing the 2016 coal figure of 1,216 billion kWh and the renewable sources estimate of 573 billion kWh with EIA numbers shows that those numbers are close enough not to quibble over. However, the title of the article refers to sun and wind, and the generation in those categories is lumped together with hydro, biomass, geothermal, and other gases. As far as I can tell, solar and wind account for less than half of the 573 billion kWh. On the other hand, most of the future renewable growth will occur in the wind and solar sectors, but the graphic does not provide that information. Neither article mentions just how much wind and solar generation will be needed to meet the projected 2050 number.

Another graphic notes that 800 MW of energy storage were built in the United States in the last five years and expects that the same amount will be built in 2020 alone. The important number is how many MWh will be available from the energy storage built, because that defines how much energy will be available to counteract renewables’ intermittency. Solomon’s article also did not address how much storage would be needed for Georgetown to get off the grid. Neglecting to point out that, because intermittent renewables struggle to generate power over a third of the time, we will likely have to over-build renewable capacity and add massive amounts of energy storage biases the renewable argument.
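The MW-versus-MWh distinction is worth a quick illustration. MW measures power (a rate); MWh measures energy (an amount); an 800 MW figure by itself says nothing about duration. The battery duration and Georgetown-sized average load below are my own round-number assumptions, not figures from either article.

```python
# MW measures power (a rate); MWh measures energy (an amount). The 800 MW
# figure says nothing about duration, so assume 4-hour batteries, a common
# utility-scale configuration (my assumption, not a number from the article).
storage_power_mw = 800
assumed_duration_h = 4
storage_energy_mwh = storage_power_mw * assumed_duration_h   # 3,200 MWh

# Assumed round-number average load for a city the size of Georgetown,
# purely for illustration.
assumed_avg_load_mw = 150
hours_of_backup = storage_energy_mwh / assumed_avg_load_mw

print(f"{storage_energy_mwh} MWh covers about {hours_of_backup:.0f} "
      f"hours at {assumed_avg_load_mw} MW")
```

Under those assumptions the entire five-year national build would cover less than a day of one small city’s load, which is why the MWh number, not the MW number, is the one that matters.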

One of the inconvenient facts illustrated but not noted in the graphic article is jobs per unit of energy produced. If you divide the total coal generation by the number of coal-industry employees in 2016, you get 24.3 million kWh produced per employee. If you divide half of the reported renewable sources generation by the sum of the solar and wind employees in 2016, you get 0.8 million kWh produced per employee. Coal is 30 times more man-power efficient. While that may be good for employment, it does not portend well for cost.
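That arithmetic can be reconstructed. The generation figures and the half-of-renewables share come from the discussion above; the employee counts are approximate 2016 figures that are consistent with the per-employee results, and should be treated as my assumptions rather than numbers from the graphic.

```python
# Rough reconstruction of the jobs-per-energy arithmetic above.
coal_generation_kwh = 1_216e9        # 1,216 billion kWh of coal generation
renewable_generation_kwh = 573e9     # 573 billion kWh, all renewable sources
wind_solar_share = 0.5               # wind + solar taken as ~half of renewables

coal_employees = 50_000              # approximate 2016 figure (my assumption)
wind_solar_employees = 360_000       # approximate 2016 figure (my assumption)

coal_kwh_per_employee = coal_generation_kwh / coal_employees
ws_kwh_per_employee = (renewable_generation_kwh * wind_solar_share
                       ) / wind_solar_employees

print(f"coal: {coal_kwh_per_employee / 1e6:.1f} million kWh/employee")
print(f"wind+solar: {ws_kwh_per_employee / 1e6:.1f} million kWh/employee")
# The ratio comes out at roughly 30, matching the comparison above.
print(f"ratio: {coal_kwh_per_employee / ws_kwh_per_employee:.1f}x")
```

The exact employee counts move the ratio a little, but any counts near the 2016 figures leave coal producing tens of times more energy per worker.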

Other than the fact that the duck curve is graphically interesting, I am not sure why it was included in the graphic article. More importantly, it illustrates a problem: when you have large amounts of solar on the system, something has to be available to make up for the evening demand. That is where storage becomes necessary. In order to keep the lights on, you also need enough storage to cover those days when there isn’t any sun. Dale Ross’s flippant we-are-in-West-Texas “Cloudy, Really?” comment aside, a quick check of the climatological data indicates that it is mostly cloudy 28% of the time in Georgetown. Despite the claim that Georgetown is powered entirely by renewable energy, the fact is that it is not.

The Solomon article has multiple instances of conveniently neglected facts that make the story work. It notes that the City was able to get guaranteed rates of 20 years for wind power and 25 years for solar power. It would have been appropriate to note that these types of facilities have very little operational experience over such periods, so the guarantees might not be as iron-clad as implied. Solomon quotes Adam Schultz as saying that solar and wind have gotten so much cheaper that “I can’t even tell you the costs because costs have been dropping so rapidly”. If that is the case, then why do both solar and wind need direct subsidies? Finally, blowing off the environmental impact of renewables on birds by saying that more birds are killed by cats and buildings reminds me of the adage that two wrongs don’t make a right. Furthermore, what about the bats, and just how many raptors are killed by house cats? The fact is that because renewable energy is diffuse, wildlife impacts are a legitimate concern.

Those are the superficial errors illustrating biases. The reality is that because wind and solar are diffuse, the electric grid is essential for keeping the lights on. Digging into this problem is more complicated but necessary for the complete, unbiased story of renewables. I recommend this post by Planning Engineer at the Climate Etc. blog for an overview of the transmission planning difficulties raised by wind and solar energy. In brief, the modern power grid is a connected, complex machine that balances various electrical and mechanical properties over thousands of miles. The system must be stable, that is to say, it must stay in synchronism, balance load and generation, and maintain voltages following system disturbances. The grid is built upon heavy rotating machinery at hydro, nuclear, and fossil-fired generating stations, which provides that stability. Wind and solar power do not provide any stability support. Up to a point the present-day grid is resilient enough to overcome that problem, but eventually it has to be addressed. I don’t doubt that it can be addressed successfully, but the costs of doing so are unknown and were certainly not a consideration in either article.

The reality of solar and wind power not addressed in this article is that, because both resources are intermittent and diffuse, they are likely to completely supplant fossil fuels only in limited locations where solar and wind potential are both high, industrial load requirements are negligible, and the winter weather is mild. Texas has large wind and solar resources because of its geography, and because it is so large there is enough space to generate significant amounts. Georgetown, TX does not have heavy industry that requires large amounts of electricity constantly, so it can pretend to be powered entirely by renewable energy. Finally, Georgetown does not have to contend with the winter impacts of higher latitudes, particularly home heating. The solar resource is much reduced in winter simply because the days are shorter, but you must also consider reductions due to snow-covered rooftop solar cells.

News from NY Office of Climate Change

The New York State Department of Environmental Conservation (DEC) Office of Climate Change publishes a regular email that lists the latest climate news. The latest edition shows that the news has to be consistent with their preconceived notions of global warming. In this edition they use information to prove their case for climate change problems while at the same time claiming that similar information cannot be used to suggest climate change is not a problem. Talk about trying to have their cake and eat it too.

Before proceeding, a disclaimer: before retiring from the electric generating industry, I actively analyzed air quality regulations that could affect company operations. The opinions expressed in this post do not reflect the position of any of my previous employers or any other company I have been associated with; these comments are mine alone.

There are two articles that show the inability of the Office of Climate Change to really understand that there are two sides to the issue of climate change. The lead article is a picture of extensive ice at Niagara Falls with the following caption: “Extreme cold at the end of 2017 has frozen all but the moving water at Niagara Falls. Recent research suggests that a warming arctic may be contributing to cold snaps like this one in the Northeastern U.S. as a result of a weakened polar vortex.”

Also included, under the title “Science”, is a quote from A Response for People Using Record Cold U.S. Weather to Refute Climate Change, published December 28, 2017:

“Weekly or daily weather patterns tell you nothing about longer-term climate change (and that goes for the warm days too). Climate is defined as the statistical properties of the atmosphere: averages, extremes, frequency of occurrence, deviations from normal, and so forth. The clothes that you have on today do not describe what you have in your closet but rather how you dressed for today’s weather. In reality, your closet is likely packed with coats, swimsuits, t-shirts, rain boots, and gloves. In other words, what’s in your closet is a representation of ‘climate.’”

I agree completely that weekly or daily weather patterns are no indicator of longer-term climate change. If it is not immediately obvious, the “recent research” caption about the cold weather does exactly that: it uses a weather pattern as an indicator of longer-term climate change. I am sorry, but you cannot have it both ways.

If the Office of Climate Change deigns to correct this, they might also want to mention to the Governor that he is consistently guilty of the same thing. He refers to Superstorm Sandy as devastation related to climate change and has mentioned the November 2014 Buffalo lake-effect snowstorm as further proof. Both were caused by short-term weather patterns. To prove otherwise, historical weather patterns would have to be evaluated to determine whether there was a change over time. In my opinion, running a climate model to claim causation is dubious at best.

Great Lakes Vineyard Confronts Climate Change

There are two aspects of the recent presentation Great Lakes Vineyard Confronts Climate Change that need to be considered: scare-mongering by anecdote and Brandolini’s BS principle. It is a sad commentary on the media today that this presentation had so little substance other than anecdotal “evidence” that climate change is adversely affecting vineyards in the Great Lakes. Showing that the presumptions in the presentation are weak is a perfect example of Alberto Brandolini’s BS principle: “The amount of energy necessary to refute BS is an order of magnitude bigger than to produce it.”

Anecdotal Evidence

Angelica A. Morrison’s newscast claims that “Problems from disease, like powdery mildew, and pests arise when temperature extremes become the new way of life.” She interviews a farmer who shows her some diseased plants that purportedly demonstrate the effects of climate change in western New York. The evidence for extreme weather change is the farmer’s recollection: “We’ve had a very mild winter [in 2016] so almost everything survived,” he said. “But prior to that, the winter of 2014 to 2015 had extremely cold temperatures that I’ve never seen before. And it killed a lot of vineyards that in the past we’ve had success with. We’ve done a lot of replanting and we try to choose varieties that can survive the winter.”

The presentation explains that the vineyard in question is in the Lake Erie Concord Grape Belt, which starts in western New York and extends to Pennsylvania and goes on to note that the area depends on Lake Erie to moderate temperatures. “The lake is supposed to be our great protector,” says Tim Weigle of the Cornell Cooperative Extension’s Lake Erie Regional Grape Program. The presentation notes that “Weigle, who advises grape farmers and works with them on managing their crops, says the lake doesn’t freeze over like it used to. When temperatures are prematurely warm, crops come out of dormancy, making them vulnerable to frost. ‘If the lake freezes then we don’t have those problems, but since it hasn’t been freezing all the time, we have run into more problems with frost and freezes,’ he says.”


There are two claims in this presentation. The first is that in the winters of 2014 and 2015 there were extremely cold temperatures that the farmer has “never seen before”. The second is that when Lake Erie freezes over temperatures don’t warm up prematurely so crops are not damaged coming out of dormancy before the last killing frost of the season.


Complications immediately arise in trying to prove or refute the claims in this presentation. There is no question that there is a warming trend in this region, but what causes winter damage in the first claim? If damage occurs because of the lowest temperature of the year, that can be checked easily; but if it is the duration or number of days below some threshold temperature, the analysis gets complicated quickly. For the second claim, if the problem is a warm period long enough to break dormancy followed by a killing frost, the trend analysis is even more complicated.

The second claim confuses me, in particular this statement: “If the lake freezes then we don’t have those problems, but since it hasn’t been freezing all the time, we have run into more problems with frost and freezes”. Lake Erie moderates air temperatures because the seasonal lake temperature lags behind the seasonal air temperature. As a result, in the fall frosts don’t occur as early because the warmer lake tempers the freezing air. In the spring the lake moderates both warming and cooling: it is generally cooler than the air, so it slows the plants coming out of dormancy, but it also protects them if a cold snap comes along because its temperature is above freezing. My problem with the statement is that those effects are eliminated when Lake Erie freezes over. When the lake is frozen, downwind air temperatures are no longer moderated by a source of above-freezing water and, most visibly, the Lake Erie lake-effect snow machine is cut off. Therefore, the moderating effect on frosts and freezes should be enhanced, not reduced, if the lake does not freeze over.


The New York Climate Change Science Clearinghouse is described by its supporters as “a regional gateway to data and information relevant to climate change adaptation and mitigation across New York State. It provides climate science data and literature and other resources for policy-makers, practitioners, and the public, to support scientifically sound and cost-effective decision making”. I tend to be a little more cynical about its contents because it is biased towards alarmism. However, it does provide anyone with easy access to relevant climate data.

The Climate Data Grapher – Station Temperature includes annual average minimum temperature for Fredonia, NY in addition to the following parameters:

  • Daily maximum temperature (F)
  • Daily minimum temperature (F)
  • Daily average temperature (F)
  • Growing degree day accumulation, base 50 F
  • Heating degree day accumulation, base 65 F
  • Cooling degree day accumulation, base 65 F
  • Counts of days with max temperature above 90 F
  • Counts of days with max temperature above 95 F
  • Counts of days with max temperature above 100 F
  • Counts of days with min temperature below 0 F
  • Counts of days with min temperature below 32 F
  • Growing season length (days)

All of these parameters show what we would expect in a warming climate: daily minimum, maximum, and average temperatures are increasing; cooling degree days, growing degree days, and growing season length are increasing; heating degree days are decreasing; and counts of warm days are increasing while counts of cool days are decreasing.

With respect to the first claim, that during the winters of 2014 and 2015 there were extremely cold temperatures the farmer had “never seen before”, we can check it by looking at the count of days below 0 F. Unfortunately the Fredonia monitoring site stopped operating in 2011, so I used the nearby Buffalo airport site. In 2014 there were six days of below-zero temperatures and in 2015 there were 12. In 1979 there were 11 days, and looking back there is nothing that unusual about six days, so “never seen before” is not verified. In fact, between 1976 and 1985 there was only one year with fewer than six such days.
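The kind of check described above is easy to script once you have daily minima in hand. Here is a minimal sketch against toy data; the real counts would come from a station record such as NOAA’s daily observations, which are not reproduced here.

```python
from collections import defaultdict
from typing import Dict, Iterable, Tuple

def days_below(records: Iterable[Tuple[int, float]],
               threshold_f: float = 0.0) -> Dict[int, int]:
    """Count days per year with a daily minimum temperature below threshold_f.
    records: iterable of (year, daily_min_temp_F) pairs."""
    counts: Dict[int, int] = defaultdict(int)
    for year, tmin in records:
        if tmin < threshold_f:
            counts[year] += 1
    return dict(counts)

# Toy data only -- not actual Buffalo observations.
toy = [(2014, -3.0), (2014, 5.0), (2014, -1.5), (2015, -10.0), (2015, 12.0)]
print(days_below(toy))   # {2014: 2, 2015: 1}
```

Running the same count over a full multi-decade record is what lets you put a single cold winter in historical context.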

Unfortunately, none of the parameters on the Climate Data Grapher can be used to necessarily support or refute the second claim about ice cover and dormancy. A graph of annual maximum ice cover for Lake Erie (available from the NOAA Great Lakes Environmental Research Lab) does support the claim that the lake does not freeze over as much as in the past but as is the case with readily available temperatures it may be the duration and timing of ice cover that affect crop dormancy.

As I explained above, I don’t think ice cover affects dormancy, but determining whether there is a trend in dormancy is a bigger analysis than I can handle. First you would have to determine the conditions that break dormancy: temperature and the duration of temperature above some threshold. If the potential effect is exacerbated by frozen ground, that has to be included. Daily maximum and minimum temperature data are readily available, but you would need to develop a program to analyze that data to determine the annual end of dormancy and the date of the last killing frost. If you can show that the end of dormancy is coming earlier in the year and the date of the last killing frost is not, that would support the claim. If the date of the last killing frost is also coming earlier, that would not support the claim. More important still would be to see how often a late frost caused problems with plants historically.
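A first cut at the program described above might look like the sketch below. The thresholds (28 F as a killing frost, a week of warm daily means to break dormancy) are placeholder assumptions that a horticulturist would need to replace with real values.

```python
from datetime import date
from typing import List, Optional, Tuple

# Placeholder thresholds -- assumptions, not horticultural facts.
KILLING_FROST_F = 28.0   # assumed killing-frost temperature
WARM_MEAN_F = 50.0       # assumed daily-mean temperature that counts as "warm"
WARM_RUN_DAYS = 7        # assumed consecutive warm days needed to break dormancy

Day = Tuple[date, float, float]  # (date, tmin_F, tmax_F), one calendar year in order

def dormancy_break(days: List[Day]) -> Optional[date]:
    """Date ending the first run of WARM_RUN_DAYS days whose mean temp exceeds WARM_MEAN_F."""
    run = 0
    for d, tmin, tmax in days:
        run = run + 1 if (tmin + tmax) / 2 > WARM_MEAN_F else 0
        if run >= WARM_RUN_DAYS:
            return d
    return None

def last_spring_frost(days: List[Day]) -> Optional[date]:
    """Latest first-half-of-year date with tmin at or below KILLING_FROST_F."""
    frosts = [d for d, tmin, _ in days if d.month <= 6 and tmin <= KILLING_FROST_F]
    return max(frosts) if frosts else None

def frost_after_dormancy(days: List[Day]) -> bool:
    """True when a killing frost follows the end of dormancy -- the damage scenario."""
    brk, frost = dormancy_break(days), last_spring_frost(days)
    return brk is not None and frost is not None and frost > brk
```

Run year by year over the historical record, the `frost_after_dormancy` flag is what you would trend to support or refute the claim; the frozen-ground complication mentioned above is left out of this sketch entirely.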


This presentation illustrates the media’s problems with climate change reporting. It furthers the narrative that climate change effects are happening now for a public that has neither the time nor the expertise to evaluate the claims. I heard it on NPR – it must be true. If anyone wants illustrations of two of my pragmatic environmentalist principles, it offers vivid examples: this is a sound-bite environmental news report, and refuting its baloney took at least an order of magnitude more work. Supporting or refuting the dormancy claim would take another order of magnitude of effort.

Real Climate and Cost Effectiveness

Update 9/1/2017: My responses showed up on Real Climate. The timing on the posts indicates they were there all along, but for some reason I could not see them?!?

On August 28 I read a post called Sensible Questions on Climate Sensitivity on the Real Climate blog. In the comment section there was a comment that, in my opinion, mis-characterized the position of the luke-warmers I know, so I responded. I have heard that comments at this site are censored so that those that do not comply with the positions of the web masters are not shown, but I had never had personal experience with it. Although my first comment was posted, my responses to the follow-up comments were not. Because my original comment included a link to this blog, I am posting the comments and my responses here. If you wanted a response to your comments and checked my blog, here you go. If the comments eventually show up there, I will delete this post.

The relevant comments are shown below. The particular comment (#25) that engendered my response claimed that luke-warmers insist “that published ECS confidence limits can only mean the most cost-effective public policy is to do nothing.” First an explanation then the comments and finally a conclusion.

I consider myself a luke-warmer. Luke-warmers are not a well-defined party in the global warming debate. My definition is that we simply believe the equilibrium climate sensitivity or ECS (the amount of warming caused by greenhouse gases) is at the lower end of the Intergovernmental Panel on Climate Change range. As soon as you get to policy responses, luke-warmers diverge, but none of those who also claim to be luke-warmers (e.g., Tom Fuller, Blair King and Judith Curry) think public policy should be to do nothing.

I don’t want to speak for their particular public policy opinions, so I will give my own preference. I don’t think the climate is so sensitive to GHG that we have to use current renewable technology to stave off catastrophe, but I also don’t think we can ever be so sure of the future climate as to say it is not a problem or that it will be a catastrophe. Therefore I am convinced that we have to develop cheaper low-carbon technology, because as long as fossil fuels are cheaper they will be used. Fossil fuels are the best thing that ever happened to mankind; without them our lives would be brutal and short. Until we have a replacement that can provide the benefits of abundant and affordable power, it is immoral not to use fossil fuels. In the meantime, because society is not resilient to current extreme weather, I think that in addition to funding research into cheaper fossil-fuel alternatives we should be spending money on adapting to extreme weather rather than subsidizing any current-technology renewable energy.


Here are the blog comments. I responded to comment 25 in comment 29. Comments 30, 31, 32, and 33 were posted in response to my comment. I submitted two responses that went to moderation. I copied the text and pasted it into a document for archival. Several hours later they disappeared and were not posted.

25   Mal Adapted says:

22 Aug 2017 at 9:35 AM


Interpretation: There is enough uncertainty that a little humility need apply.

Scientists, like all genuine skeptics, are required to be humble before Nature. It’s necessary, though not sufficient, to not fooling themselves. Pseudo-skeptical AGW deniers, OTOH, who keep saying “it’s not happening”, “it’s not our fault”, “it won’t be bad” or “We’ll be lucky” are letting hubris fool them.

That last AGW-denier meme has been labeled ‘luck-warmerism’. Luckwarmers selectively mask the upper half of the ECS PDF, while falsely accusing climate realists of masking the lower half. The luckwarmer insists that published ECS confidence limits can only mean the most cost-effective public policy is to do nothing.

How about you, Dan? Are there limits to your confidence?

29   Roger Caiazza says:

28 Aug 2017 at 11:20 PM

Mal Adapted, Two questions relative to “The luckwarmer insists that published ECS confidence limits can only mean the most cost-effective public policy is to do nothing.”

This Luke warmer thinks that the trend in ECS confidence limits is shrinking the fat tail. Is that what you mean by masking the upper half?

However the science ends up, this Luke warmer thinks that adaptation is a more cost-effective public policy than mitigation. Do you think that today’s technology is capable of mitigating our way out of GHG forcing climate change?

30   zebra says:

29 Aug 2017 at 9:36 AM

Roger Caiazza #29,

What exactly does “cost effective” mean in this context?

I’m always hearing this sky-is-falling claim about the “economic disaster” that is supposed to result from doing things like installing solar panels or building wind farms or driving EV or reducing energy consumption, and so on.

But, I never see any numbers! And, I never see any cause and effect!

If you want to deal with fat tails, let’s do it on both sides of the discussion. Why should anyone give any credibility to your alarmist position, given that you don’t have any basis for it other than your opinion?

31   MartinJB says:

29 Aug 2017 at 12:59 PM

Roger (@19): SO, “however the science ends up” you think that adaptation is cost effective. Interesting… that suggests that you have a preconceived notion of what you’re willing to do that is not dependent on the reality of the situation. I will freely admit that at a low enough level of ECS, adaptation is more cost effective. But surely, the higher the ECS the more relatively cost-effective mitigation becomes. What’s more, mitigation is more likely to cut off some of the tail risk than adaptation.

One other thing to consider: In general, efforts at mitigation do a better job of assigning more of the costs of dealing with global warming to the the people who have contributed more to in-the-pipeline global warming. Admittedly, that is not a strictly economic metric and likely puts more cost on me and mine, but I am happy to give up a little for the purpose of justice and equity…

32   Mal Adapted says:

29 Aug 2017 at 6:16 PM

Roger Caiazza:

However the science ends up, this Luke warmer thinks that adaptation is a more cost-effective public policy than mitigation.

Uh…if you don’t care about the science, what makes you think adaptation is a more cost-effective public policy than mitigation? Stipulating, of course, that adaptation might be a more cost-effective private policy for you, even if climate sensitivity ends up to be above the modal estimate. If you’re lucky, that is.

33   Phil Scadden says:

29 Aug 2017 at 7:50 PM

Roger – your opinion based on hope, preference – or some actual peer-reviewed analysis of numbers that you would like to share with us? Link please.

Roger Caiazza says:

Your comment is awaiting moderation.

29 Aug 2017 at 8:43 PM

Zebra, Here is a cost estimate for New York State to meet part of Governor Cuomo’s Executive Order reaffirming the state policy to reduce greenhouse gas emissions by forty percent by 2030, and eighty percent by 2050, from 1990 levels, across all emitting activities of the New York economy. The Manhattan Institute recently published “New York’s Clean Energy Programs, The High Cost of Symbolic Environmentalism” by economist Jonathan Lesser that provides cost estimates for some of the programs referenced in the Executive Order.

Here are the key findings: Given existing technology, the CES’s 80 by 50 mandate is unrealistic, unobtainable, and unaffordable. Attempting to meet the mandate could easily cost New York consumers and businesses more than $1 trillion by 2050.

The CES mandate will require electrifying most of New York’s transportation, commercial, and industrial sectors. (In 2014, for example, fossil-fuel energy used for transportation was twice as large as all end-use electricity consumption combined.) Even with enormous gains in energy efficiency, the mandate would require installing at least 100,000 megawatts (MW) of offshore wind generation, or 150,000 MW of onshore wind generation, or 300,000 MW of solar photovoltaic (PV) capacity by 2050. By comparison, in 2015, about 11,300 MW of new solar PV capacity was installed in the entire U.S. Moreover, meeting the CES mandate likely would require installing at least 200,000 MW of battery storage to compensate for wind and solar’s inherent intermittency.

Meeting the CES interim goals—building 2,400 MW of offshore wind capacity and 7,300 MW of solar PV capacity by 2030—could result in New Yorkers paying more than $18 billion in above-market costs for their electricity between now and then. By 2050, the above-market costs associated with meeting those interim goals could increase to $93 billion. It will also require building at least 1,000 miles of new high-voltage transmission facilities to move electricity from upstate wind and solar projects to downstate consumers. No state agency has estimated the environmental and economic costs of this new infrastructure.

For what it is worth I think his estimates don’t include all the costs.

Roger Caiazza says:

Your comment is awaiting moderation.

29 Aug 2017 at 9:03 PM

For the cost effective commenters,

Using your science for the ECS and the New York State Energy Research and Development Authority numbers referenced in the Manhattan Institute report above, what do you think the change in global warming temperature would be? For New York the reduction would be 76.2 MMtCO2e from 2014 levels for the 2030 goal and 170.6 MMtCO2e from 2014 for the 2050 goal. For costs, use just the $18 billion in above-market electricity costs. My question to you: is the money spent on the mitigation reduction that you predict going to have tangible results? However I do that calculation, I don’t see a measurable impact on temperature. If you have a different approach, suitably referenced in the peer-reviewed literature, please show me.

On the other hand spending that money on adapting New York would provide tangible benefits by making the state more resilient to extreme weather. Remember global warming is going to increase the probability of extreme weather and make it more severe. It is not going to prevent the extreme weather we have observed in the past and, in my opinion at least, we are not nearly as resilient to historical weather as we need to be. So my cost effective argument against mitigation is a lot of money spent for little effect might better be spent adapting to the past.
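One rough version of the calculation referenced in my comment uses the transient climate response to cumulative emissions (TCRE). Both the TCRE value and the flat emission-reduction profile below are simplifying assumptions of mine, not outputs of MAGICC or of any New York State analysis.

```python
# Back-of-envelope warming avoided by an annual emission reduction, using the
# TCRE approximation: deltaT ~= TCRE * cumulative CO2 avoided.
TCRE_C_PER_GTCO2 = 0.00045   # ~0.45 C per 1000 GtCO2, an assumed mid-range value

def warming_avoided_c(annual_reduction_mmt: float, years: float) -> float:
    """Avoided warming (deg C) from a constant annual reduction in MMtCO2e."""
    cumulative_gt = annual_reduction_mmt * years / 1000.0   # MMt -> Gt
    return cumulative_gt * TCRE_C_PER_GTCO2

# New York's 2050 goal (170.6 MMtCO2e/yr), held constant for two decades
# as a crude stand-in for a ramped reduction path:
print(f"{warming_avoided_c(170.6, 20):.4f} C")   # ~0.0015 C
```

Whatever the exact sensitivity assumed, the result lands orders of magnitude below anything measurable, which is the point of the comment.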


I really am not sure why my comments were censored, other than that they reveal some inconvenient points. If the moderators do not think that adaptation is a better alternative, or can show that the costs for the 80% goal are reasonable, then why not let the commenters provide those numbers, or better yet disprove them with their own post? New York State has never shown its numbers, so surely someone somewhere can prove the case for them.

How Much for the Paris Climate Agreement

For those of you who are worried that Trump’s decision to pull out of the Paris Climate Agreement is a bad thing I would ask you to consider two pragmatic questions: How much was US participation going to change future temperature and how much would it have cost?

Bjorn Lomborg used the standard MAGICC climate model to determine how much the future temperature would change due to the United States “Nationally Determined Contribution”. He found that the full US promise for the COP21 climate conference in Paris will reduce temperature rise by 0.031°C. Mike Hulme posted an estimate of 0.3°C. For the record, my opinion is that the MAGICC model is too sensitive to CO2, so I think the impact would be even less than the lower bound. In any event, atmospheric temperature is reported to the nearest whole degree Fahrenheit in the US, a resolution of about 0.56°C. In other words, even the upper end of the temperature change expected from our Paris commitment is smaller than the reporting limit, so the change could never be observed.

So how much was it going to cost? If you bothered to listen to the Trump speech on his decision he talked about the Green Climate Fund. If you have any doubts about the decision look up the briefing note by Climate Focus. I found this quote particularly interesting:

“The Paris Decision, serving as guidance for the implementation of the Paris Agreement and pre-2020 action, ‘strongly urges developed country Parties to scale up their level of financial support, with a concrete roadmap to achieve the goal of jointly providing USD 100 billion annually by 2020 for mitigation and adaptation’ (para 115). The Decision furthermore mentions that prior to 2025 the COP shall set a new ‘collective quantified goal from a floor of USD 100 billion per year’ (para 54). The reason both quantitative targets are missing from the actual Agreement is a pragmatic one – in doing so the COP has enabled the US President to adopt the Agreement as ‘sole-executive agreement’ under US law, without the requirement for the US Senate to approve.”

Think about this paragraph. The Paris agreement wanted to start financial support for mitigation and adaptation at $100 billion per year, with plans for going even higher in the future. Clearly all the countries on the receiving end of this largesse are in favor of the agreement, and clearly the United States was expected to provide the largest chunk of that money. It is particularly telling that the agreement was crafted without quantitative targets so it could be adopted as a ‘sole-executive agreement’ under US law, without the requirement for Senate approval. If it had included quantitative targets, US voters would have found out how much money the United States was supposed to dole out when the treaty was debated in the Senate.

In summary we were supposed to pay out billions and billions for an agreement that would not have measurably changed global warming. How is getting out of that a bad thing?