Climate Forecast Lessons from Dorian

Although I am a meteorologist with over 40 years of experience, I have been told that does not qualify me to have an “expert” opinion on the science of climate change.  Nonetheless, I believe my background and experience qualify me to make a few points about the model-based projections of climate change relative to the forecasts for Hurricane Dorian.  Don’t ever forget that model projections are the basis for the “climate crisis” rhetoric that we are bombarded with on a daily basis.

A quick internet search found this very well done forecast for Dorian on August 29, 2019.  Meteorologist Tim Pandajis from WVEC Channel 13 in Norfolk, VA explains the status of the storm on August 29 and the forecast for the next several days, and also covers many of the reasons why the forecast is uncertain.  I particularly liked his explanation because it includes spaghetti plots.  At 8:04 in the video he shows how the different models predict the storm’s track and timing differently.  Of course, as it turned out, Dorian behaved quite differently from any of the forecasts.

Given the constant changes to the forecasts for Dorian, I am sure many recall the old saying that meteorology is the only profession where you can be wrong most of the time and still keep your job.  Reality is much different.  For me there are two things to keep in mind.  On September 1 the storm reached peak intensity but it also stalled.  The forecast intensity for the rest of the storm was only lowered when it became obvious that the storm was weakening.  The reason the intensity went down is that the hurricane sat in one place for so long that it brought cold water up to the surface.  Hurricanes need warm water to maintain or grow in intensity, so the upwelled cold water weakened the storm.  It is interesting that the models did not incorporate that effect, or did not incorporate enough of it.  However, I am confident that the models will be revised to address that in the future.

When I graduated with my MS in meteorology in 1976, three to five-day forecasts were not that good, but they have improved a lot.  I ascribe that improvement in large part to the fact that weather forecasts are constantly being tested.  Whenever there is a poor forecast, the models and the forecasters learn from it and improve their products going forward.  The climate forecasts that predict imminent and inevitable climate catastrophe do not have that advantage.  The National Weather Service defines 30-year averages as a climatic normal.  Using that time period, a climate model forecast should be tested against a 30-year average of weather observations.  Clearly there are many fewer opportunities to test a climate forecast model than a weather forecast model. In addition, my experience with simpler models is that you can get the “right” answer for the wrong reason.  Weather forecast models address this problem through the sheer number of tests.  If developers adjust the model for the wrong reason it may work once, but the error will show up later, so a different adjustment is tried until they get it right.  Climate models that are right for the wrong reason cannot be corrected in our lifetimes.

The final lesson from Dorian is forecasting uncertainty.  As Tim Pandajis showed with spaghetti plots in his presentation, there was enough uncertainty in the August 29 forecasts to make a difference in which hurricane response actions to take.  On the other hand, the climate model projections are portrayed in the media and by advocates as absolutely certain.  None of the caveats provided by the modelers are acknowledged in the hue and cry about a climate emergency.  The reality is that there is a range of modeled projections for future climate and, for the most part, only the most extreme impact results are publicized, and those are the ones that are the basis for the “climate emergency”.

These lessons from Dorian support my belief that climate model forecasts cannot be trusted enough to justify the claim that there is a climate emergency.  I am not alone.  Richard Lindzen commented on climate modeling for greenhouse gas effects:

“Here is the currently popular narrative concerning this system. The climate, a complex multifactor system, can be summarized in just one variable, the globally averaged temperature change, and is primarily controlled by the 1-2% perturbation in the energy budget due to a single variable – carbon dioxide – among many variables of comparable importance.  This is an extraordinary pair of claims based on reasoning that borders on magical thinking.”

My takeaway message from Dorian is this.  Everyone has experience with weather forecast model predictions.  Intuitively, I imagine most people have some suspicions about the validity of any predictions of the climate in 100 years.  This post illustrates reasons why those suspicions are well-founded.  In no way does that mean that the climate is not warming or that greenhouse gas emissions might not have an effect in the future.  However, in my opinion the imminent, inevitable climate catastrophe forecast is a very low probability outcome for this and many other reasons.  If you want to do something to reduce potential climate impacts, then pursue “no regrets” options like energy conservation and energy efficiency, and invest in research to make carbon-dioxide-free energy production cheaper than energy production from fossil sources, which would make conversions a no-regrets solution.  Unfortunately this is not the message from any of the Democratic candidates for President.

One final point relates to the effect of global warming on the storm itself.  I am sure you have heard the stories that Dorian supports the catastrophic concerns.  I don’t have time to address this in detail, but I believe the following references refute the proposition that Dorian is somehow indicative of a global warming crisis.

    • Judith Curry’s “Alarmism enforcement” on hurricanes and global warming argues that there are a few climate scientists whose behavior “is violating the norms of science and in my opinion is unethical”. She also provides links to two papers from the World Meteorological Organization (WMO) Task Team on Tropical Cyclones that do not support the crisis allegation:

Tropical Cyclones and Climate Change Assessment: Part I. Detection and Attribution

Tropical Cyclones and Climate Change Assessment: Part II. Projected Response to Anthropogenic Warming

Connect New York “Climate Change in New York” Panel Discussion

An updated response from the host, received September 5, 2019, follows at the end of this post.

On August 26, 2019 the Public Broadcasting Service station WCNY in Syracuse, NY aired the Connect New York program “Climate Change in New York, a Changing Landscape”.   I stopped listening within the first two minutes because there were three gross mis-characterizations in that time, and that was too much for me to swallow.  This post documents those three mis-characterizations.

Their description of the show states:

“Summer 2019 has been an illustration of climate change in New York – from a record breaking heat wave to flooding along the shores of Lake Ontario. In July, Governor Cuomo signed one of the most aggressive climate bills in the nation. We ask climate experts if the new law will be enough when the International Panel on Climate Change has warned that the world has 11 years left to act.”

In the opening monologue of the show, host Susan Arbetter said: “Summer 2019 has been a graphic illustration of climate change from a record-breaking heat wave in France to flooding along the shores of Lake Ontario.”  After introducing the panel, Ms. Arbetter referenced the UN Intergovernmental Panel on Climate Change, asking Sandra Steingraber why we have to act quickly.  Dr. Steingraber said “Climate change now is a real emergency” and I stopped watching.  I believe that the heat wave and high water only represent extreme weather within the range of natural variability and that there is no climate emergency.   One of my pragmatic environmentalist’s principles is Alberto Brandolini’s Baloney Asymmetry Principle: “The amount of energy necessary to refute BS is an order of magnitude bigger than to produce it.”  Explaining why Lake Ontario flooding is not an illustration of climate change exemplifies that principle.

If climate change were the cause of record Lake Ontario levels and the resulting flooding, then we would expect a trend of increasing lake levels.    That presumption is very easy to check. The US Army Corps of Engineers, Detroit Office provides monthly mean lake-wide average levels for all the Great Lakes.  The Great Lakes water levels 1918 to 2018 figure shows these data for all the lakes.  A quick scan does not reveal any obvious trend for Lake Ontario.  Moreover, there are high lake levels in 1943, 1947, 1951, 1952, 1973, and 1974 as well as high values in 2017 and the record-breaking levels in 2019.

There is another factor to keep in mind relative to the Lake Ontario historical water levels.  When the Moses-Saunders dam on the St. Lawrence River was completed in 1958, it enabled some control of Lake Ontario water levels.  The International Lake Ontario – St. Lawrence River Board implemented Plan 2014 to ensure that releases at the Moses-Saunders Dam comply with the International Joint Commission’s 8 December 2016 Supplementary Order, effective January 2017, entitled Regulation Plan 2014 for the Lake Ontario and the St. Lawrence River Compendium Document.  I will not try to determine whether the dam had any effect on the recent high water levels, but there are those who believe that is the case.

To determine whether there is a trend, I fit a linear regression model and tested whether the trend was statistically significant. I use Statgraphics Centurion software from StatPoint Technologies, Inc. to do my statistical analyses because it provides flexible plotting and regression tools.  Statgraphics enables the user to choose the best relationship from 27 different linear regression equations.  It is also nice because it presents clear summaries for the non-statistician like me.

I found the maximum monthly Lake Ontario water level for each year and plotted those values versus the year.  The Maximum Annual Monthly Lake Ontario Lake Levels 1918 to 2019 figure plots the water levels that have been coordinated with Canada from 1918 to 2018 along with 2019 data through July that I extracted from the monthly reports.  According to the statistical program there is a statistically significant relationship at the 95% confidence level between Lake Ontario Maximum Monthly Level and Year because the P-value in the ANOVA table is less than 0.05.  I have listed the statistics and Statgraphics descriptions in Lake Ontario Annual Maximum Water Level Statistics 1918 to 2019.

At first glance host Susan Arbetter appears to be justified in saying that Lake Ontario water levels are rising in response to anthropogenic climate change.  Based on their backgrounds, I doubt that any members of the expert panel disagreed either. The expert panel consisted of Rachel May, a NYS Senator who was an environmental sustainability educator at SUNY ESF with no science degrees; Sandra Steingraber, a Distinguished Scholar in Residence at Ithaca College where she writes about climate change, ecology, and the links between human health and the environment; Mark Dunlea, founder of the Green Education and Legal Fund, whose web page states that he is a graduate of RPI (Management) and Albany Law School; and Yvonne Chu, a member of Climate Change Awareness and Action who has a BS in Environmental Science from SUNY Plattsburgh.

However, there is an inconvenient fact.  The Intergovernmental Panel on Climate Change claims the effect of anthropogenic greenhouse gas emissions on the climate system “has a 95–100% probability of causing the currently observed and unprecedented warming of the climate since the mid-twentieth century”. As a result, anthropogenic climate change could only have affected water levels after 1950. To test this I separated the Lake Ontario water level data into two sets: before and after 1950.  The Maximum Annual Monthly Lake Ontario Lake Levels 1918 to 1949 figure plots the water levels from 1918 to 1949. According to the statistical program there is a statistically significant relationship at the 95% confidence level between Lake Ontario Maximum Monthly Level and Year over this time period because the P-value in the ANOVA table is less than 0.05.  I have listed the statistics in Lake Ontario Annual Maximum Water Level Statistics 1918 to 1949.

However, as shown in Maximum Annual Monthly Lake Ontario Lake Levels 1950 to 2019, the relationship is much weaker after 1950.  According to the statistical program there is not a statistically significant relationship at the 95% confidence level between Lake Ontario Maximum Monthly Level and Year over this time period because the P-value in the ANOVA table is greater than 0.05.  I have listed the statistics in Lake Ontario Annual Maximum Water Level Statistics 1950 to 2019.
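
For readers who want to reproduce this kind of test without Statgraphics, the following is a minimal sketch in Python using scipy. The lake-level values below are random placeholders, so the Corps of Engineers monthly mean data would have to be substituted to reproduce the actual results.

```python
import numpy as np
from scipy import stats

# Minimal sketch of the trend test described above, using scipy in
# place of Statgraphics. The level values here are random placeholders;
# substitute the US Army Corps of Engineers monthly means before
# drawing any conclusions.
years = np.arange(1918, 2020)
levels = 74.7 + 0.3 * np.random.default_rng(2).random(years.size)

def trend_test(yrs, lvls, label):
    # Ordinary least squares fit of level on year; the p-value tests
    # whether the slope differs from zero at the chosen confidence level.
    result = stats.linregress(yrs, lvls)
    verdict = "significant" if result.pvalue < 0.05 else "not significant"
    print(f"{label}: slope = {result.slope:.4f} m/yr, "
          f"p = {result.pvalue:.3f} ({verdict} at 95%)")

trend_test(years, levels, "full record 1918-2019")
before = years < 1950
trend_test(years[before], levels[before], "1918-1949")
trend_test(years[~before], levels[~before], "1950-2019")
```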

Because there is no statistically significant trend after 1950, the disastrous flooding of 2019 is more likely weather related than indicative of climate change.  I refer you to another of my pragmatic environmentalist principles, the Golden Rule of Climate Extremes.  Dr. Cliff Mass christened this rule: “The more extreme a climate or weather record is, the greater the contribution of natural variability”.  I am confident that were I to do the same kind of analysis for the French heat wave this summer, it would be another example of this golden rule.

If you recall, Ms. Arbetter referenced the UN Intergovernmental Panel on Climate Change when asking Sandra Steingraber why we have to act quickly.  She said “Climate change now is a real emergency”.  Again I refer you to Dr. Cliff Mass, who has explained that climate change is probably not an existential threat.  He believes it is a serious problem and I agree.  Note, however, that over-hyping the reality could very well come back and hurt the cause.

Ms. Arbetter summed up the Lake Ontario flooding as “pitting the status quo against science”.  I have shown that her “science” was fatally flawed.  Her expert panel included only advocates without the technical expertise to differentiate between weather and climate.  Where does that leave the viewers who watched this show?  Eventually the public will catch on that this alleged imminent, inevitable climate emergency that requires costly and sweeping changes to society is not as advertised.

I am heartened that WCNY has not joined the Columbia Journalism Review “Covering Climate Now” effort.  However, this Connect NY program was entirely consistent with the intent of that effort to strengthen the media’s focus on the climate crisis.  According to the Connect NY web page the program offers “insightful discussion, information, and analysis on timely topics that affect residents across the Empire State”.  However, it seems to me the program was not an honest attempt to present both sides of this topic but rather a platform to present opinions of one side of this issue.

Update: I sent a letter to the station with these explanations.  I received the following response on September 5, 2019:

Dear Roger,

I appreciate your email.  The climate program that aired on WCNY in August was the second “Connect: NY” program we have produced on the issue.  The first program aired on February 25th and featured the climate debate from the business perspective.   If you watch both of them, I think you’ll have a fuller appreciation of the range of perspectives we have featured on the air on this issue.

Thank you again for engaging.

warmly,

Susan Arbetter


Five Reasons Why the Catastrophic Anthropogenic Global Warming Story is Wrong

Scott Adams (of Dilbert comic fame) recently did a video about climate persuasion titled “Scott Adams solves the climate debate and saves the world (really)”, available here on Periscope or here on Twitter, that has sparked folks to distill their arguments for and against climate change. After hearing about this, and faced with the constant barrage of media stories about the latest inevitable climate doom, I thought it would be appropriate for me to summarize five reasons why I think that the political agenda to transform the energy system of the world is not supported by sufficient scientific evidence to proceed.

The majority of the claims that anthropogenic carbon dioxide emissions are the cause of the observed warming and that we only have a short time to do something or else are based on projections from global climate models (GCM). As noted below I do have relevant experience, education and background to inform my opinion. However, I believe that anyone who does research based on their personal experience, background and education can reach their own informed opinion of these claims if they actively try to get both sides of the story.

First, a bit of background on these models. If you want details, Dr. Judith Curry did a detailed overview that includes a summary description. For my purposes all you need to know is that these models are a variation upon the meteorological models that provide the predictions everyone uses when making decisions based on weather forecasts. There are differences, but they use the same physical relationships, such as conservation of momentum, heat, and mass.

I have M.S. and B.S. degrees in meteorology, and in my fourth semester of Weather Analysis & Forecasting the laboratory assignment was to break off into teams and write a simple weather forecast model. I have been inside this kind of model, but most readers also have some relevant mathematical background. In particular, you may remember from algebra that if you have three equations you can solve for three unknowns, but if you only have two equations you cannot solve for all three. The problem in meteorological models is that you have more unknowns than equations, so model developers have to improvise. In particular, instead of using direct relationships for every single factor that affects weather or climate forecasts, meteorologists use parameters to simulate the effects of some atmospheric processes.

The first reason that I am skeptical of any GCM results is the use of parameters, which can be thought of as “fudge factors”.   Model developers necessarily have to account for some things that cannot be modeled directly. John von Neumann allegedly summed up the problem stating that “With four parameters I can fit an elephant, and with five I can make him wiggle his trunk”[1]. In other words, he could develop a mathematical model that described an elephant simply by fudging the parameters. Everyone who makes a decision based on a weather forecast has learned that you can trust a forecast for tomorrow better than one several days away. In the 43 years since I graduated, the forecasts have become more reliable for dates further in the future because weather forecasters have constant feedback and have been able to adjust the parameters in the meteorological models to improve forecasts based on observations. There is only one global climate system, and forecasts made today for 100 years away cannot be checked until 100 years have passed. One insurmountable problem is that the parameters and their use in GCMs cannot be verified as correct in our lifetimes.
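
To make von Neumann’s point concrete, here is a toy sketch (my own example, nothing from an actual GCM) showing that adding parameters lets a model match its calibration data exactly without being any more right about data it has not seen:

```python
import numpy as np

# Toy sketch of von Neumann's elephant: with enough free parameters a
# model can match its calibration data exactly, which by itself says
# nothing about whether it is right for the right reasons.
rng = np.random.default_rng(0)

x_train = np.linspace(0.0, 1.0, 5)            # five observations
y_train = x_train + rng.normal(0.0, 0.1, 5)   # truth is linear plus noise
x_test = np.linspace(0.0, 1.0, 50)            # unseen points
y_test = x_test                               # noise-free truth

for degree in (1, 4):                         # 2 vs 5 free parameters
    coeffs = np.polyfit(x_train, y_train, degree)
    fit_err = np.abs(np.polyval(coeffs, x_train) - y_train).max()
    test_err = np.abs(np.polyval(coeffs, x_test) - y_test).max()
    print(f"degree {degree}: max error on fitted points {fit_err:.3f}, "
          f"max error on unseen points {test_err:.3f}")
```

The five-parameter fit passes through every calibration point, yet that says nothing about how it behaves between and beyond them.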

Another issue with the parameters is that the popular narrative focuses on just one variable. Richard Lindzen commented on this:

“Here is the currently popular narrative concerning this system. The climate, a complex multifactor system, can be summarized in just one variable, the globally averaged temperature change, and is primarily controlled by the 1-2% perturbation in the energy budget due to a single variable – carbon dioxide – among many variables of comparable importance.  This is an extraordinary pair of claims based on reasoning that borders on magical thinking.”

My final difficulty with parameters in GCMs is that they are used to model clouds. Dr. Curry explains that in order to solve the physical equations in a global climate model, the world has to be divided up into a three-dimensional grid. The equations are calculated for each grid cell and repeated to generate a forecast. My particular problem is that the grid cells needed to do these calculations are on the order of 100 km horizontally, the vertical height is often 1 km, and the calculations are done every 30 minutes or so. As a result, the models cannot resolve clouds and must parameterize them. That single parameterization is a big enough driver of climate that this model component alone could dominate the GCM projections. This uncertainty is well understood in climate science by those who have worked with these models. However, the problem with parameterization and its ramifications for policy decisions are poorly understood by most of those who advocate eliminating fossil fuel use.
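
To put rough numbers on that (my own back-of-envelope figures, not Dr. Curry’s), consider how coarse such a grid is relative to a cloud:

```python
# Back-of-envelope arithmetic (my own assumed numbers, not from
# Dr. Curry's overview) showing why clouds cannot be resolved at
# typical GCM grid scales.
EARTH_SURFACE_KM2 = 5.1e8     # Earth's total surface area
CELL_KM = 100                 # typical horizontal grid spacing
VERTICAL_LEVELS = 30          # assumed number of vertical model levels

columns = EARTH_SURFACE_KM2 / CELL_KM**2
cells = columns * VERTICAL_LEVELS
print(f"{columns:,.0f} columns, {cells:,.0f} cells")
# ~51,000 columns: an individual cumulus cloud (~1 km across) is far
# smaller than one 100 km x 100 km cell, so cloud effects have to be
# represented by parameters rather than resolved physics.
```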

My second reason for not trusting these models is related to my experience in air pollution meteorology and work I did as a consultant to EPA evaluating the performance of air quality dispersion models. Complexity-wise, those models are orders of magnitude simpler than climate models, and they are simple enough to be directly verified. All air quality dispersion models are based on results from field studies that injected tracers into stack effluents, measured the downwind concentrations in a receptor array under a wide range of meteorological conditions, and then developed coefficients for pollution dispersion. I worked on a project where we compared the known emissions and observed ambient concentrations to model projections at power plants. The results showed that the models in use at the time adequately predicted maximum concentrations, so EPA was comfortable that they were working correctly. The frightening result in my mind is that it was not uncommon for the model to predict a maximum concentration close to the maximum observed value, but the meteorological conditions for the predicted maximum would be different from the meteorological conditions for the observed maximum.
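
For readers unfamiliar with these models, here is a hedged sketch of the basic Gaussian plume form they are built on, with illustrative numbers of my own rather than anything from that EPA project. The dispersion coefficients sigma_y and sigma_z are exactly the kind of empirically derived values those tracer field studies produce.

```python
import math

# Textbook Gaussian plume estimate of ground-level concentration
# downwind of a stack. All numeric values below are illustrative
# assumptions, not data from the EPA evaluation project.
def ground_level_concentration(q, u, sigma_y, sigma_z, h, y=0.0):
    """Concentration (g/m^3) at ground level and crosswind offset y (m),
    for emission rate q (g/s), wind speed u (m/s), stack height h (m).
    Includes the standard ground-reflection term."""
    return (q / (math.pi * u * sigma_y * sigma_z)
            * math.exp(-y**2 / (2.0 * sigma_y**2))
            * math.exp(-h**2 / (2.0 * sigma_z**2)))

# Example: 100 g/s source, 5 m/s wind, sigmas typical of ~1 km downwind.
c = ground_level_concentration(q=100.0, u=5.0, sigma_y=70.0,
                               sigma_z=32.0, h=50.0)
print(f"{c * 1e6:.0f} micrograms per cubic meter")
```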

Consider the differences between the GCMs and air quality models. Air quality models use parameters that are based directly on observations, incorporate known emissions, and have been extensively verified in field studies, yet those models could still get the right answers for the wrong reasons. GCMs use parameters that are based on model developer opinions, have to estimate future emissions, and cannot be tested against the climate system. We are supposed to believe the models are getting the right answers for the right reasons. I don’t think that is a reasonable assumption for modeling that is the basis for converting the entire economy away from fossil fuels.

My third reason for not accepting the commonly held belief that catastrophe is inevitable is that the projections that make that claim are only one answer in a wide range of potential outcomes. Consider, for example, that the Intergovernmental Panel on Climate Change (IPCC) does not give a single value for the sensitivity of atmospheric temperature to carbon dioxide. Instead they give a range of potential warming for a doubling of atmospheric concentrations of carbon dioxide of between 1.5 and 4.5 degrees C. Nic Lewis does a nice job discussing climate sensitivity here.  The GCM projections cover a wide range of potential outcomes from benign to catastrophic. Without going too deep, I want to point out that the damage claims for increased carbon dioxide depend on the shape of the distribution of this sensitivity. The cost estimates that drive claims of societal damage are strongly affected by the probability assigned to extreme damages. The first problematic aspect of this issue is that the projections for high impacts rely on a relatively high probability of extreme impacts. Although recent research has shown that the likelihood of extreme outcomes is lower than previously thought, those results have not been incorporated into the damage estimates. If they were considered, then the costs currently claimed would be reduced.

More problematic to me than the range of possible model outcomes is the use of the worst-case representative concentration pathway as the business-as-usual scenario. The IPCC developed a set of four concentration pathways representing the range of radiative forcing (i.e., greenhouse effect) in the literature for 2100. The range of possible future atmospheric forcing levels runs from relatively low levels of carbon dioxide to the highest forcing they thought possible. The problem is that these concentrations had to be related back to emission scenarios, and the worst-case representative concentration pathway, with a forcing of 8.5 watts per square meter, is so high that the emission scenario necessary to get that level is not credible. Many of the really scary projections that get the headlines and dominate the narrative for why we need to reduce carbon dioxide emissions use RCP 8.5 as business as usual. We already know that that future emission scenario is extremely unlikely. When coupled with models that give a range of outcomes from benign to problematic, I can only conclude that while it is not impossible that there could be a catastrophic impact, the probability is so low that it should not drive policy decisions.
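
A bit of back-of-envelope arithmetic (mine, using the standard simplification that equilibrium warming is the sensitivity scaled by the forcing relative to a CO2 doubling, and ignoring transient ocean lag) shows how the sensitivity range and the scenario choice jointly stretch the outcomes:

```python
# Back-of-envelope arithmetic (mine, not the IPCC's) showing how the
# sensitivity range and the choice of forcing scenario jointly span
# outcomes from modest to extreme. Uses the common simplification
# dT = ECS * F / F_2x, ignoring transient ocean lag.
F_2X = 3.7                      # W/m^2 from a CO2 doubling (commonly cited)
ECS_LOW, ECS_HIGH = 1.5, 4.5    # IPCC likely range, deg C per doubling

for name, forcing in (("RCP4.5", 4.5), ("RCP8.5", 8.5)):
    low = ECS_LOW * forcing / F_2X
    high = ECS_HIGH * forcing / F_2X
    print(f"{name}: equilibrium warming roughly {low:.1f} to {high:.1f} deg C")
```

Pairing a low sensitivity with a moderate pathway gives a benign outcome, while the catastrophic headlines come from pairing the top of the sensitivity range with RCP8.5.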

My fourth reason for not trusting the models that claim carbon dioxide is the primary driver of the recently observed warming is that even for the limited model results that can be compared to the climate system, they don’t do well. An article quoting Dr. Curry explains that inconsistency well:

Between 1910 and 1940, the planet warmed during a climatic episode that resembles our own, down to the degree. The warming can’t be blamed on industry, she argues, because back then, most of the carbon-dioxide emissions from burning fossil fuels were small. In fact, Curry says, “almost half of the warming observed in the twentieth century came about in the first half of the century, before carbon-dioxide emissions became large.” Natural factors thus had to be the cause. None of the climate models used by scientists now working for the United Nations can explain this older trend. Nor can these models explain why the climate suddenly cooled between 1950 and 1970, giving rise to widespread warnings about the onset of a new ice age.

The final reason that I believe that the political agenda to transform the energy system of the world is not supported by sufficient evidence to be credible is the suggestion that climate change is easy to solve because renewables are a viable solution. As convinced as I am that the climate science does not support this agenda, I believe the suggestion that wind and solar can solve our energy problems is even more of an exaggeration. In fact, that issue is the primary reason I blog and have written so much about New York’s climate change plans. However, don’t listen to me; listen to Bill Gates, who states: “The idea that we have the current tools and it’s just because these utility people are evil people and if we could just beat on them and put (solar panels) on our rooftop—that is more of a block than climate denial,” Gates said. “The ‘climate is easy to solve’ group is our biggest problem.” Another problem is that the renewable “solution” very likely has very significant environmental impacts that are generally ignored. Michael Shellenberger gave a TED talk, “Why renewables can’t save the planet”, that addresses this issue.

To sum up: the rationale used to justify the need to convert the energy system of the world is that carbon dioxide will cause inevitable catastrophe and that we can be saved if only we implement renewable wind and solar, which will be easy to do. I don’t believe the science supports inevitable catastrophe because those projections are based on global climate models. Those models use too many fudge factors that can give too many results that can never be tested; much simpler models that are based entirely on observations can still give the right answer for the wrong reason; and the model results to date do not adequately predict the one climate experiment we can test. Most of the catastrophic outcomes that dominate the political and media narrative depend on an emissions scenario that is not credible. I do not believe that diffuse and intermittent solar and wind can be used to replace reliable and affordable electric power, much less generate enough energy to convert transportation, heating, and industrial use of fossil fuel to electricity.

[1] Attributed to von Neumann by Enrico Fermi, as quoted by Freeman Dyson in “A meeting with Enrico Fermi” in Nature 427 (22 January 2004) p. 297


Recommended Read: Global Warming for the Two Cultures

I have the education, background and experience to independently evaluate the constant drum beat claiming imminent and inevitable climate catastrophe if we don’t immediately reduce our carbon footprint. I am a luke-warmer who believes that the sensitivity of climate to anthropogenic carbon dioxide emissions is at the bottom of the Intergovernmental Panel on Climate Change range. At that level, climate catastrophe is a very unlikely possibility and the effect is much more likely to be benign.

Unfortunately it is very frustrating to hold my position because the media, politicians and advocacy groups have convinced many that we have to use renewables as a “solution” to what I think is a non-existent problem. As a result I am always looking for a good summary of the issues that I have with the imminent climate catastrophe narrative. The 2018 Global Warming Policy Foundation Annual Lecture: “Global Warming for the Two Cultures” by Dr. Richard Lindzen is an excellent summary that I recommend to those who believe that we need to transform the energy system to do “something” about climate change so that they will have at least heard the other side of the story.

Lindzen begins his talk by describing two cultures in society and the implications of that split for policy decisions. Basically the two cultures are those that understand the “science” in general and physics in particular and those that don’t. He explains why this understanding gap is a problem:

While some might maintain that ignorance of physics does not impact political ability, it most certainly impacts the ability of non-scientific politicians to deal with nominally science-based issues. The gap in understanding is also an invitation to malicious exploitation. Given the democratic necessity for non-scientists to take positions on scientific problems, belief and faith inevitably replace understanding, though trivially oversimplified false narratives serve to reassure the non-scientists that they are not totally without scientific ‘understanding.’ The issue of global warming offers numerous examples of all of this.

One of my problems with the media climate change story is that it treats the greenhouse effect as simple. His lecture describes the complicated climate system in enough detail to support my contention that the imminent, inevitable climate catastrophe story is an exaggeration.

I particularly like his description of the popular narrative we hear from the media and politicians:

Now here is the currently popular narrative concerning this system. The climate, a complex multifactor system, can be summarized in just one variable, the globally averaged temperature change, and is primarily controlled by the 1-2% perturbation in the energy budget due to a single variable – carbon dioxide – among many variables of comparable importance.

This is an extraordinary pair of claims based on reasoning that borders on magical thinking. It is, however, the narrative that has been widely accepted, even among many sceptics.

He then goes on to describe how he believes the popular narrative originated and debunks the evidence that we are constantly reminded supports the catastrophic narrative.

I encourage you to read the entire lecture. I believe it supports his concluding summary of the situation:

An implausible conjecture backed by false evidence and repeated incessantly has become politically correct ‘knowledge,’ and is used to promote the overturn of industrial civilization.

Temperature Related Deaths

Environmental advocates claim heat results in more deaths than any other weather-related event, but I recently read a conflicting claim about weather-related deaths. The New York City Environmental Justice Alliance released a new report, NYC Climate Justice Agenda 2018 – Midway to 2030: Building Resiliency and Equity for a Just Transition, that claims “Extreme heat results in more deaths than any other weather-related event”. On the other hand, a study in The Lancet, “Mortality risk attributable to high and low ambient temperature: a multicountry observational study”, notes that “most of the temperature-related mortality burden was attributable to the contribution of cold”. I did some research and now I think I know what is behind these two differing claims.

The NYC Climate Justice Agenda bases its claim that extreme heat causes more deaths than cold on an EPA reference. The EPA extreme heat webpage uses data from the National Oceanic and Atmospheric Administration Natural Hazard Statistics: Weather Fatalities website. The home page for that site lists 508 fatalities from all weather events in 2017, including 107 from extreme heat, 26 from extreme cold, 10 from winter storms, 1 from ice, and 3 from avalanches. Those data show that more people died from extreme heat than from any other cause, narrowly beating out flash floods, and that more people die from heat than from cold-related events. The data for the website are compiled from information in the National Weather Service (NWS) storm events database.

The Global Warming Policy Foundation April 9, 2018 newsletter reported that 48,000 Britons died this winter due to cold weather.  Those numbers are obviously far different from the NWS data.  The Lancet paper by Gasparrini et al. notes that:

Although consensus exists among researchers that both extremely cold and extremely hot temperatures affect health, their relative importance is a matter of current debate and other details of the association remain unexplored. For example, little is known about the optimum temperatures that correspond to minimum effects for various health outcomes. Furthermore, most research has focused on extreme events and no studies have comparatively assessed the contribution of moderately high and low temperatures. The underlying physiopathological mechanisms that link exposure to non-optimum temperature and mortality risk have not been completely elucidated. Heat stroke on hot days and hypothermia on cold days only account for small proportions of excess deaths. High and low temperatures have been associated with increased risk for a wide range of cardiovascular, respiratory, and other causes, suggesting the existence of multiple biological pathways.

I believe that the reason for the difference in the two conclusions is explained by this statement by Gasparrini et al.: “The dose-response association, which is inherently non-linear, is also characterised by different lag periods for heat and cold—i.e., excess risk caused by heat is typically immediate and occurs within a few days, while the effects of cold have been reported to last up to 3 or 4 weeks.”
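
A toy calculation (my own illustrative numbers, not Gasparrini et al.’s method) shows how a short attribution window produces exactly this kind of split:

```python
import numpy as np

# Toy illustration of why a short attribution window undercounts cold
# deaths: if heat deaths occur within a few days but cold deaths spread
# over weeks, a tally closed a few days after the event catches nearly
# all heat deaths and only a small fraction of cold deaths.
rng = np.random.default_rng(1)

WINDOW_DAYS = 3      # assumed days after the event a report stays open
HEAT_LAG_MAX = 3     # heat effects are roughly immediate (per the paper)
COLD_LAG_MAX = 21    # cold effects last up to ~3 weeks (per the paper)

def fraction_counted(lag_max, n=100_000):
    # Assume death lags spread uniformly from day 0 to lag_max.
    lags = rng.uniform(0.0, lag_max, n)
    return np.mean(lags <= WINDOW_DAYS)

print(f"heat deaths counted: {fraction_counted(HEAT_LAG_MAX):.0%}")
print(f"cold deaths counted: {fraction_counted(COLD_LAG_MAX):.0%}")
```

Under these assumptions essentially all heat deaths but only about one in seven cold deaths land inside the window, so the same underlying mortality looks heat-dominated in the tally.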

According to the NWS instructions for storm data preparation, the storm data report documents:

  • The occurrence of storms and other significant weather phenomena having sufficient intensity to cause loss of life, injuries, significant property damage, and/or disruption to commerce;
  • Rare, unusual, weather phenomena that generate media attention, such as snow flurries in South Florida or the San Diego coastal area; and
  • Other significant meteorological events, such as record maximum or minimum temperatures or precipitation that occur in connection with another event.

The key point is that the storm data report makes a distinction between direct and indirect deaths.  Only direct deaths are tabulated when a local weather office prepares the storm report. For example, in winter storms, deaths from heart attacks while shoveling snow are indirect.  If a person wanders outside and freezes to death, that is a direct death. Furthermore, while indirect deaths are included in the storm narratives, the numbers are not included in the tabulated data, and storm reports are prepared within days of the event, so any indirect deaths due to excessive cold that occur weeks later would not be included. Details on the difference between direct and indirect deaths are found in the instruction document on pages 9 to 12.

In their study, Gasparrini et al. found that temperature is responsible for 7.7% of mortality. Cold was responsible for “most of the burden”. Although in the study over 90% was attributed to cold, the paper noted that “This difference was mainly caused by the high minimum-mortality percentile, with most of the mean daily temperatures being lower than the optimum value”. I interpret that to mean that some of the difference was due to their classification methodology. In line with the indirect death distinction, it is notable that over 85% of the mortality attributable to temperature was related to moderate cold. Offhand, I think there must be more causes of death associated with freezing weather than hot weather. For example, auto accidents on icy roads have to cause more deaths than any hot weather impact on travel.

In conclusion, there is a database that does show that extreme heat results in more deaths than any other weather-related event. However, that database only includes direct deaths. An epidemiological study that does include indirect deaths concludes the majority of temperature-related deaths are associated with moderate cold weather.

Relative to climate change policy, the distinction between heat and cold is important. If the argument is that we must mitigate human impacts on climate to reduce temperature-related mortality, then because a warming climate will result in less moderate cold, warming will actually have a beneficial effect. An unintended consequence of climate change mitigation through the implementation of renewable energy is the universal increase in cost. Given the impacts on indirect deaths, I believe that increased heating costs will adversely affect mortality if low-income people cannot afford to keep their homes warm enough to prevent the potential health impacts of cold weather. Finally, the fact that climate is a reason many more people move to Phoenix, AZ than to the “ice box of the nation”, International Falls, MN, suggests we are better able to adapt to warm than to cold.

Climate change soon to be main cause of heat waves in West, Great Lakes

A recent study entitled Early emergence of anthropogenically forced heat waves in the western United States and Great Lakes was publicized in the Syracuse, New York Post-Standard under the headline Upstate NY among first to have most heat waves due to climate change. Unfortunately, as Blair King writes, “it actually represents a quite excellent example of how science is misrepresented to the public in the climate change debate.”

According to a press release: “Lopez and colleagues used climate models along with historical climate data to project future heat wave patterns. They based their findings on the projection for greenhouse gas emissions this century, known as the RCP8.5 scenario. This assumes high population with modest rates of technological change and energy conservation improvements and is often called the “business as usual” scenario. Lopez said he based the research on this climate scenario because historical greenhouse gas emissions have to date aligned with this projection.”

My concern, and that of Blair King, is the use of the RCP8.5 scenario. This is a representative concentration pathway that represents a forcing of 8.5 watts per square meter and is used by climate modelers to represent the worst-case atmospheric effect of greenhouse gases by 2100. Essentially, this emissions scenario was developed to produce that forcing level.

Larry Kummer looked at the scenario in detail. He notes that “It assumes the fastest population growth (a doubling of Earth’s population to 12 billion), the lowest rate of technology development, slow GDP growth, a massive increase in world poverty, plus high energy use and emissions.” His post explains that RCP8.5 assumes population growth at the high end of the current UN forecasts, assumes that the centuries-long progress of technology will slow, and assumes no decarbonization of world power sources from new technology (e.g., solar, wind, fission, fusion) or regulations to reduce not just climate change but also air pollution and toxic waste.

Blair King explains that RCP8.5 has a storyline that describes the assumptions of the scenario in easy-to-understand language. He goes on to explain that the RCP8.5 scenario dates back to 2007 and is characterized by the following:

  • Lower trade flows, relatively slow capital stock turnover, and slower technological change;
  • Less international cooperation than the A1 or B1 worlds. People, ideas, and capital are less mobile so that technology diffuses more slowly than in the other scenario families;
  • International disparities in productivity, and hence income per capita, are largely maintained or increased in absolute terms;
  • Development of renewable energy technologies are delayed and are not shared widely between trade blocs;
  • Delayed land use improvements for agriculture resulting in increased pollution and increased negative land use emissions until very late in the scenario (close to 2100);
  • A rebound in human population demographics resulting in human population of 15 billion in 2100; and
  • A 10-fold increase in the use of coal as a power source and a move away from natural gas as an energy source.

Consider those assumptions against what has actually happened since 2007. I am not sure about the status of international disparities in productivity and land use improvements. However, I believe none of the other assumptions are being followed. Global trade is at all-time highs, renewable technology is freely traded, renewable technology continues to mature and develop, and there is no sign of human population growth accelerating to reach 15 billion. Most importantly, this scenario pre-dates the fracking revolution that has flipped the use of coal and natural gas in the United States by making natural gas so cheap and plentiful. There is no reason to believe that the technology won’t expand elsewhere and markedly reduce any potential increase in the use of coal as a power source.

Lopez states that he based the research on this climate scenario “because historical greenhouse gas emissions have to date aligned with this projection.” He is either ignorant of the substantial change in greenhouse gas emissions observed in the United States or willfully ignored those numbers, misrepresenting the science to the public.

Smithsonian Capture the Sun Harness the Wind

I am so tired of the Smithsonian’s unquestioning devotion to renewable energy in spite of obvious warning signs that I wrote a letter to the editor. In the April 2018 Smithsonian there is an article entitled “The Future’s so Bright (He’s Gotta Wear Shades)” by Dan Solomon and a related graphic article “Capture the Sun, Harness the Wind” by 5W Infographics. These articles are essentially puff publicity pieces for the renewable energy industry that clearly show the Smithsonian’s bias on renewable energy. Nonetheless, they cannot escape inconvenient facts.

The most obvious problem with Solomon’s article on the bright future of renewable energy in Georgetown, Texas is that renewable energy looks great for the early adopters, but the reality of a 100% reliable electric system lies beneath that success. The situation is exactly the same as a pyramid scheme, where the first ones in reap the benefits. When Solomon’s article notes that “about 2% of the time the Georgetown utility draws electricity derived from fossil fuels”, an unbiased article would have followed up on the implications of that. The primary support for the fossil fuels necessary to keep Georgetown’s lights on comes from everybody else. As more rent-seekers pile onto the renewable energy bandwagon pyramid, the costs necessarily increase for those on the outside. As you dig deeper it becomes apparent that not only does price support for the rest of the electric system become more likely, but because solar and wind don’t support grid services, it becomes increasingly likely that another layer of support has to be added at some point above 30% renewable penetration. At this time Georgetown is not paying for that.

The first graph in the graphic article shows “The comparison to coal”, which charts the 2016 actual electricity generation from coal and renewable sources and projects changes in their use out to 2050. Comparing the 2016 coal generation of 1,216 billion kWh and the renewable source estimate of 573 billion kWh with EIA numbers shows that those numbers are close enough not to quibble. However, the title of the article refers to sun and wind, and the electricity generation in those categories is lumped together with hydro, biomass, geothermal, and other gases. As far as I can tell, solar and wind account for less than half of the 573 billion kWh number. On the other hand, most of the future renewable growth will occur in the wind and solar sectors, but the graphic does not provide that information. Neither article mentions just how much wind and solar generation will be needed to meet the projected 2050 number.

Another graphic notes that 800 MW of energy storage were built in the United States in the last five years and expects that amount to be built in 2020 alone. The important number is how many MWh will be available from the energy storage built, because that defines how much storage will be available to counteract renewables’ intermittency. Solomon’s article also did not address how much storage would be needed for Georgetown to get off the grid. Neglecting to point out that, because intermittent renewables struggle to generate power over a third of the time, we will likely have to over-build renewable capacity and add massive amounts of energy storage biases the renewable argument.
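
To see why the MW number alone is not informative, here is a small illustration with assumed numbers of my own; only the 800 MW figure comes from the graphic:

```python
# Illustration (with my own assumed numbers) of why a MW figure alone
# says little: storage is rated separately by power (MW) and energy (MWh).
STORAGE_POWER_MW = 800      # capacity cited in the graphic
ASSUMED_DURATION_H = 4      # assumed battery duration, not in the article
CITY_LOAD_MW = 150          # hypothetical average load for a small city

energy_mwh = STORAGE_POWER_MW * ASSUMED_DURATION_H
hours_served = energy_mwh / CITY_LOAD_MW
print(f"{energy_mwh} MWh would serve a {CITY_LOAD_MW} MW load "
      f"for about {hours_served:.0f} hours")
# Hours of coverage, not the multi-day cloudy or calm stretches that
# intermittent renewables have to ride through.
```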

One of the inconvenient facts illustrated but not noted in the graphic article is jobs per unit of energy produced. If you divide the total coal generation by the number of coal-industry employees in 2016, you get 24.3 million kWh produced per employee. If you divide half of the reported renewable sources generation by the sum of the solar and wind employees in 2016, you get 0.8 million kWh produced per employee. Coal is 30 times more man-power efficient. While that may be good for employment, it does not portend well for cost.
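
The arithmetic is simple enough to show directly. The generation figures are from the graphic; the employee counts below are my assumptions, back-derived to roughly match the ratios stated above:

```python
# Jobs-per-energy arithmetic from the paragraph above. Generation
# figures are from the graphic; the employee counts are assumptions,
# back-derived to roughly match the article's stated ratios.
COAL_GENERATION_KWH = 1_216e9          # 2016 coal generation
WIND_SOLAR_GENERATION_KWH = 573e9 / 2  # roughly half of renewable output
COAL_EMPLOYEES = 50_000                # assumed
WIND_SOLAR_EMPLOYEES = 355_000         # assumed

coal_per_emp = COAL_GENERATION_KWH / COAL_EMPLOYEES
ws_per_emp = WIND_SOLAR_GENERATION_KWH / WIND_SOLAR_EMPLOYEES
print(f"coal: {coal_per_emp / 1e6:.1f} million kWh per employee")
print(f"wind + solar: {ws_per_emp / 1e6:.1f} million kWh per employee")
print(f"ratio: {coal_per_emp / ws_per_emp:.0f}x")
```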

Other than the fact that the duck curve is graphically interesting, I am not sure why it was included in the graphics article. More importantly, it illustrates a problem. When you have large amounts of solar on the system, something has to be available to make up for the evening demand. That is where storage becomes necessary. In order to keep the lights on you also need enough storage to cover those days when there isn’t any sun. Dale Ross’s flippant “we are in West Texas, so cloudy, really?” comment aside, a quick check of the climatological data indicates that it is mostly cloudy 28% of the time in Georgetown. Obviously, despite the claim, Georgetown is not powered entirely by renewable energy.

The Solomon article has multiple instances of conveniently neglected facts. It notes that the City was able to get guaranteed rates for 20 years for wind power and 25 years for solar power. It would have been appropriate to note that these types of facilities have very little operational experience over periods that long, so the guarantees might not be as iron-clad as implied. Solomon quotes Adam Schultz as saying that solar and wind have gotten so much cheaper that “I can’t even tell you the costs because costs have been dropping so rapidly”. If that is the case, then why do both solar and wind need direct subsidies? Finally, blowing off the environmental impact of renewables on birds by saying that more birds are killed by cats and buildings reminds me of the “two wrongs don’t make a right” parable. Furthermore, what about the bats, and just how many raptors are killed by house cats? The fact is that because renewable energy is diffuse, wildlife issues are a legitimate concern.

Those are the superficial errors illustrating biases. The reality is that because wind and solar are diffuse, the electric grid is essential for keeping the lights on. Digging down into this problem is more complicated but necessary for the complete, unbiased story of renewables. I recommend this post by Planning Engineer at the Climate Etc. blog for an overview of the transmission planning difficulties raised by wind and solar energy. In brief, the modern power grid is a connected, complex machine that balances various electrical and mechanical properties over thousands of miles. The system must be stable, that is to say it must stay in synchronism, balance loads and generation, and maintain voltages following system disturbances. The grid is built upon heavy rotating machinery at hydro, nuclear, and fossil-fired generating stations that provides that stability. Wind and solar power do not provide any stability support. Up to some point the present-day grid is resilient enough to overcome that problem, but at some point it has to be addressed. I don’t doubt that it can be addressed successfully, but the costs necessary to do that are unknown and were certainly not a consideration in either article.

The reality of solar and wind power not addressed in this article is that it is likely to completely supplant fossil fuels only in limited locations where both solar and wind potential are high, industrial load requirements are negligible, and the weather is mild in the winter, because both resources are intermittent and diffuse. Texas has large wind and solar resources because of geography, and because it is so large there is enough space to generate significant amounts. Georgetown, TX does not have heavy industry that requires high amounts of electricity constantly, so it can pretend to be powered entirely by renewable energy. Finally, Georgetown does not have to contend with the winter impacts of higher latitudes, particularly home heating. The solar resource is much reduced simply because the days are shorter, but you must also consider reductions due to snow-covered rooftop solar cells.