Frustrations of a Meteorologist in Today’s Times

An article came to my attention today that epitomizes my frustration with the assumption that every extreme weather event is associated with climate change.  I have been meaning to vent on this issue, so here I go.

I have two degrees in meteorology, am a retired certified consulting meteorologist accredited by the American Meteorological Society, and have over 40 years’ experience as a practicing meteorologist.  The opinions expressed in this post do not reflect the position of any of my previous employers or any other company I have been associated with; these comments are mine alone.

The article that piqued my interest was titled: “Con Edison to install 17 weather stations across New York; largest tower slated for Staten Island”.   The quote that wound me up was the following:

“Climate change makes smart infrastructure planning and design essential,” said Charles Viemeister, Con Edison’s project manager. “We’ll use data from the Micronet to gain additional insight into the local short-term and longer-term impacts of climate change. We are always looking for technologies that can help us maintain the resilient, reliable service our customers need.”

My first issue is the implication, in this quote and elsewhere in the article, that the primary value of these meteorological stations has something to do with climate change when in reality their value lies in evaluating weather events.  Weather is not climate!  One way to think of it is: climate is what you expect, weather is what you get.

The reality is that the 17 weather stations added to the 126 stations in the NYS Mesonet system, and the data they provide to the public, will be used to address the weather we get today.  They will strengthen the ability of meteorologists to provide real-time analyses and short-term forecasts of extreme weather events that can cause power outages.  Con Edison will be able to respond better with this finer-resolution information.  This is a good thing and I applaud the project.

On the other hand, these data are not suitable for climate trend analyses to determine what we can expect.  In order to assess climatic trends, the meteorological data collected must come from a representative location.  By that I mean a location unaffected by anything local that could change the trend of the temperature, wind, or precipitation measurements.  Frankly, that is always difficult, and in New York City it is nearly impossible to do well enough to tease out the climate signal. For example, an ideal location for measuring temperature trends would be a field surrounded by at least 100 feet of mown grass.  As long as the grass does not become overgrown with shrubs and trees, planted with different crops or, worst of all, paved over for a parking lot, then changes in the measured temperatures over time reflect a climate signal.  In the city, of course, keeping everything that can affect the measurements constant is much more difficult.

This story picks at a scab of mine: the constant conflation of any extreme weather event with climate change.  In the headlines this week are the wildfires in California and Oregon.  California Governor Newsom vows to face climate change head on in fighting the wildfires.  CNN claims that the warming climate is going to make things worse.  Of course, in this politically charged year, others claim climate change is not the primary factor and argue for other causes.  As a meteorologist I can only argue with any kind of authority about the climate data.  The satellite observations show a decreasing trend in global wildfires, and the data show high temperatures in the past too.  Ultimately, wildfires have always been a problem in California.  Finally, another meteorologist looked at what caused the fires in Oregon and Washington and concluded that climate change was not a factor.  I expect he would have reached the same conclusion had he looked at the California situation.  In my experience, every time (here, here, and here for example) I have looked at a weather event that is claimed to be related to climate change, I have been unable to find any real evidence supporting the claim and plenty of evidence to argue otherwise.

The constant refrain that every extreme weather event is “proof” that climate change is happening now bothers me because the claims are used to justify the need to change the energy system.  In fact, were it not for the claimed climate emergency, would we really need to change the energy system? Worse, the transition to a green economy diverts resources better spent adapting and strengthening infrastructure against the extreme weather observed in the past. For example, if a storm exactly like Tropical Storm Sandy were to occur again, would we be able to weather it with minimal impacts?  If not, then we are doing something wrong.

Let Experience be your Guide to Climate Science

In this post I explain why I think your direct experience should guide your opinion on global warming climate science.  You may not be a climate scientist, but your personal experiences enable you to judge the certainty of the climate claims you commonly hear.

Update November 1, 2019: Added a link at the end to a post about the reliability of extended forecasts

Greenhouse Effect

The reason we hear that there is an inevitable, imminent climate emergency is the greenhouse effect.  But how do we observe it in the atmosphere?  All things being equal, if you know whether it is warmer or colder in the morning after a clear night, then you understand the impact of the greenhouse effect.  Of course, the answer is that it is colder after a clear night.  Simply put, when something, in this case clouds, reduces the amount of heat loss (longwave radiation) from the surface and atmosphere, the temperature does not fall as much overnight, so it is colder after a clear night than after a cloudy night.

There are a couple of ramifications of what you already know about this greenhouse effect.  On clear nights cooling can occur at about 3.4 deg F per hour, while on an overcast night cooling is only about 0.5 deg F per hour.  The global average temperature was on the order of 2.5 deg F warmer in 2017 than in 1850.  If all the warming since 1850 were due to greenhouse gases, then that warming amounts to less than one hour of the difference between a cloudy night and a clear night.  Clouds therefore have a much stronger effect on temperature than greenhouse gases.   The other point is that the greenhouse effect is stronger at night than during the day, so nights are warming faster than days.  Keep this in mind when you hear that climate change is going to cause much hotter daytime temperatures.  The reality is that the average is going up more because the minimum temperature is rising rather than because the maximum temperature is rising.
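
For those who like to see the arithmetic, here is a minimal sketch of that comparison using the round numbers quoted above:

```python
# Back-of-the-envelope comparison of the cloud effect versus observed warming,
# using the cooling rates and warming figure quoted above.
clear_cooling = 3.4      # deg F lost per hour on a clear night
cloudy_cooling = 0.5     # deg F lost per hour on an overcast night
warming_1850_2017 = 2.5  # deg F rise in global average temperature, 1850-2017

# Hourly temperature difference attributable to cloud cover alone
cloud_effect_per_hour = clear_cooling - cloudy_cooling  # 2.9 deg F per hour

# How many hours of that clear/cloudy difference equal 167 years of warming?
hours_equivalent = warming_1850_2017 / cloud_effect_per_hour
print(f"All warming since 1850 = {hours_equivalent:.2f} hours of cloud effect")
# -> about 0.86 hours, i.e., less than one hour of cloudy versus clear cooling
```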

Forecast Skill

Predictions of a climate emergency are based on climate prediction models.  Remember, weather is what we feel over short periods and climate is how the atmosphere behaves over longer periods of time, i.e., decades.  Observant weather-wise people understand the uncertainty of forecasts for different time periods.  Obviously, a 24-hour forecast is more reliable than a seven-day forecast.  You know that longer-term weather forecasts are not as reliable because you have observed that yourself.  The physical relationships used for forecasting weather and climate are the same.  There are differences in the details, but the inescapable conclusion is that climate forecasts for one hundred years from now are much less reliable than weather forecasts.
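
To see why reliability decays with lead time, consider a toy example.  The logistic map below is a standard stand-in for a chaotic system like the atmosphere, not a weather model; two runs started from almost identical states diverge until the “forecast” carries no information:

```python
# Two runs of a chaotic system (the logistic map) from nearly identical starts.
# Tiny initial-condition errors grow with each step, which is the basic reason
# a seven-day forecast is less reliable than a 24-hour forecast.
r = 3.9                       # map parameter in the chaotic regime
x, y = 0.500000, 0.500001     # initial states differing by one part in a million

for step in range(1, 31):
    x = r * x * (1 - x)
    y = r * y * (1 - y)
    if step % 5 == 0:
        print(f"step {step:2d}: difference = {abs(x - y):.6f}")
# The difference grows from ~1e-6 to order 1 within a few dozen steps.
```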

Although people like to say that weather forecasting is the only profession that lets you be wrong much of the time and still keep your job, the reality is that weather forecasts have improved markedly over time.  When I graduated in 1976 with a master of science degree in meteorology, three-to-five-day forecasts were much less accurate than they are today.  In no small part that is because weather forecasters are constantly verifying their predictions against observations.  If a forecast is radically wrong, then the data are re-evaluated and the modeling parameters are reviewed.  Testing a new model variation against the data from the period when the old model failed, and then implementing the revised model, is a constant process.  Obviously, a 100-year climate forecast cannot be tested the same way.  Climate models simply cannot be improved much because they cannot be tested frequently enough.
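
To make that verification loop concrete, here is a minimal sketch of the kind of scoring forecasters run routinely; the forecast and observation values are invented for illustration:

```python
import math

# Hypothetical 24-hour temperature forecasts and verifying observations (deg F)
forecasts    = [72, 68, 75, 80, 77, 65, 70]
observations = [70, 69, 73, 82, 76, 66, 71]

# Root-mean-square error: a standard first-look verification score
rmse = math.sqrt(sum((f - o) ** 2 for f, o in zip(forecasts, observations))
                 / len(forecasts))
print(f"RMSE = {rmse:.2f} deg F")
# When scores like this drift upward, the data are re-evaluated, model
# parameters are reviewed, and the revised model is re-scored on the same cases.
```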

Clouds

Another aspect of forecasting that observant folks understand is the effect of clouds on forecast reliability and usefulness.  Consider the uncertainty when the forecast is for scattered showers.  You know that you may get rain or just as likely may not, and if your outdoor activity depends on dry weather that means a lot.  For numerous reasons it is not possible under many conditions to predict exactly when and where a shower may pop up.  The primary reason is that cloud formation is a process that takes place over a small spatial scale, yards instead of miles.  Weather forecast models can incorporate the factors that cause clouds and precipitation into the predictions, but not the small-scale factors that determine them at a specific location and time.  Residents of Upstate New York are very familiar with the forecast that lake-effect snow is going to occur “north of the Thruway”.  Even though forecasters run finer-scale models limited to areas immediately adjacent to the Great Lakes, they can still only predict that somewhere in that area there will be a snow band, but not exactly where.

Clouds have very serious implications for climate forecasting models.  Because climate models have to cover the entire globe, none of the physical processes that create clouds can be incorporated directly into the models.  Instead the models simulate clouds with parameterizations which, to be kind, are simply the expert opinion of the model developer.  Don’t believe me?  Here is what Nakamura Mototaka says in Confessions of a climate scientist:

“Clouds are represented with parametric methods in climate models. Are those methods reasonably accurate? No. If one seriously studies properties of clouds and processes involved in cloud formation and dissipation, and compare them with the cloud treatment in climate models, one would most likely be flabbergasted by the perfunctory treatment of clouds in the models. The parametric representations of clouds are ad hoc and are tuned to produce the average cloud cover that somewhat resembles that seen in the current climate. Can we, or should we, expect them to simulate the cloud coverage and properties in the “doubled atmospheric carbon dioxide” scenario with reasonable accuracy? No.”
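
To illustrate what “parametric methods” look like in practice, here is a minimal sketch of one classic diagnostic scheme, a Sundqvist-style cloud fraction computed from grid-cell relative humidity; the critical humidity value is exactly the kind of tunable knob Nakamura criticizes:

```python
import math

def cloud_fraction(rh: float, rh_crit: float = 0.8) -> float:
    """Diagnostic cloud fraction from grid-cell mean relative humidity
    (Sundqvist-style scheme; rh_crit is a tunable parameter)."""
    if rh <= rh_crit:
        return 0.0
    # Fraction rises from 0 at the critical humidity to 1 at saturation
    return 1.0 - math.sqrt((1.0 - rh) / (1.0 - rh_crit))

for rh in (0.70, 0.85, 0.95, 1.00):
    print(f"RH = {rh:.2f} -> cloud fraction = {cloud_fraction(rh):.2f}")
# One number stands in for all the clouds in a ~100 km grid cell; no individual
# cloud is ever simulated, and rh_crit is chosen ("tuned") by the developer.
```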

Implications

I have described three aspects of global warming climate science that observant folks basically understand based on their personal experience.  We know that clouds cause great differences in temperatures.  Clearly, weather forecast models that can be tested are more reliable than climate prediction models that cannot be tested over the relevant forecast period.  Even though weather forecast models have improved, we know that they still don’t do as well as we would like with clouds and precipitation.

All of this leads to the implications of the fact that the climate models do not do a credible job with clouds.  We all know that clouds have a big effect on the temperatures we observe.  If climate models that cannot be tested do not simulate clouds correctly, why should we have much faith in their projections of an inevitable, imminent climate emergency?

I believe we should consider the results of climate models the same way we treat a forecast for a slight chance of scattered showers.  Based on our experiences we know that there is a range of potential outcomes for that forecast.  Clearly, those who claim that there is an inevitable, imminent climate catastrophe are stretching credibility.  While nothing here can lead to the conclusion that a catastrophic outcome is impossible, the uncertainty surely dictates that our response be carefully crafted. While it might seem prudent to act, we must not forget Ridley’s Paradox: economic damage from man-made ‘climate change’ is illusory whereas damage from man-made ‘policies’ to fight the said change is real.  Moreover, there is the potential that the current focus on a climate emergency is diverting attention better spent on higher-probability issues such as global pandemics, antibiotic resistance, and Carrington events, or, if you are worried about truly existential threats with low probabilities, asteroid impacts.

November 1, 2019 Update: This post by Dr. Cliff Mass provides good background for our experience that extended forecasts are not reliable.

Status of Climate Change Science October 2019

Several recent blog posts have come to my attention that I want to pass on to readers of this blog because all three make good points and, ultimately, justify a pragmatic approach in my opinion.  I have summarized them below but recommend that you read them all in their entirety.

Judith Curry argues that the science does not support the claims that climate change is an existential threat.  I believe it is safe to say that Cliff Mass is more worried about the threats of climate change, but he makes the point that there is an active group in the climate debate, “mainly on the political left, that is highly partisan, anxious and often despairing, self-righteous, big on blame and social justice, and willing to attack those that disagree with them”, that he believes may in the end do more harm than good.  Finally, Larry Kummer offers suggestions that could be implemented today with widespread support from most of society.

Judith Curry, writing on her Climate Etc blog, posted her response to a reporter’s questions about the current state of climate limits and timelines.  The reporter asked about the deadlines (e.g., the 12 years to act) currently in the news. She concluded:

Bottom line is that these timelines are meaningless.  While we have confidence in the sign of the temperature change, we have no idea what its magnitude will turn out to be.  Apart from uncertainties in emissions and the Earth’s carbon cycle, we are still facing a factor of 3 or more uncertainty in the sensitivity of the Earth’s climate to CO2, and we have no idea how natural climate variability (solar, volcanoes, ocean oscillations) will play out in the 21st century.  And even if we did have significant confidence in the amount of global warming, we still don’t have much of a handle on how this will change extreme weather events.  With regards to species and ecosystems, land use and exploitation is a far bigger issue.

Cleaner sources of energy have several different threads of justification, but thinking that sending CO2 emissions to zero by 2050 or whenever is going to improve the weather and the environment by 2100 is a pipe dream.  If such reductions come at the expense of economic development, then vulnerability to extreme weather events will increase.

There is a reason that the so-called climate change problem has been referred to as a ‘wicked mess.’

Cliff Mass has his own blog on weather and climate.  He recently posted on the Real Climate Debate.  The point of his post is that there are two groups of people active in the climate change debate covered by media and politicians.  He defines the two groups as the ACT group (Apolitical/Confident/Technical) and the ASP group (Anxious, Social-Justice, Partisan).  The ACT group thinks that global warming is a technical problem with technical solutions, while the ASP group sees social change as necessary to deal with global warming, which will require re-organizing society.  His bottom line:

Progress on climate change is being undermined by the efforts of the highly vocal, partisan, and ineffective ASP group.  They are standing in the way of bipartisan action on climate change, efforts to fix our forests, and the use of essential technologies.   They are a big part of the problem, not the solution.

In contrast to the ASP folks, the ACT group generally tries to stay out of the public eye, quietly completing the work  needed to develop the technologies and infrastructure that will allow us to mitigate and adapt to climate change.  In the end, they will save us.  That is, if the ASP folks don’t get in their way.

Larry Kummer, writing at the Fabius Maximus blog, recommended issues that he hopes a presidential candidate will adopt to address serious threats. One of the issues he included was climate change.  The only disagreement I have with his recommendations concerns conversion to non-carbon-based energy. I think this needs to be included but would prefer that the emphasis be on R&D to find alternatives that are cheaper than fossil fuels.  Until that happens I believe that Roger Pielke Jr.’s Iron Law of Climate Policy will make implementation impossible.  His “iron law” simply states that “while people are often willing to pay some price for achieving environmental objectives, that willingness has its limits”.  Larry’s recommendations are:

   “We don’t even plan for the past.”
— Steven Mosher (member of Berkeley Earth, bio here), a comment posted at Climate Etc.

We are locked into two camps, with a large confused mass between the climate extremists and those who deny that global warming is a threat. The resulting gridlock leaves us vulnerable to the inevitable repeat of past extreme weather and the effects of the continuation of two centuries of warming (from a combination of natural and anthropogenic factors). We can continue to do almost nothing, waiting for one side to stampede the American public into acquiescence – or for the weather to decide for us. Or we can immediately take smaller but still effectual steps. I gave these recommendations six years ago, and they remain sound today. They could command popular support.

        1. Increased government funding for climate sciences. Many key aspects (e.g., global temperature data collection and analysis) are grossly underfunded. But this research should be run with tighter standards (e.g., posting of data and methods, review by unaffiliated experts), just as we do for biomedical research – and for the same reason, to increase its reliability.
        2. Fund a review of the climate forecasting models by a multidisciplinary team of relevant experts who have not been central players in this debate. Include a broader pool than those who have dominated the field, such as geologists, chemists, statisticians and software engineers. This should include a back-test of the climate models used in the first four Assessment Reports of the IPCC (i.e., run them with forcing data through now, and compare their predictions with actual weather). This will tell us much (details here).
        3. We should begin a well-funded conversion in fifty years to mostly non-carbon-based energy sources. We need not wreck the economy or defund defenses against the many other threats we face. This is justified by both environmental and economic reasons (see these posts for details). As we learn more about climate change, this program can be accelerated if necessary.
        4. Begin more aggressive efforts to prepare for extreme climate. We’re not prepared for a repeat of past extreme weather (e.g., a major hurricane hitting NYC), let alone predictable climate change (e.g., sea levels climbing, as they have for thousands of years).

Conclusion

My pragmatic take based on these posts: climate change is an extraordinarily difficult problem to understand, but the extremely bad projections are very unlikely.  Unfortunately, those worst-case projections have the attention of a segment of society convinced they are coming, and their passion may make reasonable, no-regrets responses impossible.  Because we do not understand natural variability well enough to pick out the small signal of human-caused global warming and, more importantly, because the current alternatives to fossil fuels will be extremely expensive, we need to monitor the climate better, focus our climate research on results and natural variability, fund a research program to develop alternatives to fossil fuels that are cheaper than fossil fuels, and, finally, develop resiliency to the extreme weather already observed.

Climate Forecast Lessons from Dorian

Although I am a meteorologist with over 40 years of experience, I have been told that does not qualify me to have an “expert” opinion on the science of climate change.  Nonetheless, I believe my background and experience qualify me to make a few points about the model-based projections of climate change relative to the forecasts for Hurricane Dorian.  Don’t ever forget that model projections are the basis for the “climate crisis” rhetoric we are bombarded with on a daily basis.

A quick internet search found this very well done forecast for Dorian on August 29, 2019.  Meteorologist Tim Pandajis from WVEC Channel 13 in Norfolk, VA explains the status of the storm on August 29 and the forecast for the next several days, but also many of the reasons why the forecast is uncertain.  I particularly liked his explanation because it includes spaghetti plots.  At 8:04 in the video he shows how different models see things differently, and his presentation shows how the various models predict the storm’s track and timing.  Of course, as it turned out, Dorian behaved quite differently than any of the forecasts.

Given the constant changes to the forecasts for Dorian, I am sure many recall the old saying that meteorology is the only profession where you can be wrong most of the time and still keep your job.  The reality is much different.  For me there are two things to keep in mind.  On September 1 the storm reached peak intensity, but it also stalled.  The forecast intensity for the rest of the storm only went down when it became obvious that the storm intensity was going down.  The reason the intensity went down is that the hurricane sat in one place for so long that it brought cold water up to the surface.  Hurricanes need warm water to maintain intensity or grow, and the cold water weakened the storm.  It is interesting that the models did not incorporate that effect, or did not incorporate enough of it.  However, I am confident that the models will be revised to address that in the future.
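
A toy illustration of that feedback follows; this is not a real hurricane model, and every coefficient is invented for the sketch (the 26 deg C figure is the commonly cited sea surface temperature needed to sustain a hurricane):

```python
# Toy model of the upwelling feedback: a stalled storm mixes cold water to the
# surface, and once the sea surface cools below the commonly cited ~26 C
# threshold the storm can no longer sustain its intensity. The cooling and
# decay rates here are invented for illustration only.
sst = 29.0            # sea surface temperature, deg C
intensity = 150.0     # storm intensity, mph
cooling_per_day = 0.8 # hypothetical SST drop per day of stalling (upwelling)
sst_threshold = 26.0  # rough SST needed to sustain a hurricane

for day in range(1, 8):
    sst -= cooling_per_day        # a stalled storm keeps cooling its own ocean
    if sst < sst_threshold:
        intensity *= 0.75         # hypothetical decay below the threshold
    print(f"day {day}: SST = {sst:.1f} C, intensity = {intensity:.0f} mph")
# A moving storm continually crosses undisturbed warm water, so the feedback is
# much weaker; a model that underplays it will over-forecast a stalled storm.
```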

When I graduated with my MS in meteorology in 1976, three-to-five-day forecasts were not that good, but they have improved a lot.  I ascribe that improvement in large part to the fact that weather forecasts are always being tested.  Whenever there is a poor forecast, the modelers and the forecasters learn from it and improve their products going forward.  The climate forecasts that predict imminent and inevitable climate catastrophe do not have that advantage.  The National Weather Service defines a 30-year average as a climatic normal.  Using that time period, a climate model forecast should be tested against a 30-year average of weather observations.  Clearly there are many fewer opportunities to test a climate forecast model than a weather forecast. In addition, my experience with simpler models is that you can get the “right” answer for the wrong reason.  Weather forecast models address this problem through the sheer number of tests.  If a model is adjusted for the wrong reason, it may work once, but the error will show up later, so a different adjustment is tried until they get it right.  Climate models that are right for the wrong reason will never be corrected in our lifetimes.
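
A rough count makes the disparity plain (the 40-year career span is just an example):

```python
# Rough count of verification opportunities over a 40-year career.
career_years = 40
weather_checks = career_years * 365            # one verified forecast per day
climate_normal_years = 30                      # NWS climatic normal period
climate_checks = career_years // climate_normal_years  # independent normals

print(f"Weather forecast verifications: {weather_checks:,}")   # 14,600
print(f"Independent climate verifications: {climate_checks}")  # 1
# A weather model gets thousands of chances to fail and be fixed; a 100-year
# climate projection gets essentially none within a working lifetime.
```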

The final lesson from Dorian concerns forecast uncertainty.  As Tim Pandajis showed with the spaghetti plots in his presentation, there was enough uncertainty in the August 29 forecasts to make a difference in which hurricane response actions to take.  On the other hand, the climate model projections are portrayed in the media and by advocates as absolutely certain.  None of the caveats provided by the modelers are acknowledged in the hue and cry about a climate emergency.  The reality is that there is a range of modeled projections for the future climate and, for the most part, only the most extreme results are publicized, and those are the ones that form the basis for the “climate emergency”.

These lessons from Dorian support my belief that climate model forecasts cannot be trusted enough to believe that there is a climate emergency.  I am not alone.  Richard Lindzen commented on climate modeling for greenhouse gas effects:

“Here is the currently popular narrative concerning this system. The climate, a complex multifactor system, can be summarized in just one variable, the globally averaged temperature change, and is primarily controlled by the 1-2% perturbation in the energy budget due to a single variable – carbon dioxide – among many variables of comparable importance.  This is an extraordinary pair of claims based on reasoning that borders on magical thinking.”

My takeaway message from Dorian: everyone has experience with weather forecast model predictions.  Intuitively, I imagine most people have some suspicions about the validity of any prediction of the climate in 100 years.  This post illustrates why those suspicions are well-founded.  In no way does that mean that the climate is not warming or that greenhouse gas emissions might not have an effect in the future.  However, in my opinion the imminent, inevitable climate catastrophe forecast is a very low probability for this and many other reasons.  If you want to do something to reduce potential climate impacts, then pursue “no regrets” options like energy conservation and energy efficiency, and invest in research to make carbon-dioxide-free energy production cheaper than energy production from fossil sources, which would make conversion a no-regrets solution.  Unfortunately, this is not the message from any of the Democratic candidates for President.

One final point relates to the effect of global warming on the storm itself.  I am sure you have heard the stories that Dorian supports the catastrophic concerns.  I don’t have time to address this in detail, but I believe that the following references refute the proposition that Dorian is somehow indicative of a global warming crisis.

    • Judith Curry “Alarmism enforcement” on hurricanes and global warming argues that there are a few climate scientists whose behavior “is violating the norms of science and in my opinion is unethical”. She also provides links to two papers from the World Meteorological Organization (WMO) Task Team on Tropical Cyclones that do not support the crisis allegation:

Tropical Cyclones and Climate Change Assessment: Part I. Detection and Attribution

Tropical Cyclones and Climate Change Assessment: Part II. Projected Response to Anthropogenic Warming

Connect New York “Climate Change in New York” Panel Discussion

Updated response from the host September 5, 2019 follows

On August 26, 2019, Public Broadcasting Service WCNY Syracuse NY aired the Connect New York program “Climate Change in New York, a Changing Landscape”.   I stopped listening within the first two minutes because there were three gross mis-characterizations in that time, and that was too much for me to swallow.  This post documents those three mis-characterizations.

Their description of the show states:

“Summer 2019 has been an illustration of climate change in New York – from a record breaking heat wave to flooding along the shores of Lake Ontario. In July, Governor Cuomo signed one of the most aggressive climate bills in the nation. We ask climate experts if the new law will be enough when the International Panel on Climate Change has warned that the world has 11 years left to act.”

In the opening monologue of the show, host Susan Arbetter said: “Summer 2019 has been a graphic illustration of climate change from a record-breaking heat wave in France to flooding along the shores of Lake Ontario.”  After introducing the panel, Ms. Arbetter referenced the UN Intergovernmental Panel on Climate Change and asked Sandra Steingraber why we have to act quickly.  Dr. Steingraber said “Climate change now is a real emergency” and I stopped watching.  I believe that the heat wave and high water only represent extreme weather within the range of natural variability and that there is no climate emergency.   One of my pragmatic environmentalist’s principles is Alberto Brandolini’s Baloney Asymmetry Principle: “The amount of energy necessary to refute BS is an order of magnitude bigger than to produce it.”  The explanation of why Lake Ontario flooding is not an illustration of climate change exemplifies that principle.

If climate change were the cause of the record Lake Ontario levels and resulting flooding, then we would expect a trend of increasing lake levels.    That presumption is very easy to check. The US Army Corps of Engineers, Detroit Office provides monthly mean lake-wide average levels for all the Great Lakes.  The Great Lakes water levels 1918 to 2018 figure shows these data for all the lakes.  A quick scan does not reveal any obvious trend for Lake Ontario.  Moreover, there are high lake levels in 1943, 1947, 1951, 1952, 1973, and 1974 as well as the high values in 2017 and the record-breaking levels in 2019.

There is another factor to keep in mind relative to the Lake Ontario historical water levels.  When the Moses-Saunders dam on the St. Lawrence River was completed in 1958, it enabled some control of Lake Ontario water levels.  The International Lake Ontario – St. Lawrence River Board implemented Plan 2014 to ensure that releases at the Moses-Saunders Dam comply with the International Joint Commission’s 8 December 2016 Supplementary Order, effective January 2017, entitled: Regulation Plan 2014 for the Lake Ontario and the St. Lawrence River Compendium Document.  I will not try to determine whether the dam had any effect on the recent high water levels, but there are those who believe that is the case.

To determine whether there is a trend, I fit a linear regression model and tested whether the trend was statistically significant. I use Statgraphics Centurion software from StatPoint Technologies, Inc. for my statistical analyses because it provides flexible plotting and regression tools.  Statgraphics enables the user to choose the best relationship from 27 different linear regression equations.  It is also nice because it presents clear summaries for a non-statistician like me.

I found the maximum monthly Lake Ontario water level for each year and plotted those values against the year.  The Maximum Annual Monthly Lake Ontario Lake Levels 1918 to 2019 figure plots the water levels that have been coordinated with Canada from 1918 to 2018, plus 2019 data through July that I extracted from the monthly reports.  According to the statistical program, there is a statistically significant relationship at the 95% confidence level between Lake Ontario maximum monthly level and year because the P-value in the ANOVA table is less than 0.05.  I have listed the statistics and Statgraphics descriptions in Lake Ontario Annual Maximum Water Level Statistics 1918 to 2019.

At first glance, host Susan Arbetter appears justified in saying that Lake Ontario water levels are rising in response to anthropogenic climate change.  Based on their backgrounds, I doubt that any members of the expert panel disagreed either. The expert panel consisted of Rachel May, a NYS Senator who was an environmental sustainability educator at SUNY ESF with no science degrees; Sandra Steingraber, a Distinguished Scholar in Residence at Ithaca College where she writes about climate change, ecology, and the links between human health and the environment; Mark Dunlea, founder of the Green Education and Legal Fund, whose web page states that he is a graduate of RPI (Management) and Albany Law School; and Yvonne Chu, a member of Climate Change Awareness and Action who has a BS in Environmental Science from SUNY Plattsburgh.

However, there is an inconvenient fact.  The Intergovernmental Panel on Climate Change claims the effect of anthropogenic greenhouse gas emissions on the climate system “has a 95–100% probability of causing the currently observed and unprecedented warming of the climate since the mid-twentieth century”. As a result, anthropogenic climate change could only have affected water levels after 1950. To test this I separated the Lake Ontario water level data into two sets: before and after 1950.  The Maximum Annual Monthly Lake Ontario Lake Levels 1918 to 1949 figure plots the water levels from 1918 to 1949. According to the statistical program, there is a statistically significant relationship at the 95% confidence level between Lake Ontario maximum monthly level and year over this time period because the P-value in the ANOVA table is less than 0.05.  I have listed the statistics in Lake Ontario Annual Maximum Water Level Statistics 1918 to 1949.

However, as shown in the Maximum Annual Monthly Lake Ontario Lake Levels 1950 to 2019 figure, the relationship is much weaker after 1950.  According to the statistical program, there is not a statistically significant relationship at the 95% confidence level between Lake Ontario maximum monthly level and year over this time period because the P-value in the ANOVA table is greater than 0.05.  I have listed the statistics in Lake Ontario Annual Maximum Water Level Statistics 1950 to 2019.
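
For readers without Statgraphics, the same kind of trend test can be reproduced in a few lines of Python.  The water level series below is a placeholder standing in for the real values from the Corps of Engineers monthly reports:

```python
import numpy as np
from scipy import stats

def trend_test(years, levels, label):
    """Ordinary least-squares trend and its p-value, mirroring the ANOVA
    significance test described above."""
    result = stats.linregress(years, levels)
    verdict = "significant" if result.pvalue < 0.05 else "not significant"
    print(f"{label}: slope = {result.slope:+.4f} m/yr, "
          f"p = {result.pvalue:.3f} ({verdict} at the 95% level)")

# Placeholder series standing in for the annual maximum monthly mean levels
# from the Corps of Engineers records.
rng = np.random.default_rng(1)
years = np.arange(1918, 2020)
levels = 74.8 + rng.normal(scale=0.3, size=years.size)

trend_test(years, levels, "Full record 1918-2019")
trend_test(years[:32], levels[:32], "Pre-1950 (1918-1949)")
trend_test(years[32:], levels[32:], "Post-1950 (1950-2019)")
```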

Because there is no statistically significant trend after 1950, the disastrous flooding of 2019 is more likely weather-related than indicative of climate change.  I refer you to another of my pragmatic environmentalist principles, the Golden Rule of Climate Extremes.  Dr. Cliff Mass christened this rule: “The more extreme a climate or weather record is, the greater the contribution of natural variability”.  I am confident that were I to do the same kind of analysis for the French heat wave this summer, it would be another example of this golden rule.

If you recall, Ms. Arbetter referenced the UN Intergovernmental Panel on Climate Change when asking Sandra Steingraber why we have to act quickly.  She said “Climate change now is a real emergency”.  Again I refer you to Dr. Cliff Mass, who has explained that climate change is probably not an existential threat.  He believes it is a serious problem, and I agree.  Note, however, that over-hyping the threat could very well come back and hurt the cause.

Ms. Arbetter summed up the Lake Ontario flooding as “pitting the status quo against science”.  I have shown that her “science” was fatally flawed.  Her expert panel included only advocates without the technical expertise to differentiate between weather and climate.  Where does that leave the viewers who watched this show?  Eventually the public will catch on that this alleged imminent, inevitable climate emergency that requires costly and sweeping changes to society is not as advertised.

I am heartened that WCNY has not joined the Columbia Journalism Review “Covering Climate Now” effort.  However, this Connect NY program was entirely consistent with the intent of that effort to strengthen the media’s focus on the climate crisis.  According to the Connect NY web page, the program offers “insightful discussion, information, and analysis on timely topics that affect residents across the Empire State”.  It seems to me, though, that the program was not an honest attempt to present both sides of this topic but rather a platform for the opinions of one side of the issue.

Update: I sent a letter to the station with these explanations.  I received the following response on September 5, 2019:

Dear Roger,

I appreciate your email.  The climate program that aired on WCNY in August was the second “Connect: NY” program we have produced on the issue.  The first program aired on February 25th and featured the climate debate from the business perspective.   If you watch both of them, I think you’ll have a fuller appreciation of the range of perspectives we have featured on the air on this issue.

Thank you again for engaging.

warmly,

Susan Arbetter


Five Reasons Why the Catastrophic Anthropogenic Global Warming Story is Wrong

Scott Adams (of Dilbert comic fame) recently did a video about climate persuasion titled “Scott Adams solves the climate debate and saves the world (really)”, available here on Periscope or here on Twitter, that has sparked folks to distill their arguments for and against climate change. After hearing about this, and faced with the constant barrage of media stories about the latest inevitable climate doom, I thought it would be appropriate for me to summarize five reasons why I think that the political agenda to transform the energy system of the world is not supported by sufficient scientific evidence.

The majority of the claims that anthropogenic carbon dioxide emissions are the cause of the observed warming, and that we only have a short time to do something or else, are based on projections from global climate models (GCMs). As noted below, I have relevant experience, education and background to inform my opinion. However, I believe that anyone who does research based on their personal experience, background and education can reach their own informed opinion of these claims if they actively try to get both sides of the story.

First, a bit of background on these models. If you want details, Dr. Judith Curry did a detailed overview that includes a summary description. For my purposes all you need to know is that these models are a variation on the meteorological models that provide the predictions everyone uses when making decisions based on weather forecasts. There are differences, but they use the same physical relationships, such as conservation of momentum, heat, and mass.

I have M.S. and B.S. degrees in meteorology, and in my fourth semester of Weather Analysis & Forecasting the laboratory assignment was to break into teams and write a simple weather forecast model. I have been inside this kind of model, but most readers also have some relevant mathematical background. In particular, you may remember from algebra that if you have three equations you can solve for three unknowns, but with only two equations you cannot. The problem in meteorological models is that you have more unknowns than equations, so model developers have to improvise. In particular, instead of using direct relationships for every single factor that affects weather or climate forecasts, meteorologists use parameters to simulate the effects of some atmospheric processes.

The first reason I am skeptical of any GCM results is the use of parameters, which can be thought of as “fudge factors”.   Model developers necessarily have to account for some things that cannot be modeled directly. John von Neumann allegedly summed up the problem by stating that “With four parameters I can fit an elephant, and with five I can make him wiggle his trunk”[1]. In other words, he could develop a mathematical model that described an elephant simply by fudging the parameters. Everyone who makes a decision based on a weather forecast has learned that you can trust a forecast for tomorrow more than one for several days away. In the 43 years since I graduated, the forecasts have become more reliable further into the future because weather forecasters have constant feedback and have been able to adjust the parameters in the meteorological models to improve forecasts based on observations. There is only one global climate system, and forecasts made today for 100 years away cannot be checked until 100 years have passed. One insurmountable problem is that the parameters and their use in GCMs cannot be verified as correct in our lifetimes.
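
Von Neumann’s point is easy to demonstrate. In the sketch below, pure noise, which contains no signal at all, is fit perfectly once the polynomial has as many free parameters as there are data points; the perfect fit says nothing about predictive skill:

```python
import numpy as np

# Von Neumann's elephant in miniature: given enough free parameters, any
# record can be fit exactly, yet the fit has no predictive skill because
# the data here are pure noise.
rng = np.random.default_rng(0)
x = np.linspace(0, 1, 10)
y = rng.normal(size=10)   # no real signal to find

for n_params in (2, 5, 10):
    coeffs = np.polyfit(x, y, deg=n_params - 1)        # n_params coefficients
    residual = np.sum((np.polyval(coeffs, x) - y) ** 2)
    print(f"{n_params:2d} parameters: residual sum of squares = {residual:.4f}")
# With 10 parameters the noise is matched exactly (residual ~ 0), but the
# "model" is useless for predicting the next observation.
```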

Another issue with the parameters is the focus on just one variable among many. Richard Lindzen commented on this:

“Here is the currently popular narrative concerning this system. The climate, a complex multifactor system, can be summarized in just one variable, the globally averaged temperature change, and is primarily controlled by the 1-2% perturbation in the energy budget due to a single variable – carbon dioxide – among many variables of comparable importance.  This is an extraordinary pair of claims based on reasoning that borders on magical thinking.”

My final difficulty with parameters in GCMs is that they are used to model clouds. Dr. Curry explains that in order to solve the physical equations in a global climate model, the world has to be divided up into a three-dimensional grid. The equations are calculated for each grid cell and repeated to generate a forecast. My particular problem is that the grid cells needed to do these calculations are on the order of 100 km horizontally and often 1 km vertically, and the calculations are done every 30 minutes or so. As a result, the models cannot simulate individual clouds. That single parameterization is a big enough driver of climate that this model component alone could dominate the GCM projections. This uncertainty is well understood in climate science by those who have worked with these models. However, the problem with parameterization and its ramifications for policy decisions are poorly understood by most of those who advocate eliminating fossil fuel use.

My second reason for not trusting these models is related to my experience in air pollution meteorology and work I did as a consultant to EPA evaluating the performance of air quality dispersion models. Complexity-wise, those models are orders of magnitude simpler than climate models, and they are simple enough to be directly verified. All air quality dispersion models are based on results from field studies that injected tracers into stack effluents, measured the downwind concentrations in a test array under a wide range of meteorological conditions, and then developed coefficients for pollution dispersion. I worked on a project where we compared known emissions and observed ambient concentrations to model projections at power plants. The results showed that the models in use at the time adequately predicted maximum concentrations, so EPA was comfortable that they were working correctly. The frightening result in my mind is that it was not uncommon for the model to predict a maximum concentration close to the maximum observed value, but under different meteorological conditions than those of the observed maximum.
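
For context, the workhorse of that class of models is the Gaussian plume equation. Here is a minimal sketch of a ground-level centerline calculation; the dispersion coefficient power laws are simplified placeholders rather than the published Pasquill-Gifford curves that come from those tracer studies:

```python
import math

def plume_centerline(q, u, x, h):
    """Ground-level centerline concentration (g/m^3) from a Gaussian plume.
    q: emission rate (g/s), u: wind speed (m/s), x: downwind distance (m),
    h: effective stack height (m). The sigma power laws are illustrative
    placeholders, not the published Pasquill-Gifford curves."""
    sigma_y = 0.08 * x ** 0.9   # horizontal spread grows downwind
    sigma_z = 0.06 * x ** 0.8   # vertical spread grows downwind
    return (q / (math.pi * u * sigma_y * sigma_z)
            * math.exp(-h ** 2 / (2 * sigma_z ** 2)))

# Example: 100 g/s source, 5 m/s wind, 100 m effective stack height
for x in (500, 1000, 2000, 5000):
    print(f"x = {x:5d} m: C = {plume_centerline(100, 5, x, 100):.2e} g/m^3")
# The sigma coefficients in real models came from tracer field studies, which
# is what makes these parameters "based directly on observations".
```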

Consider the differences between GCMs and air quality models. Air quality models use parameters that are based directly on observations, incorporate known emissions, and have been extensively verified in field studies, yet those models could still get the right answers for the wrong reasons. GCMs use parameters that are based on model developer opinions, have to estimate future emissions, and cannot be tested against the climate system. We are supposed to believe they are getting the right answers for the right reasons. I don’t think that is a reasonable assumption for modeling that is the basis for converting the entire economy away from fossil fuels.

My third reason for not accepting the commonly held belief that catastrophe is inevitable is that the projections making that claim are only one answer in a wide range of potential outcomes. Consider, for example, that the Intergovernmental Panel on Climate Change (IPCC) does not give a single value for the sensitivity of atmospheric temperature to carbon dioxide. Instead it gives a range of potential warming for a doubling of atmospheric carbon dioxide of between 1.5 and 4.5 degrees C. Nic Lewis does a nice job discussing climate sensitivity here.  The GCM projections cover a wide range of potential outcomes from benign to catastrophic. Without going too deep, I want to point out that the damage claims for increased carbon dioxide depend on the shape of the distribution of this sensitivity. Damage estimates are strongly affected by the probability of extreme outcomes. The first problematic aspect of this issue is that the projections for high impacts rely on a relatively high probability of extreme outcomes. Although recent research has shown that the likelihood of extreme outcomes is lower than previously thought, those results have not been incorporated into the damage estimates. If they were, the costs currently claimed would be reduced.
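
The dependence on the distribution’s tail is easy to illustrate. In the sketch below the sensitivity distribution and the cubic damage function are both invented assumptions, not published values; the point is only that expected damage is driven by tail probability, not by the median:

```python
import numpy as np

rng = np.random.default_rng(42)

def expected_damage(tail_sigma):
    """Expected damage under a lognormal sensitivity distribution with a
    fixed median of 3 C; the cubic damage function is illustrative only."""
    sensitivity = rng.lognormal(mean=np.log(3.0), sigma=tail_sigma, size=100_000)
    return np.mean(sensitivity ** 3)   # hypothetical convex damage function

for tail_sigma in (0.2, 0.4, 0.6):
    print(f"tail spread {tail_sigma}: "
          f"expected damage = {expected_damage(tail_sigma):6.1f}")
# Expected damage climbs steeply as the upper tail fattens even though the
# median sensitivity never moves: cost estimates hinge on extreme-outcome odds.
```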

More problematic to me than the range of possible model outcomes is the use of the worst-case representative concentration pathway as the business-as-usual scenario. The IPCC developed a set of four concentration pathways representing the range of radiative forcing (i.e., greenhouse effect) in the literature for 2100. The range of possible future atmospheric forcing levels runs from relatively low levels of carbon dioxide to the highest forcing they thought possible. The problem is that these concentrations had to be related back to emission scenarios, and the worst-case representative concentration pathway, with a forcing of 8.5 watts per square meter, is so high that the emission scenario necessary to reach that level is not credible. Many of the really scary projections that get the headlines and dominate the narrative for why we need to reduce carbon dioxide emissions use RCP8.5 as business as usual. We already know that future emission scenario is extremely unlikely. When coupled with models that give a range of outcomes from benign to problematic, I can only conclude that while a catastrophic impact is not impossible, its probability is so low that it should not drive policy decisions.

My fourth reason for not trusting the models that claim carbon dioxide is the primary driver of the recently observed warming is that even for the limited results that can be compared against the climate system, the models don’t do well. Dr. Curry has explained that inconsistency well:

Between 1910 and 1940, the planet warmed during a climatic episode that resembles our own, down to the degree. The warming can’t be blamed on industry, she argues, because back then, most of the carbon-dioxide emissions from burning fossil fuels were small. In fact, Curry says, “almost half of the warming observed in the twentieth century came about in the first half of the century, before carbon-dioxide emissions became large.” Natural factors thus had to be the cause. None of the climate models used by scientists now working for the United Nations can explain this older trend. Nor can these models explain why the climate suddenly cooled between 1950 and 1970, giving rise to widespread warnings about the onset of a new ice age.

The final reason I believe that the political agenda to transform the energy system of the world is not supported by sufficient evidence is the suggestion that climate change is easy to solve because renewables are a viable solution. As convinced as I am that the climate science does not support this agenda, I believe the suggestion that wind and solar can solve our energy problems is even more of an exaggeration. In fact, that issue is the primary reason I blog and have written so much about New York’s climate change plans. However, don’t listen to me; listen to Bill Gates, who states: “The idea that we have the current tools and it’s just because these utility people are evil people and if we could just beat on them and put (solar panels) on our rooftop—that is more of a block than climate denial. The ‘climate is easy to solve’ group is our biggest problem.” Another problem is that the renewable “solution” very likely has significant environmental impacts that are generally ignored. Michael Shellenberger gave a TED talk, “Why renewables can’t save the planet”, that addresses this issue.

To sum up: the rationale used to justify converting the energy system of the world is that carbon dioxide will cause inevitable catastrophe and that we can be saved if only we implement renewable wind and solar, which will be easy to do. I don’t believe the science supports inevitable catastrophe because those projections are based on global climate models. Those models use too many fudge factors, produce results that can never be tested, and, as much simpler models based entirely on observations show, can give the right answer for the wrong reason; the model results to date do not adequately reproduce the one climate record we can test against. Most of the catastrophic outcomes that dominate the political and media narrative depend on an emissions scenario that is not credible. And I do not believe that diffuse and intermittent solar and wind can replace reliable and affordable electric power, much less generate enough energy to convert transportation, heating, and industrial uses of fossil fuel to electricity.

[1] Attributed to von Neumann by Enrico Fermi, as quoted by Freeman Dyson in “A meeting with Enrico Fermi” in Nature 427 (22 January 2004) p. 297


Recommended Read: Global Warming for the Two Cultures

I have the education, background and experience to independently evaluate the constant drumbeat claiming imminent and inevitable climate catastrophe if we don’t immediately reduce our carbon footprint. I am a lukewarmer who believes that the sensitivity of the climate to anthropogenic carbon dioxide emissions is at the bottom of the Intergovernmental Panel on Climate Change range. At that level, climate catastrophe is a very unlikely possibility, and the effect is much more likely to be benign.

Unfortunately, it is very frustrating to hold my position because the media, politicians and advocacy groups have convinced many that we have to use renewables as a “solution” to what I think is a non-existent problem. As a result, I am always looking for a good summary of the issues I have with the imminent climate catastrophe narrative. The 2018 Global Warming Policy Foundation Annual Lecture, “Global Warming for the Two Cultures” by Dr. Richard Lindzen, is an excellent summary that I recommend to those who believe we need to transform the energy system to do “something” about climate change, so that they will at least have heard the other side of the story.

Lindzen begins his talk by describing two cultures in society and their implications for policy decisions. Basically, the two cultures are those who understand “science” in general and physics in particular, and those who don’t. He explains why this understanding gap is a problem:

While some might maintain that ignorance of physics does not impact political ability, it most certainly impacts the ability of non-scientific politicians to deal with nominally science-based issues. The gap in understanding is also an invitation to malicious exploitation. Given the democratic necessity for non-scientists to take positions on scientific problems, belief and faith inevitably replace understanding, though trivially oversimplified false narratives serve to reassure the non-scientists that they are not totally without scientific ‘understanding.’ The issue of global warming offers numerous examples of all of this.

One of my problems with the media climate change story is the portrayal of the greenhouse effect as simple. His lecture describes the complicated climate system in enough detail to support my contention that the imminent, inevitable climate catastrophe story is an exaggeration.

I particularly like his description of the popular narrative we hear from the media and politicians:

Now here is the currently popular narrative concerning this system. The climate, a complex multifactor system, can be summarized in just one variable, the globally averaged temperature change, and is primarily controlled by the 1-2% perturbation in the energy budget due to a single variable – carbon dioxide – among many variables of comparable importance.

This is an extraordinary pair of claims based on reasoning that borders on magical thinking. It is, however, the narrative that has been widely accepted, even among many sceptics.

He then goes on to describe how he believes the popular narrative originated and debunks the evidence we are constantly reminded supports the catastrophic narrative.

I encourage you to read the entire lecture. I believe it supports his concluding summary of the situation:

An implausible conjecture backed by false evidence and repeated incessantly has become politically correct ‘knowledge,’ and is used to promote the overturn of industrial civilization.

Temperature Related Deaths

Environmental advocates claim heat results in more deaths than any other weather-related event, but I recently read a conflicting claim about weather-related deaths. The New York City Environmental Justice Alliance released a new report, NYC Climate Justice Agenda 2018 – Midway to 2030: Building Resiliency and Equity for a Just Transition, that claims “Extreme heat results in more deaths than any other weather-related event”. On the other hand, a study in The Lancet, “Mortality risk attributable to high and low ambient temperature: a multicountry observational study”, notes that “most of the temperature-related mortality burden was attributable to the contribution of cold”. I did some research, and now I think I know what is behind these two differing claims.

The NYC Climate Justice Agenda bases its claim that extreme heat causes more deaths than cold on an EPA reference. The EPA extreme heat webpage uses data from the National Oceanic and Atmospheric Administration Natural Hazard Statistics: Weather Fatalities website. The home page for that site lists 508 fatalities from all weather events in 2017, including 107 from extreme heat, 26 from extreme cold, 10 from winter storms, 1 from ice, and 3 from avalanches. Those data show that more people died from extreme heat than from any other cause, narrowly beating out flash floods, and that more people die from heat than from cold-related events. The data for the website are compiled from information in the National Weather Service (NWS) storm events database.

The Global Warming Policy Foundation’s April 9, 2018 newsletter reported that 48,000 Britons died this winter due to cold weather.   Those numbers are obviously far different from the NWS data. The Lancet paper by Gasparrini et al. notes that:

Although consensus exists among researchers that both extremely cold and extremely hot temperatures affect health, their relative importance is a matter of current debate and other details of the association remain unexplored. For example, little is known about the optimum temperatures that correspond to minimum effects for various health outcomes. Furthermore, most research has focused on extreme events and no studies have comparatively assessed the contribution of moderately high and low temperatures. The underlying physiopathological mechanisms that link exposure to non-optimum temperature and mortality risk have not been completely elucidated. Heat stroke on hot days and hypothermia on cold days only account for small proportions of excess deaths. High and low temperatures have been associated with increased risk for a wide range of cardiovascular, respiratory, and other causes, suggesting the existence of multiple biological pathways.

I believe that the reason for the difference in the two conclusions is explained by this statement by Gasparrini et al.: “The dose-response association, which is inherently non-linear, is also characterised by different lag periods for heat and cold—i.e., excess risk caused by heat is typically immediate and occurs within a few days, while the effects of cold have been reported to last up to 3 or 4 weeks.”

According to the NWS instructions for storm data preparation the storm data report documents:

  • The occurrence of storms and other significant weather phenomena having sufficient intensity to cause loss of life, injuries, significant property damage, and/or disruption to commerce;
  • Rare, unusual, weather phenomena that generate media attention, such as snow flurries in South Florida or the San Diego coastal area; and
  • Other significant meteorological events, such as record maximum or minimum temperatures or precipitation that occur in connection with another event.

The key point is that the storm data report makes a distinction between direct and indirect deaths.  Only direct deaths are tabulated when a local weather office prepares the storm report. For example, in winter storms, deaths from heart attacks while shoveling snow are indirect.  If a person wanders outside and freezes to death, that is a direct death. Furthermore, while indirect deaths are included in the storm narratives, the numbers are not included in the tabulated data. Storm reports are prepared within days of the event, so any indirect deaths from cold impacts that play out over weeks would not be included. Details on the difference between direct and indirect deaths are found in the instruction document on pages 9 to 12.
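
A toy tabulation shows why the reporting window matters; the daily death counts are invented to mimic the lag structure Gasparrini et al. describe, with heat deaths clustered immediately and cold deaths trailing out over weeks:

```python
# Invented daily death counts following one heat event and one cold event,
# indexed by days after the event (the lag structure described above).
heat_deaths_by_day = {0: 40, 1: 30, 2: 10}              # immediate spike
cold_deaths_by_day = {day: 4 for day in range(28)}      # 4/day for four weeks

def counted_within(deaths, window_days):
    """Deaths captured by a report closed window_days after the event."""
    return sum(n for day, n in deaths.items() if day < window_days)

for window in (3, 28):   # ~storm-report window vs. an epidemiological window
    heat = counted_within(heat_deaths_by_day, window)
    cold = counted_within(cold_deaths_by_day, window)
    print(f"{window:2d}-day window: heat deaths = {heat}, cold deaths = {cold}")
# A report closed within days captures nearly all heat deaths but only a
# fraction of the cold deaths, flipping which hazard appears deadlier.
```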

In their study, Gasparrini et al. found that temperature is responsible for 7.7% of mortality, with cold responsible for “most of the burden”. Although over 90% of that burden was attributed to cold, the paper noted that “This difference was mainly caused by the high minimum-mortality percentile, with most of the mean daily temperatures being lower than the optimum value”. I interpret that to mean that some of the difference was due to their classification methodology. In line with the indirect-death distinction, it is notable that over 85% of the mortality attributable to temperature was related to moderate cold. Offhand, I think there must be more causes of death associated with freezing weather than with hot weather. For example, auto accidents on icy roads have to cause more deaths than any hot-weather impact on travel.
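
To put those percentages side by side, here is a rough back-of-the-envelope sketch. Treating the paper's “over 90%” and “over 85%” figures as exactly 0.90 and 0.85 is my own simplification:

    # Back-of-the-envelope translation of the Gasparrini et al. figures
    # quoted above into shares of all deaths.
    temp_attributable = 0.077  # 7.7% of all mortality attributable to temperature
    cold_fraction     = 0.90   # fraction of that burden attributed to cold
    moderate_cold     = 0.85   # fraction related to moderate (not extreme) cold

    print(f"cold-attributable deaths:          {temp_attributable * cold_fraction:.1%} of all deaths")        # ~6.9%
    print(f"heat-attributable deaths:          {temp_attributable * (1 - cold_fraction):.1%} of all deaths")  # ~0.8%
    print(f"moderate-cold-attributable deaths: {temp_attributable * moderate_cold:.1%} of all deaths")        # ~6.5%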

In conclusion, there is a database that does show that extreme heat results in more deaths than any other weather-related event. However, that database only includes direct deaths. An epidemiological study that does include indirect deaths concludes that the majority of temperature-related deaths are associated with moderate cold weather.

Relative to climate change policy, the distinction between heat and cold is important. If the argument is that we must mitigate human impacts on climate to reduce mortality due to temperature, then, because a warming climate will result in less moderate cold, warming should actually have a beneficial effect on temperature-related mortality. An unintended consequence of climate change mitigation through the implementation of renewable energy is the universal increase in cost. Given the impacts on indirect deaths, I believe that increased heating costs will adversely affect mortality if low-income people cannot afford to keep their homes warm enough to prevent potential health impacts of cold weather. Finally, the fact that climate is a reason many more people move to Phoenix, AZ than to International Falls, MN, the “ice box of the nation”, suggests we are better able to adapt to warm than to cold.

Climate change soon to be main cause of heat waves in West, Great Lakes

A recent study entitled Early emergence of anthropogenically forced heat waves in the western United States and Great Lakes was publicized in the Syracuse, New York Post-Standard under the headline Upstate NY among first to have most heat waves due to climate change. Unfortunately, as Blair King writes, “it actually represents a quite excellent example of how science is misrepresented to the public in the climate change debate.”

According to a press release: “Lopez and colleagues used climate models along with historical climate data to project future heat wave patterns. They based their findings on the projection for greenhouse gas emissions this century, known as the RCP8.5 scenario. This assumes high population with modest rates of technological change and energy conservation improvements and is often called the “business as usual” scenario. Lopez said he based the research on this climate scenario because historical greenhouse gas emissions have to date aligned with this projection.”

My concern, and that of Blair King, is the use of the RCP8.5 scenario. This is a representative concentration pathway corresponding to a radiative forcing of 8.5 watts per square meter by 2100, used by climate modelers to represent the worst-case atmospheric effect of greenhouse gases. Essentially, this emissions scenario was developed to produce that forcing level.
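
For a sense of scale, the widely used Myhre et al. (1998) approximation for CO2 forcing can be back-solved to show what an 8.5 W/m² forcing implies. This is my own illustration, and a simplification: RCP8.5's forcing includes all greenhouse gases, not just CO2:

    import math

    # Myhre et al. (1998) simplified CO2 forcing: dF = 5.35 * ln(C / C0) [W/m^2].
    # Treating the full 8.5 W/m^2 as CO2-only overstates the implied CO2,
    # since RCP8.5's forcing comes from all greenhouse gases combined.
    C0 = 278.0  # commonly assumed pre-industrial CO2 concentration, ppm

    def forcing(ppm, baseline=C0):
        """CO2 radiative forcing relative to the baseline, W/m^2."""
        return 5.35 * math.log(ppm / baseline)

    def ppm_for_forcing(watts, baseline=C0):
        """CO2-equivalent concentration implied by a given forcing."""
        return baseline * math.exp(watts / 5.35)

    print(f"{ppm_for_forcing(8.5):.0f} ppm CO2-eq")              # ~1360 ppm vs ~280 pre-industrial
    print(f"check: {forcing(ppm_for_forcing(8.5)):.1f} W/m^2")   # 8.5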

Larry Kummer looked at the scenario in detail. He notes that “It assumes the fastest population growth (a doubling of Earth’s population to 12 billion), the lowest rate of technology development, slow GDP growth, a massive increase in world poverty, plus high energy use and emissions.” His post explains that RCP8.5 assumes population growth at the high end of the current UN forecasts, assumes that the centuries-long progress of technology will slow, and assumes no decarbonization of world power sources from new technology (e.g., solar, wind, fission, fusion) or from regulations intended to reduce not just climate change but also air pollution and toxic waste.

Blair King explains that RCP8.5 has a storyline that describes the assumptions of the scenario in easy-to-understand language. He goes on to explain that the RCP8.5 scenario dates back to 2007 and is characterized by the following:

  • Lower trade flows, relatively slow capital stock turnover, and slower technological change;
  • Less international cooperation than the A1 or B1 worlds. People, ideas, and capital are less mobile so that technology diffuses more slowly than in the other scenario families;
  • International disparities in productivity, and hence income per capita, are largely maintained or increased in absolute terms;
  • Development of renewable energy technologies is delayed and not shared widely between trade blocs;
  • Delayed land use improvements for agriculture resulting in increased pollution and increased negative land use emissions until very late in the scenario (close to 2100);
  • A rebound in human population growth, resulting in a population of 15 billion in 2100; and
  • A 10-fold increase in the use of coal as a power source and a move away from natural gas as an energy source.

Consider those assumptions against what has actually happened since 2007. I am not sure about the status of international disparities in productivity or of land use improvements. However, I believe none of the other parameters are following those assumptions. Global trade is at all-time highs, renewable technology is freely traded and continues to mature and develop, and there is no sign of human population growth accelerating toward 15 billion. Most importantly, this scenario pre-dates the fracking revolution, which has flipped the use of coal and natural gas in the United States by making natural gas so cheap and plentiful. There is no reason to believe that the technology won’t expand elsewhere and markedly reduce any potential increase in the use of coal as a power source.

Lopez stated that he “based the research on this climate scenario because historical greenhouse gas emissions have to date aligned with this projection.” He is either ignorant of the substantial change in greenhouse gas emissions observed in the United States or willfully ignored those numbers to misrepresent the science to the public.

Smithsonian: Capture the Sun, Harness the Wind

I am so tired of the Smithsonian’s unquestioning devotion to renewable energy in spite of obvious warning signs that I wrote a letter to the editor. In the April 2018 Smithsonian there is an article entitled “The Future’s so Bright (He’s Gotta Wear Shades)” by Dan Solomon and a related graphic article “Capture the Sun, Harness the Wind” by 5W Infographics. These articles are essentially puff publicity pieces for the renewable energy industry that clearly show the magazine’s bias on renewable energy. Nonetheless, they cannot escape inconvenient facts.

The most obvious problem with Solomon’s article on the bright future of renewable energy in Georgetown, Texas is that renewable energy looks great for the early adopters, but the reality of maintaining a 100% reliable electric system lies beneath that success. The situation is exactly the same as a pyramid scheme, where the first ones in reap the benefits. When Solomon’s article notes that “about 2% of the time the Georgetown utility draws electricity derived from fossil fuels,” an unbiased article would have followed up on the implications. The primary support for the fossil fuels necessary to keep Georgetown’s lights on comes from everybody else. As more rent-seekers pile onto the renewable energy bandwagon, the costs necessarily increase for those on the outside. Dig deeper and it becomes apparent that price support for the rest of the electric system becomes more likely, and, because solar and wind do not support grid services, another layer of support will probably have to be added somewhere beyond 30% renewable penetration. At this time Georgetown is not paying for any of that.

The first graph in the graphic article, “The comparison to coal,” charts actual 2016 electricity generation from coal and from renewable sources and projects changes in their use out to 2050. Comparing the chart’s 2016 figures of 1,216 billion kWh for coal and 573 billion kWh for renewable sources with EIA numbers shows they are close enough not to quibble over. However, the title of the article refers to sun and wind, and the generation from those categories is lumped together with hydro, biomass, geothermal, and other gases. As far as I can tell, solar and wind account for less than half of the 573 billion kWh. On the other hand, most of the future renewable growth will occur in the wind and solar sectors, but the graphic does not provide that information. Neither article mentions just how much wind and solar generation will be needed to meet the projected 2050 number.

Another graphic notes that 800 MW of energy storage were built in the United States in the last five years and expects that much to be built in 2020 alone. The important number, however, is how many MWh will be available from the energy storage built, because that defines how much energy will be available to counteract renewables’ intermittency. Solomon’s article also did not address how much storage would be needed for Georgetown to get off the grid. Neglecting to point out that intermittent renewables struggle to generate power over a third of the time, so that we will likely have to over-build renewable capacity and add massive amounts of energy storage, biases the renewable argument.
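
A simple illustration makes the MW-versus-MWh point. The 800 MW figure is from the graphic, but the discharge durations are my own assumptions, since the graphic does not report them:

    # Why MW alone says little about storage: the same 800 MW of power
    # capacity can hold very different amounts of energy depending on
    # discharge duration. Durations here are assumed for illustration.
    power_mw = 800

    for duration_hr in (0.5, 1, 4):
        energy_mwh = power_mw * duration_hr
        print(f"{power_mw} MW at {duration_hr} h duration = {energy_mwh:,.0f} MWh")

    # 800 MW of half-hour batteries stores 400 MWh; 800 MW of 4-hour
    # batteries stores 3,200 MWh -- an eight-fold difference that the
    # MW number alone cannot distinguish.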

One of the inconvenient facts illustrated but not noted in the graphic article is jobs per unit of energy produced. If you divide the total coal generation by the number of coal-industry employees in 2016, you get 24.3 million kWh produced per employee. If you divide half of the reported renewable-sources generation by the sum of the solar and wind employees in 2016, you get 0.8 million kWh produced per employee. Coal is 30 times more man-power efficient. While that may be good for employment, it does not portend well for cost.
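
For anyone who wants to reproduce that arithmetic, here is a sketch. The generation figures are from the graphic; the employee counts are back-solved from the stated per-employee numbers rather than taken from the raw employment data, and the half-of-renewables split for solar and wind is my estimate from above:

    # Reproduces the jobs-per-energy arithmetic above. Employee counts
    # are implied (back-solved) from the stated kWh-per-employee figures.
    coal_gen_kwh       = 1_216e9               # 2016 coal generation
    renewable_gen_kwh  = 573e9                 # all renewable sources, 2016
    solar_wind_gen_kwh = renewable_gen_kwh / 2 # assumption: "less than half"

    coal_kwh_per_emp       = 24.3e6
    solar_wind_kwh_per_emp = 0.8e6

    print(f"implied coal employees:       {coal_gen_kwh / coal_kwh_per_emp:,.0f}")              # ~50,000
    print(f"implied solar/wind employees: {solar_wind_gen_kwh / solar_wind_kwh_per_emp:,.0f}")  # ~358,000
    print(f"manpower-efficiency ratio:    {coal_kwh_per_emp / solar_wind_kwh_per_emp:.0f}x")    # ~30x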

Other than the fact that the duck curve is graphically interesting, I am not sure why it was included in the graphics article, because it actually illustrates a problem. When you have large amounts of solar on the system, something has to be available to make up for the evening demand peak; that is where storage becomes necessary. To keep the lights on you also need enough storage to cover those days when there isn’t any sun. Dale Ross’s flippant “we are in West Texas, so cloudy? Really?” comment aside, a quick check of the climatological data indicates that it is mostly cloudy 28% of the time in Georgetown. Despite the claim that Georgetown is powered entirely by renewable energy, the fact is that it is not.
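
To give a feel for the scale involved, here is a purely hypothetical sizing sketch. None of these numbers come from the articles; the city load, the length of the cloudy stretch, and the cloudy-day solar output are all my own assumptions:

    # Hypothetical sketch of the storage needed to ride through a cloudy
    # stretch off-grid. All inputs are assumptions for illustration only.
    avg_city_load_mw  = 150   # assumed average load for a small city
    cloudy_days       = 3     # assumed consecutive low-output days
    solar_output_frac = 0.2   # assumed cloudy-day solar output vs. normal

    shortfall_mwh = avg_city_load_mw * 24 * cloudy_days * (1 - solar_output_frac)
    print(f"storage needed to ride through: {shortfall_mwh:,.0f} MWh")  # 8,640 MWh

Even under these generous assumptions, the energy required dwarfs the MW-denominated storage figures the graphic celebrates.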

The Solomon article has multiple instances of facts conveniently neglected to make the story. It notes that the City was able to get guaranteed rates for 20 years for wind power and 25 years for solar power. It would have been appropriate to note that these types of facilities have very little operating experience over periods that long, so the guarantees might not be as iron-clad as implied. Solomon quotes Adam Schultz as saying that solar and wind have gotten so much cheaper that “I can’t even tell you the costs because costs have been dropping so rapidly”. If that is the case, then why do both solar and wind still need direct subsidies? Finally, blowing off the environmental impact of renewables on birds by saying that more birds are killed by cats and buildings reminds me of the adage that two wrongs don’t make a right. Furthermore, what about the bats, and just how many raptors are killed by house cats? The fact is that because renewable energy is diffuse, wildlife impacts are a legitimate concern.

Those are the superficial errors illustrating the biases. The deeper reality is that because wind and solar are diffuse, the electric grid is essential for keeping the lights on. Digging into this problem is more complicated but necessary for a complete, unbiased story of renewables. I recommend this post by Planning Engineer at the Climate Etc. blog for an overview of the transmission planning difficulties raised by wind and solar energy. In brief, the modern power grid is a connected, complex machine that balances various electrical and mechanical properties over thousands of miles. The system must be stable; that is to say, it must stay in synchronism, balance loads and generation, and maintain voltages following system disturbances. The grid is built upon heavy rotating machinery at hydro, nuclear, and fossil-fired generating stations, and that machinery provides the stability. Wind and solar power do not provide any stability support. Up to a point the present-day grid is resilient enough to overcome that problem, but eventually it has to be addressed. I don’t doubt that it can be addressed successfully, but the costs necessary to do so are unknown and were certainly not a consideration in either article.

The reality of solar and wind power not addressed in this article is that, because both resources are intermittent and diffuse, they are likely to completely supplant fossil fuels only in limited locations where solar and wind potential are high, industrial load requirements are negligible, and the weather is mild in the winter. Texas has large wind and solar resources because of its geography, and because it is so large there is enough space to generate significant amounts. Georgetown, TX does not have heavy industry that requires large amounts of electricity constantly, so it can pretend to be powered entirely by renewable energy. Finally, Georgetown does not have to contend with the winter impacts of higher latitudes, particularly home heating. At higher latitudes the solar resource is much reduced simply because the days are shorter, and you must also consider reductions due to snow-covered rooftop solar panels.