When was the last time you heard a newscaster say that it is an average snowfall year? Probably never! The reason is twofold. First, "average" does not make good news; or, should I say, it is not newsworthy. Second, the year-to-year variation in snowfall is large, so a yearly snowfall that is "average" is rare.

What no one ever talks about is that over a period of 5, 10 or 20 years the average snowfall falls within a much narrower range, but again, that isn't newsworthy! This article takes a realistic look at the variation in snowfall. The calculations use snowfall data for Donner Summit from 1946 through 2008.* The following chart shows the data, and the large variation from year to year is clearly evident.

The "mean" of a set of data is the sum of all of the data points divided by the number of data points in the set. The mean is commonly referred to as the "average." The average for the 63 years of data is 404 inches of snowfall.

The "standard deviation" of a set of data is a measure of the dispersion of the data about the mean (average) value. A low standard deviation indicates that the data is clustered around the mean, whereas a high standard deviation indicates that the data is widely spread, with figures significantly higher and lower than the mean. If M is the mean and SD is the standard deviation, then for data that is approximately normally distributed, about 68 percent of the data points will lie within the range M ± SD and about 95 percent will lie within the range M ± (2 × SD).
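As a minimal sketch, the mean, the standard deviation, and the one- and two-standard-deviation ranges can be computed with Python's standard library. The snowfall figures below are made-up placeholder values, not the actual Donner Summit record:

```python
import statistics

# Hypothetical yearly snowfall totals in inches (placeholder values,
# NOT the real Donner Summit data).
snowfall = [310, 450, 520, 280, 390, 610, 350, 470, 300, 560]

mean = statistics.mean(snowfall)
sd = statistics.pstdev(snowfall)  # population standard deviation

# For approximately normally distributed data:
# ~68% of values fall within mean ± sd, ~95% within mean ± 2*sd.
one_sd_range = (mean - sd, mean + sd)
two_sd_range = (mean - 2 * sd, mean + 2 * sd)

print(f"mean = {mean:.0f} in, sd = {sd:.0f} in")
print(f"~68% range: {one_sd_range[0]:.0f} to {one_sd_range[1]:.0f} in")
print(f"~95% range: {two_sd_range[0]:.0f} to {two_sd_range[1]:.0f} in")
```

Note that `pstdev` treats the list as the whole population; for a sample drawn from a larger population, `statistics.stdev` (the sample standard deviation) would be used instead.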

The following table shows the standard deviation, and the standard deviation as a percentage of the mean, for several cases. The first row is for the raw data: each year is treated as an individual data point. The subsequent rows are for data created by averaging the raw data over 5, 10, 20 and 30 years. For example, the 5-year case is created by averaging years 1946 through 1950, then years 1947 through 1951, and so on; the standard deviation and percentage of the mean are then calculated for this new set of data. Each set of data created by averaging in this way is commonly referred to as the N-year running average.

| Years Over Which Data Is Averaged | Standard Deviation (inches) | Standard Deviation as Percentage of Mean (percent) |
|---|---|---|
| 1 | 129 | 32 |
| 5 | 46 | 11 |
| 10 | 24 | 6 |
| 20 | 14 | 3 |
| 30 | 9 | 2 |
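The running-average construction described above can be sketched in a few lines of Python. Again, the yearly totals here are illustrative placeholders, not the real record; the point is only that averaging over consecutive windows shrinks the dispersion:

```python
import statistics

def running_average(data, n):
    """N-year running average: the mean of each consecutive window of n values."""
    return [statistics.mean(data[i:i + n]) for i in range(len(data) - n + 1)]

# Placeholder yearly snowfall totals in inches (illustrative only).
snowfall = [310, 450, 520, 280, 390, 610, 350, 470, 300, 560, 420, 380]

for n in (1, 5):
    series = running_average(snowfall, n)
    mean = statistics.mean(series)
    sd = statistics.pstdev(series)
    print(f"{n}-year running average: sd = {sd:.0f} in ({100 * sd / mean:.0f}% of mean)")
```

With `n = 1` the "running average" is just the raw data, matching the first row of the table; as `n` grows, extreme years are blended with their neighbors and the standard deviation of the smoothed series drops sharply.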

Consider the results calculated for the raw data (the 1-year case). The mean is 404 inches and the standard deviation is 129 inches (32% of the mean), which indicates that the dispersion of the data is large. The range for one standard deviation is 275 to 533 inches, and about 68 percent of all the data points lie in this range. Even 2007, which was considered a very low snowfall year, lies within this range.

Two standard deviations is 258 inches, so the range containing roughly 95% of all data points is 404 ± 258 inches, or 146 to 662 inches. That range is huge, and it says that what are viewed as low or high snowfall years are in fact within the norms of the data.
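The arithmetic behind these ranges can be checked directly. The mean and standard deviation below are the values reported in this article for the 1946 through 2008 record:

```python
mean, sd = 404, 129  # inches; values reported in the article

one_sd_low, one_sd_high = mean - sd, mean + sd          # ~68% of years
two_sd_low, two_sd_high = mean - 2 * sd, mean + 2 * sd  # ~95% of years

print(f"~68% of years: {one_sd_low} to {one_sd_high} inches")
print(f"~95% of years: {two_sd_low} to {two_sd_high} inches")
```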

When the running average is taken over 5, 10, 20 or 30 years, the standard deviation is reduced dramatically: for 5 years it is only 46 inches (11% of the mean), for 10 years only 24 inches (6% of the mean), and so on. This is what few people ever realize: over even a modest number of years, say 5 or 10, the snowfall is quite consistent, without the radical extremes that the news media hypes.

For illustration, since a picture is worth a thousand words, the following chart of the 5-year running average is presented; it exhibits markedly less dispersion than the raw data shown in the chart at the top of the page. The vertical scales are the same. [Please excuse the poor quality of this chart. The program used to generate it does not allow the scales, line widths and text size to be controlled separately.]

The conclusion to be drawn from the above analysis is that an exceptionally low snowfall year is in fact within the range of dispersion of the historical data. It goes without saying that successive low snowfall years may result in water shortages. However, the magnitude and frequency of water shortages are growing not because of an increase in the number or severity of low snowfall years, but because the demand for water in California has grown dramatically over the years.** Demand in California is now such that, given our current water storage capacity, we cannot tolerate even normal fluctuations in snowfall. This has become worse as demand has met or exceeded supply even in average years.

A "drought," by definition, is an extended or prolonged period of scanty rainfall, or, in the foregoing analysis, scanty snowfall. That has not been the case. Yet the media often proclaims we are in a drought, ignoring the actual data and the normal statistical variation in snowfall (and rainfall), as well as the fact that the shortage of water is due to self-generated increased demand.

** Climate change may also have an effect on water supply through its effect on the moisture content of the snow, the melt cycle and other factors. A discussion of climate change is beyond the scope of this article.