Utah has not had a year with a statewide average temperature below the 20th century mean since 1993.
Source: NCDC
That is quite a run and it reflects a warming climate.
So far, however, this year is running very close to that 20th century mean, so the next several weeks will determine whether the streak continues. In northern Utah, we had a remarkably cold January followed by a remarkably hot summer. The extremes in southern Utah haven't been quite so pronounced. When you put all of this together, the statewide average temperature for January–September is just a shade (< 1ºF) above the mean for the same period during the 20th century.
Source: NCDC
If we can eke it out, that would give us 20 consecutive years with statewide average temperatures above the 20th century mean. If Utah's climate weren't changing and remained equivalent to that of the 20th century, the odds of that happening would be about 1 in a million (basically the same as flipping a coin and getting heads 20 times in a row). Such a long run of above-average annual temperatures is yet another indicator that the climate of the past couple of decades is not the same as that experienced by Utah residents during most of the 20th century.
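(For the record, that figure follows directly from the coin-flip analogy: treating each year as an independent 50/50 shot at landing above or below the mean, the probability of 20 straight above-average years is (1/2)^20 = 1/1,048,576, or roughly one in a million.)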
In a simplified scenario (such as a clear, dry atmosphere typical of Utah), is it fairly easy to estimate the difference in radiative heat loss based on differing CO2 concentrations? For example, what percentage of the radiative heat loss would occur with 400 ppm CO2, in comparison to 200 ppm, for a 60 F surface at 850 mb? Or, how much would the radiative equilibrium temperature change between 200 and 400 ppm CO2 with these simplified assumptions? Obviously the real world is much more complex than this, but I think this type of simplified calculation would be interesting if anyone knows how to do it.
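Not an exact answer to the question above, but one rough way to ballpark it: the widely used simplified forcing fit ΔF ≈ 5.35 ln(C/C0) W/m² (Myhre et al. 1998) estimates the change in radiative forcing for a change in CO2, and dividing by an approximate no-feedback (Planck-only) response of about 3.2 W/m² per K gives a temperature change before any feedbacks. The sketch below assumes just those two numbers and ignores the specific 60 F surface / 850 mb setup, so it indicates the scale of the effect rather than performing the clear-sky calculation the comment describes.

```python
import math

# Back-of-the-envelope sketch, not a line-by-line clear-sky radiative transfer
# calculation. Assumptions: the simplified CO2 forcing fit
# dF = 5.35 * ln(C/C0) W/m^2 (Myhre et al. 1998) and a no-feedback (Planck-only)
# response of roughly 3.2 W/m^2 per K. The 60 F surface / 850 mb details are
# ignored, so treat the results as order-of-magnitude numbers.

C0, C = 200.0, 400.0            # CO2 concentrations in ppm (a doubling)
dF = 5.35 * math.log(C / C0)    # change in radiative forcing, W/m^2
planck = 3.2                    # approximate Planck response, W/m^2 per K
dT = dF / planck                # no-feedback equilibrium temperature change, K

# For scale: clear-sky outgoing longwave radiation is on the order of 260 W/m^2,
# so a forcing of a few W/m^2 is roughly a 1-2% change in radiative heat loss.
OLR_clear = 260.0
print(f"Forcing change, {C0:.0f} -> {C:.0f} ppm: {dF:.2f} W/m^2 "
      f"(~{100 * dF / OLR_clear:.1f}% of clear-sky OLR)")
print(f"No-feedback equilibrium warming: {dT:.2f} K ({dT * 9 / 5:.2f} F)")
```

With these assumptions the doubling from 200 to 400 ppm works out to a forcing change of about 3.7 W/m² (on the order of 1-2% of the clear-sky outgoing longwave) and a no-feedback warming of roughly 1.2 K (about 2 F); water vapor, clouds, and other feedbacks in the real atmosphere change the final number substantially.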