One site says average temperatures are calculated by averaging the temperatures taken ‘on the hour’. I suppose that is good enough, but many PWSs take readings every five minutes or so, which got me wondering how it is done officially.
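For concreteness, here is a minimal sketch (Python, with made-up five-minute readings) of the difference between averaging only the ‘on the hour’ samples and averaging every sample:

```python
# Hypothetical example: compare an "on the hour" daily mean with a
# mean over all five-minute readings. The data here is simulated.
from datetime import datetime, timedelta
import random

random.seed(0)
start = datetime(2010, 9, 1)
# 288 five-minute readings = one 24-hour day (temperatures in F).
readings = [(start + timedelta(minutes=5 * i),
             60 + 10 * random.random()) for i in range(288)]

hourly = [t for (ts, t) in readings if ts.minute == 0]  # top-of-hour only
all_samples = [t for (_, t) in readings]                # every reading

print("on-the-hour mean:  %.2f F" % (sum(hourly) / len(hourly)))
print("all-readings mean: %.2f F" % (sum(all_samples) / len(all_samples)))
```

With dense, well-behaved data the two come out close, which is presumably why the hourly shortcut is considered good enough.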
There must be an *official* definition of weather averages somewhere.
There may be many methods. One common method is to average the day’s high and low temperature readings to establish that day’s "average." Over long periods of time (months, years) it is a statistically accurate measure.
Are you asking about the average (or mean) temperature for the day? In Canada, at least, it’s simply the average of the daily high and low. Sounds crude, I know, but that’s the way it is.
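In code, that high/low midpoint works out to something like this (a sketch, assuming a day’s worth of readings; the numbers are invented):

```python
# Hypothetical example: the "daily mean" as the midpoint of the
# day's high and low, versus the mean of every reading.
readings = [58.1, 57.4, 59.0, 63.2, 68.5, 71.9, 70.3, 65.0, 61.2]  # F

high, low = max(readings), min(readings)
midpoint_mean = (high + low) / 2           # high/low midpoint method
full_mean = sum(readings) / len(readings)  # mean of all samples

print("high/low midpoint:    %.1f F" % midpoint_mean)
print("mean of all readings: %.1f F" % full_mean)
```

The midpoint can drift from the true mean on days with a brief spike or dip, but averaged over months or years the two track each other closely.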
I am thinking that in the US there must be an *official* definition of average. US bureaucracies often spend a lot of money writing official definitions of such things.
I have found a link to NOAA’s “Definition of Above, Near, and Below Normal Categories,” but I cannot find their algorithm or definition for average temperature.
Are you asking about a longer-term climatic average? Like based on past records, the average high in Des Moines in September is 65.1F? That sort of average? If so, those sorts of averages are usually based on a 30-year period of record - currently 1971-2000. Next year, these climate normals will be updated to 1981-2010. (Ref: http://lwf.ncdc.noaa.gov/faqs/climfaq23.html)
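As a rough sketch of how such a normal is built, a 30-year normal is essentially the mean of one station’s monthly values over the period of record (the data below is a stand-in; the real NCDC procedure also involves quality control and station adjustments this ignores):

```python
# Hypothetical example: a 1971-2000 "climate normal" for September
# average highs, computed as a simple 30-year mean of stand-in data.
september_highs = {year: 65.0 + (year % 7 - 3) * 0.4
                   for year in range(1971, 2001)}  # fake station values (F)

normal = sum(september_highs.values()) / len(september_highs)
print("1971-2000 September normal high: %.1f F" % normal)
```

When the window rolls forward to 1981-2010, the same calculation is simply rerun over the newer 30 years.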
Thanks.