Weathermen Accuracy Studies

People blame weathermen for being wrong frequently. Of course, most of that is based on anecdotal confirmation bias. No one ever mentions the weatherman when he was right. Even worse, people will claim that the weatherman was wrong even when he wasn’t. If a forecast gives a 10% chance of rain, that doesn’t mean it won’t rain; it means that, on days with that forecast, it should rain about one time in ten. The forecast is only wrong if it rains significantly more (or less!) often than that.
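To make that concrete, the usual way to judge a probabilistic forecast is a calibration check: group the days by the stated probability and see how often it actually rained in each group. Here’s a minimal sketch in Python; the observations are invented purely for illustration.

```python
# Minimal calibration check for probability-of-precipitation forecasts.
# Each entry is (stated forecast probability, whether it actually rained).
# These data points are made up for illustration.
from collections import defaultdict

observations = [
    (0.10, False), (0.10, False), (0.10, True), (0.10, False),
    (0.30, True), (0.30, False), (0.30, False),
    (0.80, True), (0.80, True), (0.80, False),
]

# Group days by the stated forecast probability.
buckets = defaultdict(list)
for prob, rained in observations:
    buckets[prob].append(rained)

# A forecaster is "right" in the calibration sense if, among all the days
# given a 10% chance of rain, it actually rained on roughly 10% of them.
for prob in sorted(buckets):
    outcomes = buckets[prob]
    observed = sum(outcomes) / len(outcomes)
    print(f"forecast {prob:.0%}: rained {observed:.0%} of {len(outcomes)} days")
```

A well-calibrated forecaster’s 10% days should come in raining about 10% of the time; a single rainy day after a 10% forecast isn’t evidence of anything.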

I’m sure that studies have been done on the accuracy of meteorology.

Who does these tests? What are their results? Has meteorology gotten significantly better in the past thirty years or so?

The Weathermen were notoriously inaccurate, often killing bystanders while doing no harm to their targets; sometimes, they even blew themselves up.

Oh, THOSE weathermen. The accuracy of weather predictions depends very much on what area the prediction is for, what time span and how far ahead, what’s being predicted, and who’s predicting it. For instance, here’s some stats for San Francisco, from Forecastadvisor.com (I believe these numbers are for temperature forecasts):

Last Month:
National Weather Service 75.60%
The Weather Channel 74.17%
NWS Digital Forecast 74.17%
AccuWeather 73.33%
CustomWeather 73.33%
Intellicast 69.44%

Last Year:
AccuWeather 81.79%
The Weather Channel 81.32%
CustomWeather 81.17%
National Weather Service 80.53%
NWS Digital Forecast 79.29%
Intellicast 75.61%

And some for Tulsa, OK:

Last Month:
National Weather Service 77.01%
The Weather Channel 72.99%
Intellicast 72.50%
AccuWeather 71.94%
NWS Digital Forecast 71.94%
CustomWeather 71.39%

Last Year:
National Weather Service 71.37%
Intellicast 70.47%
The Weather Channel 70.45%
AccuWeather 66.46%
CustomWeather 66.43%
NWS Digital Forecast 65.52%

How do you measure the accuracy of temperature forecasts in percentage points?

You could conceivably say that if the forecasted high temperature for a day was within some epsilon of the measured high temperature, then that counts as a pass; otherwise it’s a fail. The number of passes divided by the number of days in the reporting period is the accuracy.

Of course, there are many other ways.
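As a concrete example of that pass/fail version, here’s a minimal sketch in Python; the tolerance and the temperatures are made up for illustration and aren’t necessarily what any forecast-rating site actually uses.

```python
# Pass/fail accuracy as described above: a day "passes" if the forecast high
# was within some tolerance (epsilon) of the measured high.
# The temperatures below are invented purely for illustration.

forecast_highs = [41, 38, 45, 52, 50, 47, 39]   # degrees F, one per day
observed_highs = [43, 37, 49, 51, 50, 44, 40]
EPSILON = 3  # degrees of tolerance counted as a "pass"

passes = sum(
    1
    for predicted, actual in zip(forecast_highs, observed_highs)
    if abs(predicted - actual) <= EPSILON
)
accuracy = passes / len(forecast_highs)
print(f"{passes} of {len(forecast_highs)} days within {EPSILON} degrees: {accuracy:.1%}")
```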

I think that’s ultrafilter’s point. The statistics given by Nametag don’t mention their methodology.

Cecil actually covered this topic and mentions some numbers for storm-prediction accuracy. (This one is basically a boolean so it’s easy.)

I would like to see some accuracy statistics compiled for high/low temps.

How far out? Weather prediction is very accurate for 24 hours ahead, something like 70% accurate at 48 hours, and by the time you get to 5-7 days, something like 10% accurate. Those numbers are off the top of my head, but the point is that you’re predicting what a chaotic fluid will do in the future, and accuracy suffers the farther out you try to predict. If you’re testing people’s accuracy with 5-day forecasts, everyone will do terribly. 80% of what my local weatherman says is probably wrong, but that 20% where he predicts tonight’s and tomorrow’s weather is often dead on.

The storm that’s on its way has been predicted to drop 8 to 18 inches of snow. That difference is huge. But with a range that wide, they can declare they were dead on across a large range of precipitation.

Two spots 10 miles apart could have a difference of 10 inches of snowfall. Hugeness happens.

Closeley realted thread from 5 years ago Why are short term weather forecasts still so iffy? - Factual Questions - Straight Dope Message Board

Shame I can’t type well today …
“Closeley realted” should of course be “Closely related”.

The storm is here. In the 50 mile radius that my TV stations cover (100 miles from one end to the other), snowfall ranges from none at all to coming up on 12 inches, with more on the way. Likewise, temperatures right now are between 24 and 34 degrees.

That 50 mile radius translates into an area of 7,854 square miles. Asking anyone for 100% accuracy in that large an area is a tall order.
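(For anyone checking the arithmetic, that figure is just the area of a circle: π × r² = π × 50² ≈ 7,854 square miles.)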

With respect to the last question in the OP: Weather forecasts have become hugely more accurate over the past 40 years or so. It’s still possible for a storm to track differently from what was expected, or for temperatures to be just over rather than just under the freezing mark, resulting in rain rather than snow (or vice versa), but generally it’s clear that meteorologists know what they’re talking about and there’s usually a good explanation if they’re wrong. Not so in the old days. I think that weather satellites in particular have made a big difference.

This is just my subjective impression as someone who has lived through that period; I’d be interested in hearing from someone who knows more about the improvement in weather forecasting.