Just wondering how accurate a 10-day extended forecast could possibly be. I mean, do meteorologists really know that a week from this Wednesday it's going to be totally crappy weather?
How accurate is weather forecasting?
I’m not sure I have any definitive answer, but I suspect that accurate forecasting depends largely on topography and climate. When I lived in Southern California, for instance, it seemed like one could forecast a month in advance and not be too far off. Here in Alaska, day-to-day is iffy and forecasters are always apologizing for completely missing the boat. Of course, we have things like mountains, glaciers and volcanoes wreaking havoc on weather patterns.
5 days.
5 days is roughly the period over which a small error in the chaotic system of the weather doubles (I’m trying to recall the correct name for this period).
It also depends, of course, on what you mean by “accurate”. Predicting general crappy weather for a whole metropolitan area a week in advance is far different from predicting the exact movement of a particularly violent storm cell.
I don’t know much about general meteorology (like what you see on TV), but about a year ago I saw some work done on predicting the time/location of individual wind shears (useful around airports, etc.). As far as I can recall, their computer models made very accurate forecasts about 60 minutes out, and they were in the process of trying to extend that to 120 minutes.
While they’re not hounding civil servants until they commit suicide, The BBC are rather good at all things concerning the weather…
I’m a meteorologist and I have to say that 5 days is about right.
If you’re talking about a fairly low level of detail - like “Partly cloudy during the morning with isolated thunderstorms developing by late afternoon,” you might be able to go out a week with decent accuracy. Of course, the shorter the forecast period, the greater the accuracy in terms of timing and location.
When I was taking undergraduate met classes, a grad student was looking into weather patterns that were more predictable than average. Sometimes a high pressure center sits in one place like an 800 pound gorilla and all the other patterns have to flow around it. It’s easy to forecast sunny, hot and hazy weather under a High. I don’t know if he found any other examples or came up with anything that would help your average forecaster in the field.
Computer models of the weather continue to make gradual progress and we keep analyzing statistics to find out where the models are weak and how we might compensate. But we’re limited by coarse input data and by chaos theory.
What do you make of Bill Giles’ 90 day forecasts then? He was a well known UK weatherman, but I’ve never been too sure of the validity of his long range forecasts.
There’s no archive, so it’s not easy to check their accuracy.
I really can’t say how well he does. My impression is that his score would start out well for the one to two week period, then gradually drop to just above climatology (raw averages) at the end of three months.
He’s not making very radical claims - he says his system isn’t perfect and he “errs on the side of caution.” His approach of combining an ensemble of computer models with climatology makes sense. His time periods are listed by week, although he often breaks a week into sections: early, mid-, or end of the week. It seems that he does this less at the end of the quarter, so in effect he increases his time step later in the period.
Well, I think he’s going about it the right way, but I couldn’t evaluate how well he does without coming up with a criterion for accuracy and checking his track record.
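One common criterion for this kind of evaluation is a skill score against climatology - exactly the “raw averages” baseline mentioned above. A toy sketch, with entirely made-up numbers: compare the forecast’s mean squared error to that of always predicting the long-term average.

```python
# Hypothetical daily high temperatures - illustrative numbers only.
observed = [12.0, 15.0, 9.0, 14.0, 11.0]
forecast = [13.0, 14.0, 10.0, 15.0, 10.0]

# Climatology baseline: just predict the average every day.
climatology = sum(observed) / len(observed)

def mse(pred, obs):
    """Mean squared error between predictions and observations."""
    return sum((p - o) ** 2 for p, o in zip(pred, obs)) / len(obs)

mse_fcst = mse(forecast, observed)
mse_clim = mse([climatology] * len(observed), observed)

# Skill score: 1 = perfect forecast, 0 = no better than climatology,
# negative = worse than just quoting the average.
skill = 1.0 - mse_fcst / mse_clim
print(skill)
```

A long-range forecaster whose skill score drifts down toward zero by the end of the quarter is, in effect, converging on climatology - which matches the impression given a few posts up.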