About that question: is it a universal rule? Do you really have to multiply it by 2.326 to make it accurate 99% of the time? I'm doing VAR calculations for treasury right now. :o
It is for a normal curve. This means: symmetrical, not too "peaked," tails not too fat, etc. Not sure what you mean by "accurate," though.
Normality may or may not be a safe assumption. Even if the population is normal, a small sample may not look normal. If it isn't normal, I'd use a different distribution.
2.326 is the z-score for ~1% in one tail and 99% in the rest. If you want two tails, then the 99% sits in the middle and each tail gets 0.5%, so the z-score is ~2.576, meaning 0.5% of values fall above your range and 0.5% below.
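If you want to sanity-check those numbers yourself, the inverse normal CDF gives them directly. This assumes you have Python with scipy handy, nothing more:

```python
from scipy.stats import norm

# One-tailed 99%: 1% of outcomes lie beyond z in one direction
z_one_tailed = norm.ppf(0.99)    # ~2.326

# Two-tailed 99%: 99% in the middle, 0.5% in each tail
z_two_tailed = norm.ppf(0.995)   # ~2.576

print(f"one-tailed 99%: {z_one_tailed:.3f}")
print(f"two-tailed 99%: {z_two_tailed:.3f}")
```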
Meaning, for daily Value-at-Risk calculations, the VAR calculated as SD x 2.326 over the past 250 trading days should give you a one-day loss level that is only exceeded about 1% of the time. As a result, actual losses should not exceed your VAR estimate more than about 3 times in a year (1% of 250 trading days is roughly 2-3 days). Thanks.
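Just to make that concrete, here's a rough sketch of that parametric (normal) daily VAR from 250 days of returns. The portfolio value and the simulated returns are made up for illustration, and it bakes in the normality assumption others warn about below:

```python
import numpy as np

def parametric_var_99(daily_returns, portfolio_value):
    """99% one-day VaR assuming returns are normally distributed.

    daily_returns: array of ~250 daily returns (0.01 = +1%).
    Returns the loss level that should only be exceeded ~1% of days.
    """
    sigma = np.std(daily_returns, ddof=1)   # sample standard deviation
    z_99 = 2.326                            # one-tailed 99% z-score
    return z_99 * sigma * portfolio_value

# Example with fake data: 250 simulated daily returns, 1m portfolio
rng = np.random.default_rng(0)
fake_returns = rng.normal(loc=0.0, scale=0.01, size=250)
print(parametric_var_99(fake_returns, 1_000_000))   # roughly 23,000
```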
…assuming that stock market valuations form a normal distribution, which is almost certainly not true for, say, individual stocks, which are influenced by company events, and probably isn’t true for the market as a whole, either.
The normal distribution (aka Gaussian) is ubiquitous because of the central limit theorem: if a random variable is the average (or sum) of N independent random variables, its distribution approaches a normal distribution as N gets larger and larger. There are some other conditions (e.g., finite variance), but they are often met.
As others have already said, if a random variable is normally distributed, it will be within one standard deviation of the mean 68% of the time.
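If you want to see both points numerically, a quick simulation does it: average uniform variables (which are clearly not normal themselves) and check the 68% and 99% figures. Just a sketch with made-up sample sizes:

```python
import numpy as np

rng = np.random.default_rng(42)

# Central limit theorem: average N non-normal (uniform) variables
N = 50
samples = rng.uniform(0, 1, size=(100_000, N)).mean(axis=1)

# Fraction within one standard deviation of the mean -> ~0.68
mu, sd = samples.mean(), samples.std()
within_1sd = np.mean(np.abs(samples - mu) <= sd)
print(f"within 1 SD: {within_1sd:.3f}")

# Fraction below mean + 2.326 SD -> ~0.99
below_z99 = np.mean(samples <= mu + 2.326 * sd)
print(f"below mean + 2.326*SD: {below_z99:.3f}")
```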
Government bonds, actually. Yes, statisticians say the historical SD method is obsolete. But it’s still the industry practice. GARCH, SETAR and all those other methods are beyond my skill at the moment. Need to read some more.
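For when you get to that reading: the smallest step beyond a flat historical SD is an exponentially weighted (RiskMetrics-style) volatility, which is essentially a restricted GARCH(1,1). A rough sketch with the usual lambda = 0.94 for daily data, just an illustration, not a recommendation for your actual treasury numbers:

```python
import numpy as np

def ewma_volatility(daily_returns, lam=0.94):
    """Exponentially weighted moving average volatility estimate.

    Recent returns get more weight, so the estimate reacts to
    volatility clustering instead of treating all 250 days equally.
    """
    var = np.var(daily_returns[:30], ddof=1)  # seed window (30 days is arbitrary)
    for r in daily_returns[30:]:
        var = lam * var + (1 - lam) * r ** 2
    return np.sqrt(var)

# Plug the result into the same 2.326 * sigma * portfolio_value formula
```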