Just 'cause the thread is worth the mileage, I’m going to take the opposite point of view.
Asimov, although apparently fully prepared to make a laughing stock of himself in the artificial intelligence community with his “robotic laws,” wasn’t so dim a mathematician as to ignore statistics. As I recall, and I haven’t a fear that I won’t be corrected if I’m wrong, the Foundation books did not hypothesize that human behavior was, in its normal state, predictable. The idea was that if a “Foundation” (get it?) were to change the rules so that unexpected events were suppressed, what was left would be … uh… well… expected.
The idea that big changes in the world order are somewhat predictable isn’t new. Historians have noted that societies go through various familiar cycles. Asimov was suggesting that “familiar” might be manipulated into “predictable”.
This “butterflies can change the world” stuff is real cute, and it lets a bunch of weathermen feel better about their professional overconfidence of 45 years ago. But, dealing with modern realities, human volition stomps the “butterfly effect” flat. It isn’t that random events can’t have a big effect, it’s just that the more we recognize them, the more they fit within our planning. If they’re planned for, they become predictable. Confronted with a chaotic situation? Avoid it. Stop it.
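For anyone who hasn’t seen the “butterfly effect” as actual numbers, here’s a toy sketch. It uses the logistic map (a standard classroom example of chaos, not a weather model), and the parameter and starting values are my own choices for illustration: a perturbation of one part in ten billion blows up to order-one differences within a few dozen steps.

```python
def logistic_trajectory(x0, r=3.9, steps=50):
    """Iterate the logistic map x -> r*x*(1-x), chaotic for r = 3.9."""
    xs = [x0]
    for _ in range(steps):
        xs.append(r * xs[-1] * (1 - xs[-1]))
    return xs

a = logistic_trajectory(0.2)
b = logistic_trajectory(0.2 + 1e-10)  # a "butterfly"-sized nudge

# The two trajectories start indistinguishable, then diverge completely.
gap = max(abs(x - y) for x, y in zip(a, b))
print(f"largest divergence over 50 steps: {gap:.3f}")
```

Which is exactly the point being argued against above: the math says tiny causes *can* have huge effects, but only in systems nobody is steering.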