I often hear that the Treaty of Versailles was one of the main causes of WWII: had it not been so “harsh” toward Germany, the German economy wouldn’t have done so badly and the Nazis would not have come to power on a wave of anger.
For those of you who have studied this topic, how true is this narrative?
Would Hitler have risen to power no matter what the treaty contained? Was his ideology “we are Germans, we are superior, and we must rule the world,” or was it “we are suffering under a terrible set of rules imposed by the Treaty of Versailles, and so we must wage war to free ourselves of that burden”?
If it’s the former, then can one say that WWII would have happened no matter what the treaty contained?
Also, on a related note: Italy and Japan had nothing like the Treaty of Versailles imposed on them, and yet fascism still arose in Italy, and both countries still entered WWII against much of the world.
If fascism and a bid for world domination could arise in Italy, and a superior-race ideology and a bid for world domination could arise in Japan, all around the same time these things were happening in Germany, and without anything like the Treaty of Versailles weighing on either country, why couldn’t the same have happened in Germany even if that treaty had never existed?
As a layperson, I find that the near-unanimous “understanding” that the Treaty of Versailles was one of the main causes of WWII somewhat oversells the link between the treaty’s harsh conditions on Germany and the rise of the Nazis and their attempt at world domination. They may have used it opportunistically to rile up the crowds, but in its absence they would have found other grievances to exploit, just as Italy and Japan did.
But I’m not a history buff or a professional historian, so I’d like to hear what Dopers who are familiar with this era of history have to say.