Germany and Italy declared war on the U.S. after the U.S. declared war on their ally, Japan. I've heard it said on these boards that Germany was, in essence, required to do so by treaty with Japan.
However, I’m beginning to wonder about some of this.
Firstly, Hitler showed us he wasn't above breaking a treaty. He did, after all, violate his non-aggression pact with Stalin by invading the USSR.
That being said, was Japan really providing anything of importance to Germany? Because if not, I can't see why Germany would have wanted to declare war on the U.S. in December 1941. They would have had a much easier time subduing the UK without the U.S. in the war. What would have been the consequences of Hitler telling Japan to take the treaty and stuff it? It's not as if the Japanese were going to open up yet another front and attack Germany.
In the end, I can't see what Hitler gained by declaring war on the U.S. So why did he do it? Or did he really think he was going to beat the U.S.?
Zev Steinhardt