Is the world less racist because of Hitler?

I’ve been having this thought for a while. Maybe Hitler and the Nazi movement actually made the world less racist. Before WWII pretty much everyone was racist and racism was the norm (it was even considered science). From one perspective, the Nazis simply took that to its logical extreme. What made them unique seems to come down to three factors:

  1. They did it in Europe, whereas other nations were usually committing their genocides in Africa, Asia or the Americas.

  2. They used very modernized and industrialized methods. Logical and efficient, but not as romantically appealing as using machine guns to mow down negroes.

  3. They did it to people who look white (gasp! They look… almost… human).

So basically they took racism and showed us what the end result would be if we applied this idea logically to its full extent. And we were repulsed, and came to connect racism (which had been considered perfectly good and normal up until then) with Nazism (which was obviously the baddest thing ever).

So… could you make the argument that in the end, the result of Hitler was less racism rather than more?

I don’t think so, no. America in the 60s and 70s was still pretty damn racist, as were France, the UK…
British comedian Stewart Lee, on the subject of political correctness, often relates the story of how, when he was 5 or 6, the Tories campaigned in his grandfather’s neighbourhood with flyers that said “If you want a nigger for a neighbour, vote Labour”. He was born in '68.

The horror of Hitler helped reduce racism against Jews in Europe & the Americas, and WWII helped start the US, at least, down the road towards equal rights for African Americans, so it did oddly do some good in these areas, though I would say that while things are much better, the racism is not gone.

Before the war France was nearly as anti-Semitic as Germany; after the war both nations did much better. The US & UK were also very anti-Semitic, just not as bad as the other two.

The Army’s integration, on Truman’s orders, was set in motion by the events of WWII and would have come much later without it. Even baseball’s integration was almost certainly a byproduct of WWII to some degree, and it was a watershed moment for the US at least.

This message board needs a Hitler forum.

You can argue almost anything.

You can certainly argue that World War II interrupted a trend towards less racism already under way.

Note that lynching in the US greatly declined during the 1930’s:

http://law2.umkc.edu/faculty/projects/ftrials/shipp/lynchingyear.html

You could argue that the coming of World War II had something to do with the topic of my next link, but that June 1941 executive order (Roosevelt’s Executive Order 8802, barring discrimination in the defense industries) didn’t have anything to do with the not-yet-begun effort to systematically kill all Europe’s Jews:

If the US hadn’t entered World War II, maybe our military would have been completely integrated in the early-to-mid 1940’s rather than later. It is impossible to know.

I’m just focusing here on the US, but I’m thinking that social liberalism had been on the upswing in many countries.

Did revulsion against Hitler’s crimes, in certain countries, discredit milder forms of antisemitism? Probably, but it’s impossible to say for sure, and the effect was likely small.

Someone is probably going to bring up Israel, so I’ll point out that the Balfour Declaration dates from 1917:

However