Having a hard time putting this in the right words, but it seems that in many facets of American life, and especially politics, there is more of a “moral streak” than in the rest of the West. On both the political left and right, a great many debates center on whether something is morally right or wrong (“it’s immoral to support gay marriage,” “it’s immoral NOT to support marriage equality,” “it’s immoral to kill babies via abortion,” “it’s immoral to separate kids from parents at the border”). Every side seems to couch its argument in moral terms and portray its opponents as immoral.
There is also much more intense public scrutiny over sexual issues (for instance, I recall a European once commenting that Bill Clinton’s affair with Lewinsky was an enormous scandal in America, especially on the right, but wouldn’t even have raised eyebrows had he been a European politician) and more of a tendency toward witch hunts.
The rest of the West, on the other hand, seems to focus more on what’s pragmatic or practical, and morality enters less into the equation. This isn’t to say Europeans never talk in moral terms - Europe talks a lot about the morality of helping refugees from Syria, etc. - but there just seems to be much MORE talk about morality from American liberals and conservatives alike.
Is this a spillover effect of American Christianity into American politics and attitudes, one that persists even among American atheists? Or is it something else?