By conservative I mean not just politically, but more in terms of cultural and social attitudes toward sexuality, religion, and the like.
I get the impression, especially here, that when it comes to acceptance of things ranging from mild nudity on television to public breastfeeding to "underage" drinking, the United States is seen as hopelessly puritanical by the rest of the Western world, which seems to believe it is long past hang-ups over such trivial issues of modesty.
The way some people portray it, next to the Netherlands or Sweden the United States seems more akin to the Muslim world in certain aspects of public morality. Are there any other Western countries that have these sorts of "culture wars" over everything from Harry Potter to topless sunbathing?
By the way, I suppose the definition of "Western country" is a debate unto itself…