And not RaleighRally alone.
Then if all cultures suffer from those inherent tribal tendencies, I’m not sure why you bring it up in relation to Western Culture in particular. If anything, the West aims to ensure equality before the law & institutional fairness, so tribal biases are minimised.
:rolleyes: I didn’t, YOU did. I was talking to Qin Shi Huangdi about his claim to be a “Western Cultural supremacist”. Not about other cultures.
You realize this could sound rather xenophobic to a lot of people, right?
Unfortunately, the evidence prevents me from sharing your certitude.
Right, but on that count it hardly seems you could say that Western Culture is worse than other cultures. As I said, if anything, it leads the way in reducing those biases.
And if you look at things like rights for women, the West is hard to beat.
Nor did I say that, either.
Well, it seems to me that in terms of most liberal values Western culture leads the world. So I’m not sure what your beef is.
Der_Trihs:
:rolleyes: Really.
Yes, whether you look at women’s rights, gay rights, or the decline of violence, the West leads the way - doesn’t it?
But, now that social scientists have started to count bodies in different historical periods, they have discovered that the romantic theory gets it backward: Far from causing us to become more violent, something in modernity and its cultural institutions has made us nobler…
Yet, despite these caveats, a picture is taking shape. The decline of violence is a fractal phenomenon, visible at the scale of millennia, centuries, decades, and years. It applies over several orders of magnitude of violence, from genocide to war to rioting to homicide to the treatment of children and animals. And it appears to be a worldwide trend, though not a homogeneous one. The leading edge has been in Western societies, especially England and Holland, and there seems to have been a tipping point at the onset of the Age of Reason in the early seventeenth century.
Chen019:
Yes, whether you look at women’s rights, gay rights, or the decline of violence, the West leads the way - doesn’t it?
Stop ignoring what I’m actually saying just so you can push your xenophobic hobbyhorse.
Well, you don’t seem to disagree that Western culture is superior in terms of liberal values?
Actually, I’d love to hear your answer to Chen’s question. You constantly spout hate and disgust regarding the West; here’s your chance to enlighten us for a change.
I’m not interested in enabling your xenophobic fantasies any more than I am in enabling Chen019 ’s.
I take it then that you concede the point - in terms of liberal values Western culture is superior to other cultures.
Does “Western culture” include Latin American?
More distortions. I’m not “conceding” anything about anything, I’m refusing to play along with you.
Political correctness was not invented in Sweden. It was indirectly imposed on Sweden and all other countries of the Western world by their almighty leader, the USA. Political correctness is a very clever construct. If you oppose it, you will be stigmatized in the classic Marxist way, well known to political dissidents of the former Soviet Union. I will give you an example:
What the fuck is that even supposed to mean? America is leading the Marxist Revolution now??! Jesus Christ.
Yes, it does, according to Wikipedia.
Since the Renaissance, the West evolved beyond the influence of the ancient Greeks and Romans and the Islamic world due to the Commercial,[9] Scientific,[10] and Industrial Revolutions,[11] and the expansion of the Christian peoples of Western European empires, and particularly the globe-spanning empires of the eighteenth and nineteenth centuries. Since the Age of Discovery and Columbus, the notion of the West expanded to include the Americas, though much of the Americas have considerable pre-Western cultural influence. Australia, New Zealand and all countries of Latin America and the West Indies are considered part of Western culture due to their former status as settler colonies of Western Christian nations. Although many parts of Latin America and the West Indies are a blend of European, African and indigenous cultural influence, the same argument can be made for the United States.
Generally speaking, the current consensus would locate the West, at the very least, in the cultures and peoples of Western Europe, Australia, New Zealand and the entire Western Hemisphere…
The term has come to apply to countries whose history is strongly marked by European immigration or settlement, such as the Americas, and Australasia, and is not restricted to Europe.
The Western world, also known as the West, primarily refers to various nations and states in Western Europe,[a] Northern America, and Australasia;[b] with some debate as to whether those in Eastern Europe and Latin America[c] also constitute the West. The Western world likewise is called the Occident (from Latin occidens 'setting down, sunset, west') in contrast to the Eastern world known as the Orient (from Latin oriens 'origin, sunrise, east'). Definitions of the "Western world" vary according to context and perspectives.
That’s fine; you don’t want to concede the point, but you can’t argue it either.
Really? Would you apply the same standard to the United States? Anything that Barack Obama says represents what the entire country thinks, no questions asked?
And why don’t you answer the question that Jonathan Chance asked in post #3?