I don’t really want this to turn into a debate, but I have a feeling it will end up here nonetheless, so I’ll save the Mods some time.
Anyway, I got into a discussion with some friends about the Abrahamic religions and which is the most “pro-western” in outlook. I don’t have nearly enough training in any of the three to adequately show that one is more “western” than the others. Therefore, I turn to you, the minds of SDMB: which religion, scripturally, has the most western slant? I don’t care about how any of the religions are actually practiced or what has been done in the past. If we look solely at the texts of each one, which is most in line with our society?
Of course, I understand that defining “western” will vary among us, so what I’m asking will probably shift from poster to poster. You don’t need to spell out your definition unless you feel it departs dramatically from the popular consensus or it matters to your argument. Western can be whatever you feel it is, as long as there is some support for that within our society.
My gut feeling is that Christianity has to be the most supportive of our western lifestyles, but I don’t have textual support; I’m just going on intuition. Western society has two major influences, Classical Greek thought and Biblical thought, so the way I am defining it, Christianity appears to be an inseparable part of what it means to be western. My friends aren’t convinced by my feats of logic.
Looks like I actually may have turned this into a semi-decent debate…