my disclaimer:
i’m curious about this, so i’d appreciate genuine answers rather than defensive replies from posters who feel i’m attacking their views. i’m just wondering, that’s all…
my question:
aren’t conservative/right-wing political viewpoints just ‘putting off the inevitable’? society, on the whole, seems to be growing increasingly liberal, as can be seen in the feminist movement; the growing unacceptability of racism and homophobia; the legality of abortion; the freedom of religion found in (western) societies; the emancipation of slaves; and so on. overall, there seems to be a shift towards more liberal attitudes in western society, and there is no indication that this trend is going to reverse at any time in the near future.
so, it seems to me that conservatism just hinders the natural progression of society, suppressing attitudes that will undoubtedly flourish regardless.
am i missing something? anyone who would describe themselves as conservative, please tell me.