I was just thinking that now that we’ve reached an… Age of Enlightenment, if you will, we (or many of us) have come to the conclusion that equality is something to be enforced at almost any cost. That is to say, humans are all equal, entitled to the same rights, etc., and race/religion/etc. shouldn’t have anything to do with how we’re treated.
With that in mind, do gender roles still mean anything? Yes, on a practical level men can’t have babies, and women generally don’t have the same physical strength as men, but I mean on a cultural level. In the past, women were considered the weaker sex, suited to raising children and nurturing, while men were considered strong and practical. Today, however, we’re less likely to stereotype, though there are still narrow-minded people around.
When raising kids, is it better to differentiate based on sex, or to give them few, if any, restrictions on what they can do? Personally, I feel that even though there are differences in the way men and women think and behave, this shouldn’t limit the roles people take on. Setting aside things that are physically difficult or downright impossible for one sex or the other (giving birth, for example), in an everyday setting both genders should have equal opportunities.
To broaden the question, what does it mean to be male or female now? I suppose if we were still living back in the day before medical advancements and technology, back in the so-called state of nature, we’d revert to our biological roles. But now that we’re beyond that, to what degree should we differentiate ourselves by sex? (To keep things simple, at least to start with, let’s assume we’re just talking about a more Westernized culture, such as the U.S., in the present.)