OK, let me start by saying that I realize every generation is firmly convinced that the next generation is going to hell in a handbasket (“Those darn kids with their long hair and their Rock & Roll music!”).
Let me also admit that yes, I am a bit of a prude as a result of a strict religious upbringing (I’m no longer religious, but I still happen to believe in the values I was taught even if I don’t accept the religious underpinnings for them). However, I did not grow up in a particularly “sheltered” environment; I was raised in a suburb of a major metropolis.
Having said that, I can’t help wondering: when did everything start going wrong? And by “everything,” I suppose I really mean public displays of behavior that was previously considered “wrong” or “inappropriate” or just plain “taboo.” When did it become acceptable to walk down the street with profanity written on your t-shirt or baseball hat? When did it become acceptable to use profanity on network TV (even words like “ass” and “bitch” were considered major taboos back when I was growing up in the 70s)? When did we start celebrating everything that is base and pathetic about the human condition, à la The Jerry Springer Show and the current crop of “reality” TV shows? For that matter, when did it become acceptable to mention a competitor’s product by name in a commercial instead of simply referring to “brand X” or “another leading brand”?
I don’t think there was ever a time when the world was perfect, and I’m sure the modern world has a lot more to offer than the one in which I grew up. But I just don’t understand how and when things started to get so… crude. Not to mention rude. It seems just about every TV show these days has to revolve around sex. Pop music is laced with profanity. “Fashion” has become borderline pornography.
I have heard numerous theories over the years, and have come up with a few myself. One theory I have heard is that it all started with Richard Nixon and the Watergate scandal (or, perhaps, a few years previously with the Vietnam War and LBJ). This theory states that before that period, people were generally happy to accept that the government was looking out for their best interests. After that period, however, people realized that the government was run by corrupt individuals who would and could lie to the American public to further their own interests. This realization, in turn, led to a general abandonment of many of the “public virtues” that were previously seen as necessary for citizens to live together in harmony.
Another theory states that it all began with the invention of the “pill” and the subsequent Sexual Revolution. Once women could have sex as often as they wanted with no consequences, the theory goes, there was no need for them to act “ladylike” any longer. At the same time, as sex entered the arena of public discourse, advertisers grew bolder and bolder about using sex to sell products. This, in turn, created a whole new generation of people who expected to see graphic sexual imagery all around them.
My own personal theory is that the 70s and 80s were a time when large portions of our society who were previously oppressed finally started to have some real personal freedoms. Women began to have equal rights (not just because of the pill, but also because of better education and job opportunities). Similarly, many of the minorities here in the U.S. (Black, Hispanic, etc.) finally started getting some respect and personal opportunities. This was partially the result of “Affirmative Action” programs, but I also think there was simply a growing realization that everybody deserved a chance to succeed regardless of their background or ethnicity. All these new freedoms are, of course, a wonderful thing, and I’m not trying to suggest that women or blacks or other minorities are responsible for the decline of Western Civilization (just in case you thought I was going there). But I do think that the 70s and 80s were a time of heightened awareness of the whole concept of “personal freedom” and, while this meant that previously oppressed people could have new opportunities, it also meant that people were reluctant to be seen as restricting anybody’s freedom to do or say anything. If a woman should be able to work the same job as a man and earn the same money, and if a black person should be able to go to the same school as a white person and get the same education, all in the name of “personal freedom,” then how can I tell somebody they shouldn’t be allowed to wear whatever they want on a t-shirt or say whatever they want on TV?
I don’t know – maybe we’ve just traded one set of problems (oppression of women and minorities) for another set (promiscuity, public displays of rudeness and immorality, etc.), and maybe things really are better for everybody now than they were 30 years ago. It just doesn’t seem that way to me, however. Of course, maybe that’s because I’m a white male who was never oppressed back when the world seemed like such a pleasant place in which to live.
Any thoughts?
Regards,
Barry