What happened to the centrists? It seems like on Capitol Hill and on TV, they no longer exist, but is this just a case of the Washington and media elite being out of touch with the electorate?
Take me, for example: I believe government has a potentially positive role in the economic lives of Americans, I hold mostly socially liberal views (like abortion being legal and gay marriage), and I believe the government should act to protect the environment. However, I cannot, for the life of me, stand the social justice warriors or the people who oppose all criticism of Islam, who think gender/sex are different things and/or “just constructs,” etc. Or the people who think government should act to get rid of a sports team name like the Redskins (which, IMO, would be the government validating the bullshit view that what happened with the aboriginals was a “genocide”). Hell, government shouldn’t be involved with professional sports at all. I also cannot stand the idea that government seems to be taking up the mantle that there is some sexual assault crisis on college campuses, and that girls who get ultra drunk shouldn’t be responsible for their idiocy, yet a nail polish that detects roofies is “sexist.” Also, while being gay isn’t a bad thing, I don’t think it’s some kind of good thing either.
CJ reform is needed, and yes, some of the laws do hit black communities hard, such as the crack/cocaine sentencing disparity. But there has to be some level of responsibility on the part of black communities and their “leaders” like Sharpton and Jackson. It’s wrong that they preach the idea that it’s OK to not respect the institution of the police. Yes, racist individual cops are wrong, but not the police as an institution.
It’s sad. It seems like if a movie like American Pie (a classic film) were released today, it would be labeled as sexist and shelved. Or that while Family Guy and South Park may still be popular, their longevity is the only reason they’re still allowed on the air (by the Hollywood gatekeepers). Twenty or even ten years ago, only the religious right was so vocal with complaints about “offensiveness” in culture. Now it’s the far left. What has happened to this country? Have W. and Obama, with their shameless pandering to the ideological parts of their constituencies, ruined America?
I see Sanders’ rise as a product of Obama’s far-left pandering. In 2008, he pandered to the SJW and Daily Kos crowd in the primary, and I’ll never forgive him for that, despite agreeing with him on a decent amount of legislation. I don’t like the divisive atmosphere he’s created, which Sanders’ followers seem to like. I like the Clintons, especially for things like the Sister Souljah moment, or the way the Clintons stayed away from the OJ case. I believe Hillary, at heart, is a centrist; she’s only lurching left because of Sanders and the atmosphere Barack Hussein Obama has created, and she’d never have commented on Trayvon Martin if Obama hadn’t done so first.
Not that the GOP is any better. Their leading candidates want to get rid of the people who do the jobs Americans won’t. I’m not a conservative because I think the idea that a fetus or embryo is alive is nuts, and while homosexuality isn’t something to be celebrated, it’s not the fault of the person who is gay, which is why I don’t see eye to eye with the demonization of them. Their economic policies are also a big problem for me, which is why I’d have a tough time voting for them.
Where are the centrists in American politics, the ones who don’t see every issue through the ideological lens of people online and in the media? Are they extant? Are they a silent majority? If they’re not, we need them back.