So a while ago there was a story about Catholic priests raping children, and some of those children came out decades later to tell people they were raped. I don’t think I saw a single right-wing article claiming it was all a fabrication to sully the reputation of religious leaders or members. Now there are countless examples of women coming out in the same fashion, saying they were raped too. But typically, right-wing people go out and say they don’t believe them, asking: why didn’t she say something when it happened? Why would she suddenly accuse someone of rape? She just wants fame, she just wants money, she just wants notoriety. None of these questions or accusations are directed at male victims who come out years later. It is automatically assumed that the women are lying and the men are telling the truth.
Most notably, with the recent police-state, anti-4th-amendment judge being nominated, a woman came out and accused him of raping her when she was 15 while intoxicated. It immediately became a bumper-sticker-politics situation, with people throwing out accusations that she’s a liar with malicious intentions to ruin his career because she’s a libtard feminist leftist. I’m not sure if this is more of a bias against women or just tribalism. If a woman accused Obama of raping her, I’m pretty sure the right would be all for her and would believe her. So it could be either; probably a little of both, I’d imagine. Just wondering what y’all’s opinion would be on this. I never thought about it much, but it seems like the modern refusal to believe female rape victims is a symptom of a rape culture.