I know how awful that sounds. I think that BLM is the most important American movement of the century so far. I've marched and organized with BLM protesters and some of them are outright heroes. I've seen them put themselves in situations of great personal risk in standing up for their ideals. I've seen "white privilege" go from something you couldn't even mention without being branded a pinko-commie to something that actually is part of the public lexicon, and it's amazing. I actually have faith that in the next few years there are going to be major moves towards racial equality, and I'm able to watch and even participate in this massive shift unfolding. Thanks to the BLM movement I've actually got a positive outlook on one aspect of the future.
So here's why I'm just a little bit frustrated when I see politicians and pundits talking about (at long last) white privilege. Not just because a lot of the time they only seem to be paying lip service to it (although that too), but because even though we seem to finally be at a point as a society where we can have a public discussion about white privilege and what it means to be a POC in modern America, there are two other words, equally insidious, that you never seem to hear in the mainstream: male privilege.
Yes, you hear it in some ways in passing. Hillary Clinton has mentioned epidemic college rape statistics, and that's something. As awful as that is, and as necessary as it is to find a remedy, though, that's just a tiny symptom of the big picture. The experiences that men and women have in the modern United States are entirely different, and always have been. You hear pundits talking about poor conditions for women in the Middle East - and in many cases those conditions are terrible, and we really should be discussing them - but what about here at home in our "developed" Western countries? We all heard about the horrifying number of rapes and sexual assaults in Germany recently - but only because the media tied them to immigrants and refugees. You know what wasn't the headline of any of the coverage of that story? That most of the violent criminals involved weren't scary immigrants; they were Germans. That this kind of thing happens all the time and doesn't even make the news, because we're all so desensitized and used to it. How is this not a major, if not the major, issue in contemporary culture?
I hate even thinking this. I don't mean to detract from BLM or any other group or individual trying to make a more egalitarian society. I just feel like the systemic sexism in Western culture is overwhelming and completely underreported. I'm not even talking about the wage gap - that gets mentioned sometimes, but what doesn't get mentioned is that many women can't walk through their own town or city at night without being constantly on guard, or that they can't have a few drinks with someone they met recently without knowing, in the back of their mind, that they have to be careful because if something happened to them they might not even have legal recourse.
Maybe I'm asking for too much. Maybe in a matter of years this will be addressed as the major issue it is. Maybe there will be a solution, through education or a better legal system or something. It just feels frustrating and futile that male privilege and, yes, rape culture are so deeply ingrained in our culture that most people won't even talk about them or acknowledge them.
Sorry for the long wall of text. I'm kind of frustrated.