We've all seen photos online of some woman in public lifting her dress and exposing herself, or sitting in a way where she has to know her genitals are showing, but she doesn't care. Even if children see it, who cares? Little 8-year-old Jimmy tells his dad a woman flashed him, and his dad would most likely congratulate him on seeing his first pussy. We all know how it'd go if Jimmy's sister were flashed by a man. He'd be hunted down.
Or that football player who bragged about losing his virginity at 8, and everyone thought it was awesome. I remember the radio announcer saying that if that had been a woman, there'd be a manhunt for the guy who had sex with her.
I've even seen men brag about being raped as kids by women, or as they call it, "having sex." You'd be hard-pressed to find a woman bragging about the 30-year-old who took advantage of her when she was 12.
Not to mention how the media glorifies rape or sexual abuse, just as long as the female isn't the victim. That's My Boy, for example. Had that movie been about a male teacher, no one would have watched it.
Are men really just a bunch of horny pigs, or is there some deeper reason behind this phenomenon?
I'm not asking about laws; I'm asking about ethics and morals. If you want to argue about laws, have at it, but I won't reply.