Society makes such a big deal out of guys' cocks. Guys act like it's the most important part of our bodies, the law makes it illegal the moment someone sees it, and the media will happily show a fully naked woman but avoid showing a cock...
For some reason I get really turned on by acknowledging the truth. It's just an extra bit of flesh hanging off my body that happens to feel nice to touch.
I enjoy hearing women tell me how my cock is nothing special, nothing particularly important. Hearing about how she wouldn't care if I was walking around naked, everything hanging out. Hearing how it doesn't mean anything if it gets hard, except for how much it amuses her. Hearing about how losing it wouldn't be anything I should worry about, because it's not like it's important, no more serious than neutering a dog. It's just a bit of extra flesh, spare meat.
This isn't about size humiliation; it's about the idea of men being disposable, easily replaced with a good vibrator.
Message if you're interested in chatting a bit :)
Post Details
- Posted: 4 years ago
- Subreddit: reddit.com/r/dirtypenpal...