This is pretty straightforward, but I'm curious what women in America think about it. I've told women during a hookup, before the clothes all came off, and I've also not mentioned anything. A few have liked knowing in advance, and others have found it a little too much, but there have definitely been a few women who straight up told me they weren't sure what to do with it and wished they had known. And somehow a few didn't even realize until after multiple hookups.
I know there isn't any one answer here, but would you rather have someone tell you beforehand or find out on your own, especially if it was something new to you?