This post has been de-listed
It is no longer included in search results and normal feeds (front page, hot posts, subreddit posts, etc.). It remains visible only via the author's post history.
I know all about indecent exposure laws. But my question is “why are genitals actually considered indecent?” What is the shame behind it? At what point did we decide that that part of our body was inappropriate to show in public or to others? The whole concept just seems a little archaic and overly conservative to me. Why were arms not considered indecent? Why were legs not considered indecent? Why is it not indecent to show your face? (I suppose these are, in some cultures.) But the U.S. is supposed to be new, forward-thinking, worldly. So why such a stigma against nudity?
Post Details
- Subreddit: r/legaladvice
- Posted: 10 months ago
- Reddit URL: reddit.com/r/legaladvice...