In the past couple years, I've noticed a massive (and bizarre) shift against sandals and the human foot in general. Barely anybody our age in the US wears sandals.
People are always making jokes about "dogs", because calling a foot a foot isn't a thing anymore. They'll look at you askance, if not outright mock you, for wearing sandals in the heat of summer. I don't understand why; I've been living in the tropics, where sandals are perfectly acceptable.
I find closed shoes uncomfortable at best, especially when it's scorching hot outside. It's not a fetish thing. I don't understand the stigma, and I'm tired of feeling self-conscious for wearing what makes me happy.
Could anyone please explain? Thank you.
Women wear sandals all the time where I'm from, so I'm assuming this is a man asking why there's a stigma against men wearing sandals. I'm not sure where you're from, but I haven't seen a decline in men wearing sandals. I do think there's a belief or stereotype that all men's feet are ugly, which isn't true. Sandals and going barefoot are pretty commonplace for Gen Z men where I'm from. That said, I can see how those things wouldn't be common in other areas, and how the stereotype that men's feet are ugly plays a role in shoe choice. Just keep your toenails clipped and wear the sandals. Who cares what everybody wears or what anybody thinks.