Hey there everyone, I'm not trying to start an argument or anything, just curious. All of the anecdotal evidence in my life leads me to the conclusion that men's mental health isn't taken seriously at all. Obviously women's mental health is fucked up as well, but people's desire to fix it feels different to me. I've been trying to get help for my depression and anxiety for about 7 straight years now and my doctors don't listen to me. Most of the guys I know with similar issues get talked down to and silenced as well. It's like the doctors honestly don't believe we have depression. You know how I know that? Because the doctors have said it to my fucking face. I remember staying up all night with suicidal thoughts before a doctor's appointment the next day. My doctor said, "I think you're just lazy and you aren't trying to get better." I broke down in her office and she finally believed me. Most of the guys I know have had similar experiences. Meanwhile, most of my female friends who have dealt with this seem to get more help and support from their doctors. It's really disheartening.
So do you think we are silenced as men? I'm sure women have similar problems with it, but man, does it feel lopsided some days. I don't even want to talk to doctors anymore because it feels like a guarantee they won't believe me or listen to my concerns. Again, I'm not trying to say one is worse than the other, but it feels like there's a lack of sympathy and empathy towards men.
Am I just being crazy? Let me know if you've had similar experiences, on either side. I'd like to hear from the women's side as well.