I know the basic and obvious answer is that they are strongly influenced by American politics, but it still strikes me as odd that so many anti-Western Muslims use words like "liberal" as a slur meaning, specifically, people with socially progressive views.
For context: only in America is "liberal" associated with the left in any way. Outside our little bubble, liberal tends to mean classical liberalism: limited regulation, pro-private property, and so on. In more academic contexts, liberal usually means pro-capitalism, modern philosophy (the Cartesian ego and all that), etc. (This is vastly oversimplified, but hopefully enough to convey the point.)
Of course, I have mostly only seen this on reddit, which might hint at the American bias at work here, but why are people who are so proud of their Muslim heritage and tradition so entrenched in Western frames of discourse that they unironically rail against "liberals" like any other American conservative? They might as well put away their Qur'an and pick up a Bible, wallahi.
Do we not have a rich history to look towards? One of scholarship and study, where men and women could stand as equals in learning? Where equality of mind was counterbalanced by the difference of bodies (in gender relations)? Where gender and sexuality at one time did not abide by repressive Western standards of "man strong, woman weak, and only those two"?
People have spent so much time fighting the West that they have become the West.