I feel like this has come up a few times before, but I don't want to spend forever searching for it.
The general feeling among Americans is that WWII couldn't have been won without us. However, I have heard a few times that this feeling isn't shared by Europeans, who believe that, at least in Europe, the tide had already begun to turn, and that the US simply sped up the process.
My friend believes I'm confusing this with WWI, but he's also American, so I'm not sure he's heard all perspectives, since history is all about who's telling it.
Can someone help?