This is more of a historical question, but I was curious about Germany's position on the international stage: it was devastated in WWII, yet today it is an internationally accepted global power with enormous industrial capacity. What has Germany done to overcome the stigma and the scarred relations with nations such as France, the U.S., and other European countries? How has it shed the Nazi image and become accepted by virtually every government? How has it come to be the leader of the Eurozone, and is it living up to that role? Or is it still hesitant to take up that mantle, given the effort required to maintain such a position with problem countries like Greece or Spain?
In terms of public-image history, possible focuses could include the nation's people, education, and culture. For the political/economic question, possible focuses could be German foreign and domestic policy.