German nationalism

German nationalism is the idea that Germans are a nation and that their cultural unity should be promoted. Its earliest origins lie in the birth of Romantic nationalism during the Napoleonic Wars, when Pan-Germanism started to rise. Advocacy of a German nation became an important political force in response to the invasion of German territories by France under Napoleon. After the rise and fall of Nazi Germany during World War II, German nationalism has generally been viewed within the country as taboo. However, during the Cold War a German nationalism re-emerged in support of the reunification of East and West Germany, which was achieved in 1990.
