8 December, 18:15

What happened to German territory in the east after WWI?

Answers (1)
  1. 8 December, 22:09
    Germany lost World War I. In the 1919 Treaty of Versailles, the victorious powers (the United States, Great Britain, France, and other allied states) imposed punitive territorial, military, and economic provisions on defeated Germany ... In the east, Poland received parts of West Prussia and Silesia from Germany.