29 April, 07:10

What happened when the First World War ended?

Answers (2)
  1. 29 April, 08:49
The fighting stopped with the Armistice of 11 November 1918, while the terms of peace were negotiated.
  2. 29 April, 10:10
Several nations gained or regained territory or independence after World War I. France, for example, regained Alsace-Lorraine from the German Empire and took over various German colonies in Africa as well as Middle East territories from the Ottoman Empire; the African and Middle East gains were officially League of Nations mandates.