5 August, 07:43

What was the role of America after WWI?

Answers (1)
  1. 5 August, 07:59
Under President Woodrow Wilson, the United States remained neutral until 1917 and then entered the war on the side of the Allied powers (the United Kingdom, France, and Russia). After the war, the Senate rejected the Treaty of Versailles and the United States never joined the League of Nations, turning toward isolationism in the 1920s even as it emerged as a leading economic power. The experience of World War I had a major impact on US domestic politics, culture, and society.