History
Jason
5 August, 07:43
What was the role of America after WWI?
Answers (1)
Zack Valenzuela
5 August, 07:59
Under President Woodrow Wilson, the United States remained neutral until 1917 and then entered the war on the side of the Allied powers (the United Kingdom, France, and Russia). The experience of World War I had a major impact on US domestic politics, culture, and society.