History
Jason
5 August, 07:43
What was the role of America after WWI?
Answers (1)
Zack Valenzuela
5 August, 07:59
Under President Woodrow Wilson, the United States remained neutral until 1917 and then entered the war on the side of the Allied powers (the United Kingdom, France, and Russia). The experience of World War I had a major impact on US domestic politics, culture, and society.