11 October, 05:12

How did World War I change the lives of American women?

Answers (1)
  1. 11 October, 06:06
    World War I changed American life socially, politically and economically. The war affected almost every aspect of society, particularly women, workers and minorities. With so many men serving overseas, women took on jobs in factories, offices and farms that had previously been held by men, and their wartime contributions strengthened the push for suffrage. The American public also felt a strong sense of nationalism and patriotism during the war, as the country was unified against a foreign threat.