25 December, 04:44

How did life change for women in the United States after World War I started?

Answers (2)
  1. 25 December, 05:53
    Because many men were sent off to war, women took over the men's positions in the workplace, and after the war women continued to hold more non-traditional roles at work.
  2. 25 December, 06:29
    It changed women's lives a great deal. During the war, most men were drafted, which opened up many jobs for women. Women could now volunteer for or work with the Marines, Army, and Navy, and take the jobs men had left behind. Their lives changed because the war gave them jobs and other opportunities.