23 November, 03:43

How did World War 1 change women's roles in the United States

Answers (2)
  1. 23 November, 06:14
    It gave them more jobs and a chance to be more than just a "stay-at-home mom": they got to work in factories, taking over "men's" jobs.
  2. 23 November, 06:54
    World War One changed women's roles by giving them the ability to get jobs other than working at home. Women not only gained some of the same rights as men but were also able to work if they wanted to. Typically, women had been expected to stay at home and handle household tasks such as cooking, cleaning, and caring for the children, but the war opened opportunities for women to get jobs and do things outside the typical household.