27 May, 07:14

How did World War I change women's roles in the United States?
a) Women received greater educational opportunities
b) Women fought alongside men in the military
c) Women replaced men in the workforce
d) Women earned more money than men

Answers (1)
  1. 27 May, 10:44
    c) Women replaced men in the workforce

    Explanation:

    With millions of men drafted into military service during World War I, American women took over jobs in factories, offices, and other industries that men had previously held, marking a major shift in women's economic roles.