Ask Question
20 September, 05:58

How did women's roles in countries such as the United States and Britain change after World War I? Check all that apply.

Answers (1)
  1. 20 September, 08:21
    Society became more open, and women experienced greater freedom.

    Women began to seek out new careers.

    Women challenged old traditions by doing things such as changing their clothing style.