By the 1960s, women in American society

Answers (1)
  1. 7 May, 15:08
    In the 1960s, deep cultural changes were altering the role of women in American society. More women than ever were entering the paid workforce, which heightened their dissatisfaction with the large gender gaps in pay and advancement, as well as with sexual harassment in the workplace.