18 January, 19:30

How have changes in women's rights affected American society? Consider the family structure, economic health, and the strength of the workforce. Use specific moments in history as examples. In your opinion, has greater gender equity improved American society? Consider how the changing role of women has changed the identity of American society in the eyes of Americans and of the world.

(answer has to be at least a paragraph)

Answers (1)
  1. 18 January, 20:39
    In the early 1900s, women had almost no rights at all, from being unable to vote to being discouraged from holding jobs outside the home alongside their husbands. That changed on August 18, 1920, when the 19th Amendment, which Congress had passed the year before, was ratified and granted women the right to vote. The country changed that day, and women everywhere were overjoyed. Many men, however, disagreed with the amendment; they felt they were the dominant gender and that women were not supposed to have those rights. Today women make up about fifty percent of the U.S. electorate, roughly 162 million women.