27 June, 18:33

What social changes took place in the United States after World War II? What role did the war play in those changes?

Answers (1)
  1. 27 June, 20:13
    Two notable social changes took place in the United States after World War II, and largely as a result of it. The first was that women gained far more opportunities, especially in the workforce, which brought them greater economic independence and an improvement in their rights. The second was that people of other races, especially African Americans, likewise gained rights and many more opportunities, again particularly in the workforce. The main reason this happened was that so many men had been sent off to war, creating a severe labor shortage. For the economy to keep running and growing, employers began hiring the workers who were available in large numbers: women and African Americans.