27 April, 18:16

What did the war do to the relationship between the American colonies and England?

Answers (1)
  1. 27 April, 20:49
    The war broke the relationship: the American colonies eventually became independent of England, creating the United States of America and establishing their own laws.