7 June, 10:25

How did America get Florida?

+5
Answers (1)
  1. 7 June, 11:37
    0
    In 1763, Britain, France, and Spain signed the Treaty of Paris, under which Britain gained the Florida Territory. When Britain formally recognized the colonies' independence as the United States in 1783, Florida was returned to Spain without its boundaries being clearly defined. The United States finally acquired Florida through the Adams-Onís Treaty of 1819, in which Spain ceded the territory to the United States; the treaty took effect in 1821, and Florida became a state in 1845.