17 April, 05:35

Why did many Americans feel it was important for the United States to gain control of Florida?

Answers (1)
  1. 17 April, 06:11
    Florida had been ruled by Britain and then by Spain, and foreign control of it worried many Americans: enslaved people escaping Southern plantations found refuge there, and slave ships continued to bring captives in through Spanish Florida even after the United States banned the foreign slave trade in 1808. Gaining control of Florida would close off that refuge and secure the nation's southern border.