15 June, 08:41

Hawaii became a US territory after?

A. American planters overthrew the Hawaiian government.
B. the United States seized the islands from Spain.
C. the United States bought the islands from the royal family.

Answers (1)
  1. 15 June, 09:18
    The correct answer is A. American planters overthrew the Hawaiian government.

    Explanation:

    The territory of Hawaii was officially annexed to the U.S. in 1898. This was possible only because Hawaii's previous government, a monarchy, had been overthrown. The overthrow occurred in 1893, when planters on the islands, chiefly American planters along with some natives, organized to end the monarchy and force Queen Liliuokalani from the throne because they disagreed with her actions. After the monarchy ended, a new government was established in Hawaii, and some years later the U.S. annexed the territory, which was considered a strategically valuable location in wartime. Thus, Hawaii became a US territory after American planters overthrew the Hawaiian government.