18 November, 05:44

Which event finally brought the United States into World War II?

a. Japan's attack on Pearl Harbor

b. Germany's invasion of France

c. Britain's attack on Gibraltar

d. Italy's invasion of Greece

Answers (1)
  1. 18 November, 06:34
    The answer is "a. Japan's attack on Pearl Harbor." This was a direct attack on a United States military installation and therefore an indisputable act of war, and it finally brought the United States into World War II.