13 July, 14:31

How did the United States end up at war with Germany in 1941?

A. The United States declared war on Germany after the London Blitz.
B. After America came to Britain's aid with the "Lend-Lease" program, Germany declared war on the U.S.
C. After the U.S. declared war on Japan, Germany declared war on the U.S.
D. The U.S. declared war on Germany and Italy at the same time it declared war on Japan.

Answers (2)
  1. 13 July, 15:10
    C. After the U.S. declared war on Japan, Germany declared war on the U.S.
  2. 13 July, 18:20
    C. After the U.S. declared war on Japan, Germany declared war on the U.S.

    Explanation:

    After the Japanese attack on Pearl Harbor and the United States' declaration of war against the Japanese Empire, Nazi Germany declared war on the United States on December 11, 1941, in response to what it claimed were a series of provocations by the U.S. government.