6 April, 08:17

What did "the West" mean to Americans in the 1800s?

Answers (1)
  1. 6 April, 11:17
    To Americans in the 1800s, "the West" meant the area between the Appalachian Mountains and the Mississippi River, which they regarded as the western frontier.