11 February, 22:35

How did the definition of west change in the years 1800 to 1860?

Answers (1)
  1. 12 February, 01:11
    The "West" originally meant any land between the Appalachian Mountains and the Mississippi River. After the Louisiana Purchase in 1803, the West came to mean the new territory beyond the Mississippi and north of the Spanish holdings to the south (Mexico, Texas, etc.). By the time of the Civil War, the United States extended to the Pacific, and its territory covered most of what we now recognize as the continental USA.