Western United States



The Western United States, commonly referred to as the American West or simply the West, traditionally refers to the region comprising the westernmost states of the United States. Because European-American settlement expanded westward after the country's founding, the meaning of the West has evolved over time. Prior to about 1800, the crest of the Appalachian Mountains was seen as the western frontier.

Since then, the frontier has generally moved westward, and eventually lands west of the Mississippi River came to be referred to as the West. Though no consensus exists, even among experts, on the definition of the West as a region, this article adopts the U.S. Census Bureau's definition: the 13 westernmost states, stretching from the Rocky Mountains and the Great Basin to the West Coast, plus the outlying states of Alaska and Hawaii.

The West contains several major biomes. It is known for its arid to semi-arid plateaus and plains, particularly in the American Southwest; forested mountains, including the major ranges of the Sierra Nevada and the Rocky Mountains; the long shoreline of the Pacific Coast; and the temperate rainforests of the Pacific Northwest.
