American West


The Western United States, also referred to as the American West or simply The West, traditionally refers to the region comprising the westernmost states of the United States (see the geographical terminology section for further discussion of these terms). Because the United States has expanded westward since its founding, the definition of the West has evolved over time.

The "West" had played an important part in American history; the Old West is embedded in America's folklore.
