American West

The Western United States, also referred to as the American West or simply The West, traditionally refers to the region comprising the westernmost states of the United States (see the geographical terminology section for further discussion of these terms). Because the United States has expanded westward since its founding, the definition of the West has evolved over time.

The "West" had played an important part in American history; the Old West is embedded in America's folklore.