Western

/ˈwɛstərn/

Meaning & Definition

noun
A film genre that tells stories set in the American West, typically featuring cowboys, gunfights, and Native Americans.
He loves watching classic westerns starring John Wayne.
A cattle rancher or farmer in the Western United States.
The westerns of the region often gather for the annual rodeo.
adjective
Situated or lying toward or in the west.
The sun sets in the western sky.
Relating to, characteristic of, or situated in the western part of a region or country.
The western states of the USA are known for their diverse landscapes.
Associated with the cultural, political, or social values of Western countries, particularly those in Europe and North America.
She is interested in Western literature and philosophy.

Etymology

Middle English, from Old English westerne, from west + -erne (as in eastern).

Common Phrases and Expressions

western civilization
The cultural and political heritage of the Western world.
western wear
Clothing style associated with American cowboy culture.
western bloc
The group of countries aligned with the United States and NATO during the Cold War.

Related Words

west
The direction opposite to east; a cardinal point.
westward
Toward the west.

Slang Meanings

A term used informally to refer to American cowboy culture.
He's got that real western vibe, like he's fresh off a ranch.
Used to describe something that embodies Western ideals or lifestyles.
The new restaurant has a really western feel with its décor.