the Far West
/ðə ˌfɑː ˈwest/
/ðə ˌfɑːr ˈwest/
- the westernmost states of the US. This usually means those on the Pacific Ocean (California, Oregon and Washington), but some Americans say the Far West begins with the Rocky Mountain States.