Definition of the Far West from the Oxford Advanced Learner's Dictionary

the Far West

/ðə ˌfɑː ˈwest/
/ðə ˌfɑːr ˈwest/
1. the most western states of the US. This usually means the states on the Pacific Ocean (California, Oregon and Washington), but some Americans say the Far West begins with the Rocky Mountain states.