
Definition of the Far West from the Oxford Advanced Learner's Dictionary

the Far West
the most western states of the US. This usually means those on the Pacific Ocean (California, Oregon and Washington), but some Americans say the Far West begins with the Rocky Mountain States.