American English

Definition of western noun from the Oxford Advanced American Dictionary

western

noun
NAmE//ˈwɛstərn//

a movie or book about life in the western U.S. in the 19th century, usually involving cowboys
See the Oxford Advanced Learner's Dictionary entry: western