Definition of western noun from the Oxford Advanced Learner's Dictionary

western

noun
/ˈwestən/
/ˈwestərn/
  1. a film or book about life in the western US in the nineteenth century, usually involving cowboys
    Culture: Westerns involve guns, horses and often Indians (Native Americans). They are popular because they represent the traditional struggle between good and bad, often in a simple but exciting way. Famous western films include High Noon and Shane. Western television series have included Gunsmoke and Bonanza.
    See also: spaghetti western
    Topics: Film and theatre (B1)
    Word Origin: Old English westerne (from west and -ern).
See western in the Oxford Advanced American Dictionary