
Definition of the Wild West noun from the Oxford Advanced Learner's Dictionary

the Wild West

noun
 
/ðə ˌwaɪld ˈwest/
[singular]
  1. the western states of the US during the years when the first Europeans were settling there, used especially when you are referring to the fact that there was not much respect for the law there
    Culture: This is the period shown in western films, though the picture they present of the Wild West is often not very accurate. Towns that were known for their outlaws (= criminals) and violence included Tombstone, Arizona, and Dodge City, Kansas. Famous outlaws included Jesse James and his brother Frank, Billy the Kid and the Younger brothers.
    Compare: the Old West