Definition of the Wild West noun from the Oxford Advanced Learner's Dictionary

 

the Wild West

noun
[singular]
 
the western states of the US during the years when the first Europeans were settling there, used especially when you are referring to the fact that there was not much respect for the law there

Culture
This is the period shown in westerns, though the picture they present of the Wild West is not often very accurate. Towns that were known for their outlaws (= criminals) and violence included Tombstone, Arizona, and Dodge City, Kansas. Famous outlaws included Jesse and Frank James, Billy the Kid and the Younger brothers.

compare Old West
See the Oxford Advanced American Dictionary entry: the Wild West