Definition of the Wild West noun from the Oxford Advanced American Dictionary
the Wild West noun [singular]
the western states of the U.S. in the late 19th century, used especially to refer to the fact that there was not much respect for the law there