the Old West
/ði ˌəʊld ˈwest/
- a phrase used to refer to the western parts of America in the 19th century, when white people first settled there. Compare Wild West