the Old West
/ði ˌəʊld ˈwest/
- a phrase used to refer to the western parts of America in the 19th century, when white people first settled there. Compare Wild West