
Definition of (the) West Indies noun from the Oxford Advanced American Dictionary

 

(the) West Indies

noun
NAmE /ˌwɛst ˈɪndiz/
 
[singular]
 