Definition of the West Indies noun from the Oxford Advanced American Dictionary

 

the West Indies

noun
NAmE//ˌwɛst ˈɪndiz//
 
[plural] (abbreviation WI)
 
a group of islands between the Caribbean and the Atlantic that includes the Antilles and the Bahamas
 
See the West Indies in the Oxford Advanced Learner's Dictionary
