Definition of the West Indies noun from the Oxford Advanced American Dictionary

the West Indies

noun
NAmE /ˌwɛst ˈɪndiz/
[plural] (abbreviation WI)
a group of islands between the Caribbean and the Atlantic that includes the Antilles and the Bahamas
West Indian adjective
West Indian noun
