Definition of Namibia noun from the Oxford Advanced American Dictionary

Namibia

noun
NAmE /nəˈmɪbiə/
[singular]
a country in southern Africa
See the Oxford Advanced Learner's Dictionary entry: Namibia