Definition of Germany noun from the Oxford Advanced American Dictionary


Germany

noun
NAmE//ˈdʒərməni//
[singular]
a country in central Europe