
Definition of Japan noun from the Oxford Advanced American Dictionary


Japan

noun
NAmE //dʒəˈpæn//
[singular]
a country consisting of a group of islands in eastern Asia
See the Oxford Advanced Learner's Dictionary entry: Japan