American English

Definition of France noun from the Oxford Advanced American Dictionary


France

noun
NAmE /fræns/
[singular]
a country in western Europe. See also: French
See the Oxford Advanced Learner's Dictionary entry: France