American English

Definition of dermatology noun from the Oxford Advanced American Dictionary


dermatology

noun
NAmE//ˌdərməˈtɑlədʒi//
[uncountable]
the scientific study of skin diseases
dermatological
NAmE//ˌdərmət̮lˈɑdʒɪkl//
adjective
See the Oxford Advanced Learner's Dictionary entry: dermatology