Definition of Western medicine noun from the Oxford Advanced American Dictionary

Western medicine

noun
[uncountable]
the type of medical treatment that is standard in Europe and N. America and that relies on scientific methods
the drugs used in Western medicine