
Definition of Western medicine noun from the Oxford Advanced Learner's Dictionary

Western medicine

noun
/ˌwestən ˈmedsn/, /ˌwestən ˈmedɪsn/, /ˌwestərn ˈmedɪsn/
[uncountable]
  1. the type of medical treatment that is standard in Europe and North America and that relies on scientific methods
    • the drugs used in Western medicine