Western medicine
noun
/ˌwestən ˈmedsn/, /ˌwestən ˈmedɪsn/
/ˌwestərn ˈmedɪsn/
[uncountable] the type of medical treatment that is standard in Europe and North America and that relies on scientific methods
- the drugs used in Western medicine