the type of medical treatment that is standard in Europe and North America and that relies on scientific methods
"the drugs used in Western medicine"
Definition of Western medicine noun from the Oxford Advanced American Dictionary