Definition of fascism noun from the Oxford Advanced American Dictionary



(also Fascism) noun
an extreme right-wing political system or attitude that is in favor of strong central government and that does not allow any opposition
See the Oxford Advanced Learner's Dictionary entry: fascism