Definition of feminism noun from the Oxford Advanced American Dictionary
feminism

noun
NAmE /ˈfɛməˌnɪzəm/
[uncountable]
the belief and aim that women should have the same rights and opportunities as men; the struggle to achieve this aim
See the Oxford Advanced Learner's Dictionary entry: feminism