Definition of the New Right (noun) from the Oxford Advanced American Dictionary

the New Right

noun [singular]
politicians and political groups that support conservative social and political policies and religious ideas based on Christian fundamentalism