Definition of naturalism noun from the Oxford Advanced American Dictionary

naturalism

noun
NAmE /ˈnætʃrəˌlɪzəm/, /ˈnætʃərəˌlɪzəm/
[uncountable]
  1. a style of art or writing that shows people, things, and experiences as they really are
  2. (philosophy) the theory that everything in the world and life is based on natural causes and laws, and not on spiritual or supernatural ones
See the Oxford Advanced Learner's Dictionary entry: naturalism