Definition of social work noun from the Oxford Advanced American Dictionary

social work
noun
[uncountable]
paid work that involves giving help and advice to people living in the community who have financial or family problems
See the Oxford Advanced Learner's Dictionary entry: social work