American English

Definition of the Deep South noun from the Oxford Advanced American Dictionary

the Deep South

noun
[singular]
the southern states of the U.S., especially Georgia, Alabama, Mississippi, Louisiana, and South Carolina
See the Oxford Advanced Learner's Dictionary entry: the Deep South