Definition of the Deep South noun from the Oxford Advanced Learner's Dictionary

 

the Deep South

noun
[singular]
 
the southern states of the US, especially Georgia, Alabama, Mississippi, Louisiana and South Carolina

Culture
They are among the states that once had slaves and left the Union during the American Civil War. They still have racial problems, and the people there are mostly conservative (= opposed to much change) in their politics and religion.
See the Oxford Advanced American Dictionary entry: the Deep South