the Deep South
noun
/ðə ˌdiːp ˈsaʊθ/
[singular] the southern states of the US, especially Georgia, Alabama, Mississippi, Louisiana and South Carolina
Definition of the Deep South noun from the Oxford Advanced Learner's Dictionary