the Deep South
noun /ðə ˌdiːp ˈsaʊθ/
[singular] the southern states of the US, especially Georgia, Alabama, Mississippi, Louisiana and South Carolina.
Culture
They are among the states that once kept people as slaves and left the Union during the American Civil War.