the Deep South
noun
/ðə ˌdiːp ˈsaʊθ/
[singular] the southern states of the US, especially Georgia, Alabama, Mississippi, Louisiana and South Carolina
Culture
They are among the states that once kept people as slaves and left the Union during the American Civil War.