Definition of the Deep South noun from the Oxford Advanced Learner's Dictionary

the Deep South

noun
 
/ðə ˌdiːp ˈsaʊθ/
 
/ðə ˌdiːp ˈsaʊθ/
[singular]
  1. the southern states of the US, especially Georgia, Alabama, Mississippi, Louisiana and South Carolina
    Culture: These are among the states that once kept people as slaves and left the Union during the American Civil War.
See the Deep South in the Oxford Advanced American Dictionary