
Definition of the West Coast noun from the Oxford Advanced Learner's Dictionary

the West Coast

noun
/ðə ˌwest ˈkəʊst/
[singular]
  1. the states on the west coast of the US, especially California
    Culture: To many people, the West Coast suggests a place where the sun shines most of the time, where the people have a relaxed way of life and often invent or follow new fashions, particularly those involving physical fitness or psychology.
See the West Coast in the Oxford Advanced American Dictionary