Definition of the West Coast noun from the Oxford Advanced Learner's Dictionary

the West Coast

noun [singular]
the states on the west coast of the US, especially California

Culture: To many people the West Coast suggests a place that has sunny weather most of the time, where the people have a relaxed way of life and often invent or follow new fashions, particularly those involving physical fitness or psychology.
See the Oxford Advanced American Dictionary entry: the West Coast