Definition of the West Coast noun from the Oxford Advanced American Dictionary

the West Coast
noun [singular]
the states on the west coast of the U.S., especially California
See the Oxford Advanced Learner's Dictionary entry: the West Coast