Definition of the West Coast noun from the Oxford Advanced American Dictionary

 

the West Coast

noun
[singular]
 
the states on the west coast of the U.S., especially California