the West Coast
noun /ðə ˌwest ˈkəʊst/
[singular] the states on the west coast of the US, especially California
Culture: To many people the West Coast suggests a place where the sun shines most of the time, where the people have a relaxed way of life and often invent or follow new fashions, particularly those involving physical fitness or psychology.