
Definition of west noun from the Oxford Advanced American Dictionary

      

west
noun
NAmE /wɛst/
[uncountable, singular] (abbreviation W., W)
     
  1. (also the west) the direction that you look toward to see the sun go down; one of the four main points of the compass: Which way is west? Rain is spreading from the west. He lives to the west of (= further west than) the town. compare east, north, south
  2. the West the countries of North America and western Europe: I was born in Japan, but I've lived in the West for some years now.
  3. the West the western side of the U.S.: the history of the American West. see also the Midwest, the Wild West
  4. the West (in the past) Western Europe and North America, when contrasted with the Communist countries of Eastern Europe: East-West relations
See the Oxford Advanced Learner's Dictionary entry: west