Definition of Hollywood noun from the Oxford Advanced American Dictionary

Hollywood

noun
NAmE /ˈhɑliˌwʊd/
[uncountable]
the part of Los Angeles where the movie industry is based (used to refer to the U.S. movie industry and the way of life that is associated with it)