Definition of The Walt Disney Company from the Oxford Advanced Learner's Dictionary

The Walt Disney Company

 
/ðə ˌwɔːlt ˈdɪzni kʌmpəni/
(also Disney)
1. a large US company started in 1923 by Walt Disney, which is best known for its animated children's films. Today it owns a number of film companies, including Walt Disney Pictures and Touchstone Pictures, produces toys and children's books, runs Disneyland theme parks in the US and other countries, and also owns a number of American television networks, including ABC.