Definition of The Walt Disney Company from the Oxford Advanced Learner's Dictionary

The Walt Disney Company

(also Disney)
a large US company, started in 1923 by Walt Disney, that is best known for its animated children's films. Today it owns a number of film companies, including Walt Disney Pictures and Touchstone Pictures, produces toys and children's books, runs Disneyland theme parks in the US and other countries, and also owns a number of American television networks, including ABC.