The Walt Disney Company
/ðə ˌwɔːlt ˈdɪzni kʌmpəni/
(also Disney)
- a large US company started in 1923 by Walt Disney which is best known for its animated children's films. Today it owns a number of film companies including Walt Disney Pictures and Touchstone Pictures, produces toys and children's books, runs Disneyland theme parks in the US and other countries, and also owns a number of American television networks including ABC.