
24: Language and Entropy (Information Theory in Language)

Mar 07, 2018 · 45 min

Episode description

Information theory was founded in 1948 by Claude Shannon, and it describes, both qualitatively and quantitatively, the limits and processes involved in communication. Roughly speaking, when two entities communicate, there is a message, a medium, noise, encoding, and decoding, and through these, information is transferred between them. The amount of information that can be transmitted can be increased or decreased by manipulating any of these elements. One of the practical, and original, applications of information theory is the modeling of language. So what is entropy? How can we say language has it? And what structures within language, viewed through the lens of information theory, reveal deep insights about the nature of language itself?
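
To make the question "how can language have entropy?" concrete, here is a minimal sketch (not taken from the episode) that estimates the Shannon entropy of the character distribution in a short English string; the sample sentence and the character-level model are illustrative assumptions.

```python
from collections import Counter
from math import log2

def char_entropy(text: str) -> float:
    """Estimate Shannon entropy, in bits per character, from the
    empirical character distribution of `text`."""
    counts = Counter(text)
    total = len(text)
    # H = -sum over observed characters of p(x) * log2(p(x))
    return -sum((n / total) * log2(n / total) for n in counts.values())

# Hypothetical sample sentence, used only for demonstration.
sample = "information theory describes the limits of communication"
print(f"about {char_entropy(sample):.3f} bits per character")
```

A small sample like this typically comes out around 4 bits per character; Shannon's own experiments on printed English, which account for longer-range structure between letters and words, bring the estimate down to roughly one bit per character.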


--- 


This episode is sponsored by 

· Anchor: The easiest way to make a podcast.  https://anchor.fm/app


Support this podcast: https://anchor.fm/breakingmathpodcast/support
