
LM101-086: Ch8: How to Learn the Probability of Infinitely Many Outcomes

Jul 20, 2021 · 35 min · Season 2 · Ep. 86

Episode description

This 86th episode of Learning Machines 101 discusses the problem of assigning probabilities to a possibly infinite set of outcomes in the space-time continuum that characterizes our physical world. Such a set is called an "environmental event". A machine learning algorithm uses information about the frequency of environmental events to support learning, so if we want to study statistical machine learning, we must be able to discuss how to represent and compute the probability of an environmental event. It is essential to have methods for communicating probability concepts to other researchers, methods for calculating probabilities, and methods for calculating expectations with respect to specific environmental events. This episode discusses the challenges of assigning probabilities to events when those events may be composed of infinitely many outcomes. Along the way we introduce essential concepts for representing and computing probabilities using measure-theoretic tools such as sigma-fields and the Radon-Nikodym probability density function. Near the end we also briefly discuss the intriguing Banach-Tarski paradox and how it motivates the development of some of these special mathematical tools. Check out www.learningmachines101.com and www.statisticalmachinelearning.com for more information!
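To make the core idea concrete, here is a small illustrative sketch (not taken from the episode itself): when the outcome space is a continuum, any single outcome has probability zero, so probabilities are assigned to events by integrating a density function over the event — the role played by the Radon-Nikodym density mentioned above. The Gaussian distribution below is just an assumed example density.

```python
import math

def normal_cdf(x, mu=0.0, sigma=1.0):
    """Cumulative distribution of a Gaussian, computed via the error function."""
    return 0.5 * (1.0 + math.erf((x - mu) / (sigma * math.sqrt(2.0))))

def interval_probability(a, b, mu=0.0, sigma=1.0):
    """P(a < X <= b) for X ~ Normal(mu, sigma): the density integrated over (a, b]."""
    return normal_cdf(b, mu, sigma) - normal_cdf(a, mu, sigma)

# A single outcome (a degenerate interval) has probability zero...
print(interval_probability(1.0, 1.0))        # 0.0
# ...but an interval-valued event has positive probability.
print(round(interval_probability(-1.0, 1.0), 4))  # 0.6827
```

The point of the sketch is that probability lives on *events* (here, intervals, which generate the Borel sigma-field on the real line), not on individual outcomes.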
