Ep. 27 - Big Algo, Fat Tails, and Converging Priors

Aug 14, 2018 · 50 min

Episode description

Today we dive into the current Bayesian flame wars on Twitter. Do Bayesian priors converge? As Nassim Taleb (@nntaleb) points out, not necessarily under a fat-tailed or power-law distribution. We'll talk about what that means, and the wonders worked by Bayes' rule even under some seemingly preposterous priors.
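The convergence point can be seen in a minimal sketch. Assuming a thin-tailed coin-flip (Bernoulli) model with conjugate Beta priors — the well-behaved case, not the fat-tailed regime Taleb warns about — two wildly different "preposterous" priors are washed out by data, and their posteriors converge to the same answer:

```python
# Two deliberately extreme Beta priors on a coin's heads probability:
# prior A is heavily biased toward heads, prior B toward tails.
prior_a = (50.0, 1.0)   # Beta(50, 1): mean ~0.98
prior_b = (1.0, 50.0)   # Beta(1, 50): mean ~0.02

def posterior_mean(prior, heads, tails):
    """Beta-Bernoulli conjugate update: Beta(a, b) -> Beta(a + heads, b + tails)."""
    a, b = prior
    return (a + heads) / (a + heads + b + tails)

# Observe a coin that comes up heads 60% of the time, with growing sample sizes.
for n in (10, 1000, 100_000):
    heads = int(0.6 * n)
    tails = n - heads
    print(n, posterior_mean(prior_a, heads, tails), posterior_mean(prior_b, heads, tails))
```

With 10 flips the two posterior means are still far apart (the priors dominate); by 100,000 flips they agree to three decimal places. Under a fat-tailed likelihood this washing-out can be far slower or fail to happen at all, which is the crux of the Twitter debate.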

Also: the military wants to do machine learning with less data. Is the era of big data over, giving way to the era of the big algorithm? Plus the results of the Twitter shadow-ban poll, QA bias, the Streisand effect, and the Alex Jones banning.

Get full access to The Local Maximum at localmaximum.substack.com/subscribe
The Local Maximum with Max Sklar