
[MINI] The Vanishing Gradient

Jun 30, 2017 · 15 min

Episode description

This episode discusses the vanishing gradient, a problem that arises when training deep neural networks: by the time back-propagation has reached the first hidden layer, nearly all of the gradients are very close to zero. This makes learning in the early layers virtually impossible without some clever trick or improved methodology to help them begin to learn.
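The effect described above is easy to observe numerically. The following sketch (not from the episode; layer count, width, and initialization scale are illustrative assumptions) back-propagates an error signal through a deep stack of sigmoid layers and prints the gradient norm at the last and first hidden layers:

```python
import numpy as np

# Minimal sketch: watch gradients shrink as back-propagation
# passes through many sigmoid layers with naive initialization.
rng = np.random.default_rng(0)

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

n_layers, width = 20, 10  # illustrative depth and layer width
weights = [rng.normal(0, 0.5, (width, width)) for _ in range(n_layers)]

# Forward pass, keeping each layer's activations for backprop.
a = rng.normal(size=width)
activations = [a]
for W in weights:
    a = sigmoid(W @ a)
    activations.append(a)

# Back-propagate a unit error signal from the output toward the input,
# recording the gradient norm at each layer along the way.
grad = np.ones(width)
norms = []
for W, a in zip(reversed(weights), reversed(activations[1:])):
    grad = W.T @ (grad * a * (1 - a))  # chain rule: sigmoid'(z) = a(1-a)
    norms.append(np.linalg.norm(grad))

print(f"gradient norm at last hidden layer:  {norms[0]:.3e}")
print(f"gradient norm at first hidden layer: {norms[-1]:.3e}")
```

Because the sigmoid's derivative never exceeds 0.25, each layer the signal passes through tends to multiply it by a factor well below one, so the norm at the first hidden layer comes out many orders of magnitude smaller than at the last.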

[MINI] The Vanishing Gradient | Data Skeptic podcast