
[MINI] The Vanishing Gradient

Jun 30, 2017 · 15 min

Episode description

This episode discusses the vanishing gradient: a problem that arises when training deep neural networks, in which nearly all of the gradients are very close to zero by the time back-propagation reaches the first hidden layer. Because back-propagation multiplies one derivative per layer via the chain rule, and activation functions like the sigmoid have derivatives no larger than 0.25, these products shrink exponentially with depth. This makes learning virtually impossible without some clever trick or improved methodology to help the earlier layers begin to learn.
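
To make the effect concrete, here is a minimal NumPy sketch (not from the episode; the layer count, width, and weight scale are illustrative assumptions) that pushes a gradient backward through a stack of sigmoid layers and prints its norm as it travels toward the first layer:

```python
import numpy as np

# Hypothetical setup to illustrate the vanishing gradient: a deep stack
# of sigmoid layers with reasonably scaled random weights.
rng = np.random.default_rng(0)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

n_layers = 20
width = 32

# One weight matrix per hidden layer, scaled by 1/sqrt(width) so the
# forward pass stays in the sigmoid's active region.
weights = [rng.normal(0, 1.0 / np.sqrt(width), size=(width, width))
           for _ in range(n_layers)]

# Forward pass, storing each layer's output for the backward pass.
a = rng.normal(size=width)
activations = [a]
for W in weights:
    a = sigmoid(W @ a)
    activations.append(a)

# Backward pass: start from a unit-norm gradient at the output and apply
# the chain rule layer by layer. The sigmoid's derivative s * (1 - s) is
# at most 0.25, so the gradient is scaled down at every layer it crosses.
grad = np.ones(width) / np.sqrt(width)
for layer in range(n_layers, 0, -1):
    s = activations[layer]  # this layer's post-activation output
    grad = weights[layer - 1].T @ (grad * s * (1 - s))
    print(f"layer {layer:2d}: |grad| = {np.linalg.norm(grad):.3e}")
```

Running this, the printed norms fall by several orders of magnitude between the output and layer 1, which is exactly the regime the episode describes: the first hidden layer receives essentially no learning signal.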