
Where'd My Gradient Go? It Vanished!

8:39
 
Content provided by Brian Carter. All podcast content, including episodes, graphics, and podcast descriptions, is uploaded and provided directly by Brian Carter or their podcast platform partner. If you believe someone is using your copyrighted work without your permission, you can follow the process described here: https://da.player.fm/legal.

This video discusses the vanishing gradient problem, a significant challenge in training deep neural networks. The speaker explains how, as a network grows deeper, gradients (measures of how changes in the network's parameters affect the loss function) can shrink exponentially, leaving the earliest layers effectively frozen and unable to learn. The problem arises because common activation functions such as the sigmoid produce very small derivatives, and these small factors compound multiplicatively during backpropagation. The video then explores mitigations: alternative activation functions such as ReLU, and architectural changes such as residual networks and LSTMs.
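The mechanics are easy to reproduce. Below is a minimal Python sketch (an illustration, not code from the video; the depth of 20, the random pre-activations, and the seed are all assumed for the demo). Backpropagation multiplies the upstream gradient by each layer's local derivative, and because the sigmoid's derivative sigmoid(x) * (1 - sigmoid(x)) never exceeds 0.25, a deep stack of sigmoid layers shrinks the gradient by a factor of at most 0.25 per layer:

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def sigmoid_grad(x):
    # Derivative of the sigmoid; its maximum is 0.25, attained at x = 0.
    s = sigmoid(x)
    return s * (1.0 - s)

rng = np.random.default_rng(0)  # arbitrary seed, for reproducibility
depth = 20                      # hypothetical number of layers
grad = 1.0                      # gradient arriving at the deepest layer

for _ in range(depth):
    x = rng.normal()            # stand-in for one layer's pre-activation
    grad *= sigmoid_grad(x)     # chain rule: multiply by the local derivative

print(f"gradient reaching layer 1 after {depth} sigmoid layers: {grad:.2e}")
# Prints a value on the order of 1e-15, so the earliest layers receive
# almost no learning signal. ReLU's derivative is exactly 1 for positive
# inputs, so the product does not shrink there, and a residual connection
# y = x + f(x) adds an identity term with derivative 1, giving gradients
# a direct path back to the early layers.
```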

Watch the video: https://www.youtube.com/watch?v=ncTHBi8a9uA&pp=ygUSdmFuaXNoaW5nIGdyYWRpZW50


58 episodes
