


Vanishing & Exploding Gradient Problem & Dying ReLU Problem
*Memos:
- My post explains Overfitting and Underfitting.
- My post explains layers in PyTorch.
- My post explains activation functions in PyTorch.
- My post explains loss functions in PyTorch.
- My post explains optimizers in PyTorch.
Vanishing Gradient Problem:
- is the problem where, during backpropagation, the gradient gets smaller and smaller (or becomes zero) because small gradients are multiplied together many times going from the output layer to the input layer, so the model cannot be trained effectively.
- occurs more easily in models with more layers.
- is easily caused by the Sigmoid activation function, which is Sigmoid() in PyTorch, because its output range is 0 < y < 1 and its derivative is at most 0.25, so these small values are multiplied many times, making the gradient smaller and smaller going from the output layer to the input layer.
- occurs in:
- CNN(Convolutional Neural Network).
- RNN(Recurrent Neural Network) which is RNN() in PyTorch.
- doesn't easily occur in:
- LSTM(Long Short-Term Memory) which is LSTM() in PyTorch.
- GRU(Gated Recurrent Unit) which is GRU() in PyTorch.
- ResNet(Residual Neural Network) which is resnet18(), resnet34(), etc. in torchvision.
- Transformer which is Transformer() in PyTorch.
- etc.
- can be detected if:
- parameters change significantly at the layers near the output layer, whereas parameters change only slightly or stay unchanged at the layers near the input layer.
- The weights of the layers near the input layer are close to 0 or become 0.
- convergence is slow or stops.
- can be mitigated by:
- Batch Normalization layer which is BatchNorm1d(), BatchNorm2d() or BatchNorm3d() in PyTorch.
- Leaky ReLU activation function which is LeakyReLU() in PyTorch. *You can also use the ReLU activation function, which is ReLU() in PyTorch, but it sometimes causes the Dying ReLU Problem, which I explain later.
- PReLU activation function which is PReLU() in PyTorch.
- ELU activation function which is ELU() in PyTorch.
- Gradient Clipping which is clip_grad_norm_() or clip_grad_value_() in PyTorch. *Gradient Clipping is a method that keeps gradients within a specified range.
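The shrinking described above can be observed directly by comparing per-layer gradient norms in a deep Sigmoid MLP. A minimal sketch (the depth of 20, width of 32, and random input are arbitrary choices for illustration):

```python
import torch
import torch.nn as nn

torch.manual_seed(0)

# 20 hidden Linear+Sigmoid blocks: small Sigmoid derivatives (at most 0.25)
# get multiplied at every layer during backpropagation.
layers = []
for _ in range(20):
    layers += [nn.Linear(32, 32), nn.Sigmoid()]
model = nn.Sequential(*layers)

x = torch.randn(8, 32)
model(x).sum().backward()

# Compare the gradient norm of the first Linear layer (near the input)
# with that of the last Linear layer (near the output).
linears = [m for m in model if isinstance(m, nn.Linear)]
first_grad = linears[0].weight.grad.norm().item()
last_grad = linears[-1].weight.grad.norm().item()
print(f"grad norm near input layer:  {first_grad:.3e}")
print(f"grad norm near output layer: {last_grad:.3e}")
```

With this many Sigmoid layers, the gradient near the input layer comes out many orders of magnitude smaller than the one near the output layer, which is exactly the detection symptom listed above.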
Exploding Gradient Problem:
- is the problem where, during backpropagation, the gradient gets bigger and bigger because large gradients are multiplied together many times going from the output layer to the input layer, making convergence impossible.
- more easily occurs with more layers in a model.
- occurs in:
- CNN.
- RNN.
- LSTM.
- GRU.
- doesn't easily occur in:
- Resnet.
- Transformer.
- etc.
- can be detected if:
- The weights of a model increase significantly.
- The weights of a model, increasing significantly, finally become NaN.
- convergence fluctuates without finishing.
- can be mitigated by:
- Batch Normalization layer.
- Gradient Clipping.
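Gradient Clipping can be sketched with clip_grad_norm_(). In this toy example the huge target value of 1000.0 is an arbitrary choice that just makes the gradient large:

```python
import torch
import torch.nn as nn
from torch.nn.utils import clip_grad_norm_

torch.manual_seed(0)
model = nn.Linear(10, 1)

# An artificially large target makes the loss, and therefore the gradient, huge.
x = torch.randn(4, 10)
loss = ((model(x) - 1000.0) ** 2).mean()
loss.backward()

# clip_grad_norm_() rescales all gradients in place so that their total norm
# is at most max_norm; it returns the total norm measured before clipping.
norm_before = clip_grad_norm_(model.parameters(), max_norm=1.0).item()
norm_after = torch.norm(
    torch.stack([p.grad.norm() for p in model.parameters()])
).item()
print(f"total grad norm before clipping: {norm_before:.1f}")
print(f"total grad norm after clipping:  {norm_after:.4f}")
```

In a real training loop you would call clip_grad_norm_() after loss.backward() and before optimizer.step(), so the update uses the clipped gradients.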
Dying ReLU Problem:
- is the problem where, once the nodes(neurons) with the ReLU activation function receive only zero or negative input values, they always output zero, and because ReLU's gradient is zero for those inputs, they never recover to output anything except zero, so the model cannot be trained effectively.
- is also called Dead ReLU problem.
- more easily occurs with:
- higher learning rates.
- a large negative bias.
- can be detected if:
- convergence is slow or stopped.
- the loss function returns NaN.
- can be mitigated by:
- a lower learning rate.
- a positive bias.
- Leaky ReLU activation function.
- PReLU activation function.
- ELU activation function.
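The effect of a large negative bias, and why Leaky ReLU mitigates it, can be sketched in a few lines. The bias value of -100.0 and the layer sizes are arbitrary choices that just force every pre-activation below zero:

```python
import torch
import torch.nn as nn

torch.manual_seed(0)
x = torch.randn(16, 4)

def weight_grad_sum(activation, bias_val):
    layer = nn.Linear(4, 8)
    # A large negative bias pushes every pre-activation below zero.
    nn.init.constant_(layer.bias, bias_val)
    activation(layer(x)).sum().backward()
    return layer.weight.grad.abs().sum().item()

# ReLU outputs zero and has zero gradient for negative inputs,
# so the weights receive no gradient at all: the neurons are "dead".
relu_grad = weight_grad_sum(nn.ReLU(), bias_val=-100.0)
# LeakyReLU keeps a small slope (0.01 here) for negative inputs,
# so a nonzero gradient still reaches the weights.
leaky_grad = weight_grad_sum(nn.LeakyReLU(0.01), bias_val=-100.0)
print(f"ReLU weight-gradient sum:      {relu_grad}")
print(f"LeakyReLU weight-gradient sum: {leaky_grad}")
```

The ReLU layer gets exactly zero gradient, so no optimizer step can ever revive it, while the LeakyReLU layer can still learn its way out.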