Question 6 of 10
What are vanishing and exploding gradients? Why do they occur in deep networks, and what techniques can you use to address them?
Sample answer preview
Vanishing and exploding gradients are failure modes of backpropagation in deep neural networks. Because the gradient at each layer is a product of the Jacobians of all later layers, factors smaller than 1 (for example, saturated sigmoid or tanh derivatives) shrink the gradient exponentially with depth, while factors larger than 1 grow it exponentially. Vanishing gradients leave early layers effectively untrained; exploding gradients destabilize updates or overflow to NaN. Common remedies include ReLU-family activations, careful weight initialization, batch normalization, gradient clipping, and residual connections.
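A minimal sketch of the phenomenon, assuming PyTorch (the network depth, width, and data here are arbitrary illustration values): it compares the first-layer gradient norm of a deep sigmoid network against a ReLU one, then shows gradient clipping as the standard guard against exploding gradients.

```python
import torch
import torch.nn as nn

torch.manual_seed(0)

def deep_net(activation, depth=20, width=64):
    """Stack `depth` Linear+activation layers of equal width."""
    layers = []
    for _ in range(depth):
        layers += [nn.Linear(width, width), activation()]
    return nn.Sequential(*layers)

x = torch.randn(32, 64)
target = torch.randn(32, 64)

for name, act in [("sigmoid", nn.Sigmoid), ("relu", nn.ReLU)]:
    net = deep_net(act)
    loss = nn.functional.mse_loss(net(x), target)
    loss.backward()
    # First-layer gradient norm: far smaller for sigmoid, whose
    # derivative is at most 0.25, so 20 layers of backprop shrink
    # the gradient roughly exponentially.
    print(name, net[0].weight.grad.norm().item())

# For exploding gradients, clip the global gradient norm
# before the optimizer step:
net = deep_net(nn.ReLU)
nn.functional.mse_loss(net(x), target).backward()
torch.nn.utils.clip_grad_norm_(net.parameters(), max_norm=1.0)
```

Running this, the sigmoid network's first-layer gradient norm is typically orders of magnitude below the ReLU network's, which is exactly why ReLU became the default activation for deep models.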
Tags: vanishing gradients, exploding gradients, ReLU, batch normalization, gradient clipping, residual connections
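Of the tagged techniques, residual connections are the architectural fix: the identity skip path gives the gradient a route around each block's nonlinearities, so it cannot vanish through depth alone. A minimal sketch, again assuming PyTorch:

```python
import torch
import torch.nn as nn

class ResidualBlock(nn.Module):
    """Computes y = x + f(x). The identity term contributes a
    Jacobian of I, so a gradient path always bypasses f and
    survives stacking many blocks."""
    def __init__(self, width=64):
        super().__init__()
        self.f = nn.Sequential(
            nn.Linear(width, width),
            nn.ReLU(),
            nn.Linear(width, width),
        )

    def forward(self, x):
        return x + self.f(x)
```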