Question 6 of 10 (Pro Only)

What are vanishing and exploding gradients? Why do they occur in deep networks, and what techniques can you use to address them?

Sample answer preview

Vanishing and exploding gradients are problems that arise during backpropagation in deep neural networks. Because the chain rule multiplies together one factor per layer (a weight matrix times an activation derivative), the gradient reaching early layers shrinks exponentially when those factors are mostly below 1 (vanishing) or grows exponentially when they are mostly above 1 (exploding). Vanishing gradients leave early layers effectively untrained; exploding gradients cause unstable updates or numerical overflow.
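The exponential effect above can be made concrete with a toy calculation (an assumption for illustration: a chain of one-unit layers where every per-layer backprop factor is the same constant). Sigmoid's derivative is at most 0.25, so even with unit weights a 50-layer sigmoid network multiplies the gradient by at most 0.25 fifty times:

```python
# Toy illustration of vanishing/exploding gradients in a chain of
# one-unit layers (an assumed simplification: every per-layer factor
# w * f'(z) is the same constant).

def chained_gradient(factor: float, n_layers: int) -> float:
    """Gradient magnitude after backpropagating through n_layers,
    where each layer contributes the same multiplicative factor."""
    return factor ** n_layers

# Sigmoid's derivative is at most 0.25, so with 50 layers the gradient
# shrinks by at least a factor of 0.25**50 -- effectively zero.
vanish = chained_gradient(0.25, 50)

# Per-layer factors slightly above 1 instead grow exponentially.
explode = chained_gradient(1.5, 50)

print(f"vanishing: {vanish:.2e}, exploding: {explode:.2e}")
```

This is why ReLU (derivative of 1 on its active region), careful initialization, and residual connections (which add an identity path that bypasses the multiplication) all help.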

Tags: vanishing gradients, exploding gradients, ReLU, batch normalization, gradient clipping, residual connections
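Of the techniques tagged above, gradient clipping is the simplest to sketch in code. A minimal NumPy version of clipping by global norm (an assumption for illustration: `grads` is a list of arrays standing in for per-parameter gradients; real frameworks provide this built in, e.g. `torch.nn.utils.clip_grad_norm_`):

```python
import numpy as np

def clip_by_global_norm(grads, max_norm, eps=1e-6):
    """Rescale a list of gradient arrays so their combined L2 norm
    is at most max_norm; gradients already within the bound pass
    through unchanged."""
    total_norm = np.sqrt(sum(np.sum(g ** 2) for g in grads))
    scale = min(1.0, max_norm / (total_norm + eps))
    return [g * scale for g in grads]

# Example: two gradient arrays with global norm sqrt(9+16+144) = 13,
# clipped down to a global norm of (approximately) 1.0.
grads = [np.array([3.0, 4.0]), np.array([12.0])]
clipped = clip_by_global_norm(grads, max_norm=1.0)
```

Clipping the *global* norm (rather than each array separately) preserves the direction of the overall update, only shrinking its magnitude, which is why it is the standard remedy for exploding gradients in practice.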
