Question 5 of 10 (Pro Only)

Compare batch gradient descent, stochastic gradient descent, and mini-batch gradient descent. What are the tradeoffs of each approach?

Sample answer preview

Gradient descent is an optimization algorithm that minimizes the loss function by iteratively updating model parameters in the direction of the negative gradient. The three main variants differ in how much data they use to compute each gradient update: batch gradient descent uses the entire dataset per step (stable but slow and memory-hungry), stochastic gradient descent uses a single example per step (fast, noisy updates that can help escape shallow minima), and mini-batch gradient descent uses a small subset (a practical middle ground that also exploits vectorized hardware).
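As a minimal sketch of the tradeoff, all three variants can be expressed as the same update loop parameterized by batch size. The setup below (a small linear-regression problem and the `gradient_descent` helper) is illustrative, not from any particular library:

```python
import numpy as np

# Toy linear-regression data (illustrative setup, not from the original answer).
rng = np.random.default_rng(0)
X = rng.normal(size=(100, 3))
true_w = np.array([2.0, -1.0, 0.5])
y = X @ true_w + 0.01 * rng.normal(size=100)

def gradient_descent(X, y, batch_size, lr=0.1, epochs=200, seed=0):
    """batch_size == len(X): batch GD; 1: stochastic GD; in between: mini-batch."""
    rng = np.random.default_rng(seed)
    w = np.zeros(X.shape[1])
    n = len(X)
    for _ in range(epochs):
        idx = rng.permutation(n)                 # reshuffle each epoch
        for start in range(0, n, batch_size):
            b = idx[start:start + batch_size]
            # Gradient of mean squared error on the current batch
            grad = 2 * X[b].T @ (X[b] @ w - y[b]) / len(b)
            w -= lr * grad
    return w

w_batch = gradient_descent(X, y, batch_size=len(X))       # batch: 1 exact step/epoch
w_sgd   = gradient_descent(X, y, batch_size=1, lr=0.01)   # stochastic: 100 noisy steps/epoch
w_mini  = gradient_descent(X, y, batch_size=16)           # mini-batch: compromise
```

All three recover weights close to `true_w` here, but they get there differently: the batch version computes one exact gradient per epoch, the stochastic version makes many cheap noisy updates (note the smaller learning rate to tame the noise), and the mini-batch version balances gradient quality against update frequency.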

Tags: batch gradient descent · stochastic gradient descent · mini-batch · batch size · convergence · memory
