Question 5 of 10 (Pro Only)
Compare batch gradient descent, stochastic gradient descent, and mini-batch gradient descent. What are the tradeoffs of each approach?
Sample answer preview
Gradient descent is an optimization algorithm that minimizes a loss function by iteratively updating model parameters in the direction of the negative gradient. The three main variants differ in how much data they use to compute each update: batch gradient descent uses the full training set per step (stable but expensive), stochastic gradient descent uses a single example (cheap, noisy updates), and mini-batch gradient descent uses a small subset, trading off between the two.
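A minimal NumPy sketch of the three variants on a toy linear-regression problem (the function names, learning rates, and batch sizes here are illustrative assumptions, not part of the original answer). The only difference between the variants is the batch size used for each gradient step:

```python
import numpy as np

def gradient(w, X, y):
    # Mean-squared-error gradient for linear regression: grad of (1/n)||Xw - y||^2
    return 2 * X.T @ (X @ w - y) / len(y)

def gd(X, y, lr=0.1, epochs=50, batch_size=None, seed=0):
    """batch_size=None -> full batch; 1 -> stochastic; k -> mini-batch."""
    rng = np.random.default_rng(seed)
    w = np.zeros(X.shape[1])
    n = len(y)
    bs = n if batch_size is None else batch_size
    for _ in range(epochs):
        idx = rng.permutation(n)          # shuffle each epoch
        for start in range(0, n, bs):
            batch = idx[start:start + bs]
            w -= lr * gradient(w, X[batch], y[batch])
    return w

# Toy data: y = 3*x0 - 2*x1, noiseless
rng = np.random.default_rng(1)
X = rng.normal(size=(200, 2))
y = X @ np.array([3.0, -2.0])

w_batch = gd(X, y)                          # full batch: stable, costly per step
w_sgd   = gd(X, y, lr=0.02, batch_size=1)   # stochastic: cheap, noisy steps
w_mini  = gd(X, y, batch_size=32)           # mini-batch: the usual compromise
```

All three recover weights close to `[3, -2]` on this noiseless problem; the tradeoffs show up in per-step cost (batch touches all 200 rows per update), update noise (SGD needs a smaller learning rate), and memory (only the current batch must fit in memory).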
Tags: batch gradient descent, stochastic gradient descent, mini-batch, batch size, convergence, memory