Question 2 of 10

What are activation functions, and why are they necessary in neural networks? Compare the most common activation functions: ReLU, sigmoid, and tanh.
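A minimal NumPy sketch of the three functions the question names may help when checking an answer: sigmoid squashes inputs to (0, 1), tanh to (-1, 1) and is zero-centered, and ReLU is simply max(0, x). The function names and sample inputs here are illustrative, not part of the question.

```python
import numpy as np

def sigmoid(x):
    # Maps any real input into (0, 1); saturates (gradient -> 0) for large |x|
    return 1.0 / (1.0 + np.exp(-x))

def tanh(x):
    # Maps input into (-1, 1); zero-centered, unlike sigmoid
    return np.tanh(x)

def relu(x):
    # max(0, x): cheap to compute, non-saturating for x > 0,
    # but outputs exactly 0 for all negative inputs ("dying ReLU" risk)
    return np.maximum(0.0, x)

x = np.array([-2.0, 0.0, 2.0])
print(sigmoid(x))  # all values in (0, 1); sigmoid(0) = 0.5
print(tanh(x))     # all values in (-1, 1); tanh(0) = 0.0
print(relu(x))     # [0. 0. 2.]
```

Without some nonlinearity like these between layers, a stack of linear layers collapses into a single linear map, which is why activation functions are necessary at all.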