AI/ML Engineer Track: Deep Learning Basics
Question 2 of 10
What are activation functions, and why are they necessary in neural networks? Compare the most common activation functions: ReLU, sigmoid, and tanh.
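Activation functions introduce non-linearity between layers; without them, a stack of linear layers collapses into a single linear map. As a minimal sketch (using NumPy; names and sample inputs are illustrative, not from the original), the three functions in the question can be compared directly:

```python
import numpy as np

def sigmoid(x):
    # Squashes input to (0, 1); saturates for large |x|, so gradients vanish
    return 1.0 / (1.0 + np.exp(-x))

def tanh(x):
    # Zero-centered, range (-1, 1); still saturates at the tails
    return np.tanh(x)

def relu(x):
    # max(0, x): cheap to compute, non-saturating for x > 0,
    # but units can "die" when inputs stay negative
    return np.maximum(0.0, x)

x = np.array([-2.0, 0.0, 2.0])
print(sigmoid(x))  # values in (0, 1)
print(tanh(x))     # values in (-1, 1), zero at x = 0
print(relu(x))     # negatives clipped to 0
```

The key contrasts to draw in an answer: sigmoid and tanh saturate (hurting gradient flow in deep networks), tanh is zero-centered while sigmoid is not, and ReLU avoids saturation for positive inputs, which is one reason it became the default choice for hidden layers.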