Q. What is the purpose of the ReLU activation function?
Solution:
ReLU (Rectified Linear Unit) outputs the input directly if it is positive and zero otherwise, i.e. f(x) = max(0, x). Because its gradient is 1 for all positive inputs, it mitigates the vanishing-gradient problem that saturating activations like sigmoid suffer from, and its simplicity makes it cheap to compute.
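A minimal NumPy sketch of ReLU and its gradient (the function names `relu` and `relu_grad` are illustrative, not from any particular library):

```python
import numpy as np

def relu(x):
    # Element-wise max(0, x): passes positive values through, zeroes out the rest
    return np.maximum(0, x)

def relu_grad(x):
    # Derivative is 1 where x > 0, 0 elsewhere (undefined at exactly 0; 0 by convention)
    return (x > 0).astype(float)

x = np.array([-2.0, -0.5, 0.0, 1.5, 3.0])
print(relu(x))       # [0.  0.  0.  1.5 3. ]
print(relu_grad(x))  # [0. 0. 0. 1. 1.]
```

The constant gradient of 1 on the positive side is what keeps gradients from shrinking as they propagate through many layers.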