Q. What is the role of the ReLU activation function in CNNs?
Solution:
ReLU (Rectified Linear Unit) introduces non-linearity into the network by outputting max(0, x), letting stacked convolutional layers model complex patterns rather than collapsing into a single linear transform. Because its gradient is 1 for positive inputs, it also helps gradients flow during backpropagation, reducing the vanishing-gradient problem seen with saturating activations like sigmoid and tanh, and it is cheap to compute.
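A minimal sketch of ReLU and its derivative using NumPy (illustrative only; deep learning frameworks provide these built in):

```python
import numpy as np

def relu(x):
    # ReLU: max(0, x) applied elementwise -- keeps positive
    # activations, zeroes out negatives
    return np.maximum(0, x)

def relu_grad(x):
    # Derivative: 1 where x > 0, else 0. Gradients pass through
    # unchanged on positive inputs, which mitigates vanishing gradients.
    return (x > 0).astype(float)

# Example feature map from a convolutional layer (hypothetical values)
feature_map = np.array([[-2.0, 0.5],
                        [ 3.0, -1.0]])
print(relu(feature_map))       # negatives clipped to 0
print(relu_grad(feature_map))  # gradient mask used in backpropagation
```

Note how the gradient mask is exactly 0 or 1: during backpropagation, upstream gradients are either passed through or blocked, with no shrinking factor.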