Q. What is the role of the ReLU activation function in CNNs?

A. To increase model complexity
B. To introduce non-linearity and mitigate vanishing gradients
C. To compress the output
D. To perform image segmentation
Solution: B

ReLU (Rectified Linear Unit), defined as f(x) = max(0, x), introduces non-linearity into the network. Because its gradient is 1 for all positive inputs (rather than a value below 1, as with sigmoid or tanh), it also helps gradients flow during backpropagation and mitigates the vanishing-gradient problem.
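As a quick illustration (a minimal NumPy sketch, not part of the original question), ReLU and its gradient can be written as:

```python
import numpy as np

def relu(x):
    # ReLU: f(x) = max(0, x) -- a simple, cheap non-linearity
    return np.maximum(0, x)

def relu_grad(x):
    # Gradient is 1 for positive inputs and 0 otherwise, so gradients
    # flowing back through active units are passed through unshrunk
    # (unlike sigmoid/tanh, whose derivatives are always < 1)
    return (x > 0).astype(x.dtype)

x = np.array([-2.0, -0.5, 0.0, 1.5, 3.0])
print(relu(x))       # [0.  0.  0.  1.5 3. ]
print(relu_grad(x))  # [0. 0. 0. 1. 1.]
```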
