Q. What is the purpose of the ReLU activation function?

A. Introduce exponential growth
B. Output zero for negative inputs and the input for positive
C. Normalize data
D. Reduce overfitting
Solution:

The correct option is B. ReLU (Rectified Linear Unit) returns the input unchanged when it is positive and zero otherwise, i.e. f(x) = max(0, x). Because its gradient is 1 for all positive inputs, it helps gradients flow during backpropagation and mitigates the vanishing-gradient problem that affects saturating activations such as sigmoid.
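As an illustration, here is a minimal NumPy sketch of ReLU and its gradient (the function names and sample inputs are illustrative, not part of the original question):

import numpy as np

def relu(x):
    # Element-wise max(0, x): passes positive inputs through, zeroes out negatives.
    return np.maximum(0, x)

def relu_grad(x):
    # Derivative: 1 where x > 0, 0 elsewhere (the subgradient 0 is taken at x == 0).
    return (x > 0).astype(x.dtype)

x = np.array([-2.0, -0.5, 0.0, 1.5, 3.0])
print(relu(x))       # values: 0, 0, 0, 1.5, 3
print(relu_grad(x))  # values: 0, 0, 0, 1, 1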
