Q. Which neural network layer is used to introduce non-linearity?
Solution:
The Activation Layer introduces non-linearity into the model, using functions such as ReLU, sigmoid, or tanh. Without it, a stack of linear layers would collapse into a single linear transformation, no matter how deep the network is.
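As a minimal sketch (using NumPy purely for illustration), ReLU simply zeroes out negative values element-wise, which is the non-linearity applied after a linear layer's output:

```python
import numpy as np

def relu(x):
    # ReLU: max(0, x) applied element-wise.
    # Negative pre-activations are clipped to 0; positives pass through unchanged.
    return np.maximum(0, x)

# Example pre-activation values from a linear layer
z = np.array([-2.0, -0.5, 0.0, 1.5, 3.0])
print(relu(z))  # [0.  0.  0.  1.5 3. ]
```

This piecewise behavior is what breaks linearity: the mapping is no longer a single matrix multiply, so stacked layers can represent non-linear functions.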