Q. What is the purpose of batch normalization in neural networks?

A. Reduce model size
B. Normalize layer inputs to stabilize training
C. Increase overfitting
D. Speed up inference
Solution: B

Batch normalization normalizes the inputs to a layer (typically to zero mean and unit variance over each mini-batch, followed by a learnable scale and shift), which stabilizes and speeds up training.
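The operation can be illustrated with a minimal NumPy sketch of the forward pass (training mode); the function name and signature here are illustrative, not from any framework:

```python
import numpy as np

def batch_norm(x, gamma, beta, eps=1e-5):
    # Normalize each feature over the mini-batch (axis 0)...
    mean = x.mean(axis=0)
    var = x.var(axis=0)
    x_hat = (x - mean) / np.sqrt(var + eps)
    # ...then apply the learnable scale (gamma) and shift (beta).
    return gamma * x_hat + beta

x = np.array([[1.0, 2.0],
              [3.0, 4.0],
              [5.0, 6.0]])
out = batch_norm(x, gamma=np.ones(2), beta=np.zeros(2))
# Each column of the output now has roughly zero mean and unit variance.
print(out.mean(axis=0), out.std(axis=0))
```

At inference time, frameworks replace the per-batch statistics with running averages collected during training, so outputs do not depend on the batch composition.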
