Q. What is the purpose of the attention mechanism in transformers?

A. Reduce model size
B. Focus on relevant parts of input data
C. Increase training speed
D. Normalize outputs
Solution: B

The attention mechanism lets a transformer weigh how relevant each part of the input is to every other part, so the model can focus on the most informative tokens when making predictions.
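As a rough illustration (not part of the original question), here is a minimal NumPy sketch of scaled dot-product attention, the core operation behind this focusing behavior; the function name and toy shapes are chosen for this example:

```python
import numpy as np

def scaled_dot_product_attention(Q, K, V):
    """Weigh each value vector by how relevant its key is to the query."""
    d_k = Q.shape[-1]
    # Similarity between queries and keys, scaled to keep softmax stable
    scores = Q @ K.T / np.sqrt(d_k)
    # Softmax turns scores into attention weights that sum to 1 per query
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights /= weights.sum(axis=-1, keepdims=True)
    # Output: a relevance-weighted mixture of the value vectors
    return weights @ V

# Toy example: 3 tokens, 4-dimensional embeddings
rng = np.random.default_rng(0)
Q = K = V = rng.normal(size=(3, 4))
print(scaled_dot_product_attention(Q, K, V))
```

Each output row is a blend of all input tokens, with the largest share coming from the tokens most similar to the query, which is exactly the "focus on relevant parts" described in option B.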
