Class Weighting assigns higher weights to the minority class to address imbalance in binary classification.
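A minimal sketch of the idea: a binary cross-entropy loss where errors on the (minority) positive class are multiplied by a larger weight. The function name and the weight values are illustrative, not from any particular library.

```python
import math

def weighted_bce(y_true, p_pred, w_pos=3.0, w_neg=1.0):
    """Binary cross-entropy where positive (minority) examples carry a larger weight."""
    total = 0.0
    for y, p in zip(y_true, p_pred):
        if y == 1:
            total += -w_pos * math.log(p)       # mistake on minority class costs more
        else:
            total += -w_neg * math.log(1 - p)
    return total / len(y_true)

# The same predictions produce a larger loss once the positive class is up-weighted,
# pushing the optimizer to fix minority-class errors first:
balanced = weighted_bce([1, 0, 0, 0], [0.4, 0.1, 0.2, 0.1], w_pos=1.0)
weighted = weighted_bce([1, 0, 0, 0], [0.4, 0.1, 0.2, 0.1], w_pos=3.0)
```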
Momentum accelerates gradient descent by adding a fraction of the previous update to the current one.
Principal Component Analysis (PCA) reduces dimensionality by linearly projecting data onto the directions of greatest variance.
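A minimal sketch for 2-D points: build the covariance matrix and find its leading eigenvector (the first principal component) by power iteration. The function name is illustrative; real code would typically use a linear-algebra library.

```python
def top_principal_component(data):
    """First principal component of 2-D points via power iteration on the covariance matrix."""
    n = len(data)
    mx = sum(x for x, _ in data) / n
    my = sum(y for _, y in data) / n
    # entries of the 2x2 covariance matrix
    cxx = sum((x - mx) ** 2 for x, _ in data) / n
    cyy = sum((y - my) ** 2 for _, y in data) / n
    cxy = sum((x - mx) * (y - my) for x, y in data) / n
    v = (1.0, 0.0)
    for _ in range(100):            # repeatedly apply the matrix and renormalize
        wx = cxx * v[0] + cxy * v[1]
        wy = cxy * v[0] + cyy * v[1]
        norm = (wx * wx + wy * wy) ** 0.5
        v = (wx / norm, wy / norm)
    return v

# Points lying on the line y = x yield a component along (1, 1) / sqrt(2):
pc = top_principal_component([(0, 0), (1, 1), (2, 2), (3, 3)])
```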
SHAP (SHapley Additive exPlanations) values provide insights into feature contributions, improving interpretability.
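The Shapley computation behind SHAP can be shown exactly for a tiny model: each feature's value is its average marginal contribution over all orderings, with absent features set to a baseline. This is a brute-force sketch of the game-theoretic definition, not the `shap` library's (much faster, approximate) API.

```python
import math
from itertools import permutations

def shapley_values(model, x, baseline):
    """Exact Shapley values: average marginal contribution of each feature
    over all feature orderings, replacing absent features with the baseline."""
    n = len(x)
    phi = [0.0] * n
    for order in permutations(range(n)):
        current = list(baseline)
        prev = model(current)
        for i in order:
            current[i] = x[i]          # reveal feature i
            new = model(current)
            phi[i] += new - prev       # its marginal contribution in this ordering
            prev = new
    return [p / math.factorial(n) for p in phi]

# For a linear model the values reduce to weight * (x - baseline):
lin = lambda v: 2 * v[0] + 3 * v[1]
vals = shapley_values(lin, [1.0, 1.0], [0.0, 0.0])  # [2.0, 3.0]
```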
Backpropagation computes the gradient of the loss function with respect to each weight for optimization.
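A minimal sketch for a single sigmoid neuron with squared-error loss: the forward pass computes the activation, then the chain rule propagates the loss gradient back to the weight and bias. All names here are illustrative.

```python
import math

def forward_backward(x, w, b, y):
    """One sigmoid neuron, squared-error loss: returns loss, dL/dw, dL/db."""
    z = w * x + b
    a = 1 / (1 + math.exp(-z))       # sigmoid activation
    loss = 0.5 * (a - y) ** 2
    dL_da = a - y                    # chain rule, outermost first
    da_dz = a * (1 - a)              # derivative of the sigmoid
    dL_dz = dL_da * da_dz
    return loss, dL_dz * x, dL_dz    # dL/dw = dL/dz * x,  dL/db = dL/dz
```

A finite-difference check confirms the analytic gradient matches a numerical one.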
Collaborative Filtering analyzes user-item interactions to provide personalized recommendations.
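A minimal user-based sketch: score an unseen item for a target user as a similarity-weighted average of other users' ratings, with cosine similarity over the items two users both rated. Function names and the toy ratings are assumptions for illustration.

```python
def cosine(u, v):
    """Cosine similarity over the items both users rated (dicts item -> rating)."""
    shared = set(u) & set(v)
    if not shared:
        return 0.0
    dot = sum(u[i] * v[i] for i in shared)
    nu = sum(u[i] ** 2 for i in shared) ** 0.5
    nv = sum(v[i] ** 2 for i in shared) ** 0.5
    return dot / (nu * nv)

def predict(target, others, item):
    """Similarity-weighted average of other users' ratings for `item`."""
    num = den = 0.0
    for u in others:
        if item in u:
            s = cosine(target, u)
            num += s * u[item]
            den += s
    return num / den if den else None

# The user most similar to the target pulls the prediction toward its own rating:
target = {'a': 5, 'b': 4}
others = [{'a': 5, 'b': 4, 'c': 5}, {'a': 1, 'b': 1, 'c': 1}]
score = predict(target, others, 'c')
```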
Weights determine the importance of input features and are adjusted during training to minimize error.
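A single training step makes this concrete: for a linear model, each weight is nudged against the gradient of the squared error, scaled by its input feature. The helper name and learning rate are illustrative.

```python
def update_weights(w, x, y, lr=0.05):
    """One gradient step for y_hat = w . x with squared-error loss."""
    y_hat = sum(wi * xi for wi, xi in zip(w, x))
    err = y_hat - y
    # each weight moves in proportion to its input's contribution to the error
    return [wi - lr * 2 * err * xi for wi, xi in zip(w, x)]

# After the update the prediction is closer to the target:
w = update_weights([0.0, 0.0], [1.0, 2.0], 1.0)
```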
Normalized Discounted Cumulative Gain (NDCG) evaluates the ranking quality of a model.
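The metric can be computed directly from a list of relevance grades in ranked order: gains are discounted logarithmically by position, then normalized by the ideal (sorted) ranking's DCG. Function names are illustrative.

```python
import math

def dcg(relevances):
    """Discounted cumulative gain: gains decay logarithmically with rank."""
    return sum(rel / math.log2(i + 2) for i, rel in enumerate(relevances))

def ndcg(ranked_relevances):
    """DCG of the model's ranking divided by DCG of the ideal ranking."""
    best = dcg(sorted(ranked_relevances, reverse=True))
    return dcg(ranked_relevances) / best if best else 0.0

# A perfect ranking scores 1.0; putting the least relevant item first lowers it:
ndcg([3, 2, 1])   # 1.0
ndcg([1, 2, 3])   # below 1.0
```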
L2 regularization (Ridge) adds a penalty proportional to the sum of squared coefficients, encouraging smaller weights.
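A one-weight sketch shows the shrinkage effect: adding `lam * w**2` to the squared-error objective contributes an extra `2 * lam * w` to the gradient, pulling the fitted weight toward zero. The function and its defaults are assumptions, not a library API.

```python
def fit(xs, ys, lam=0.0, lr=0.05, epochs=500):
    """Fit y ~ w*x by gradient descent on mean squared error + lam * w**2."""
    w = 0.0
    n = len(xs)
    for _ in range(epochs):
        grad = sum(2 * (w * x - y) * x for x, y in zip(xs, ys)) / n
        grad += 2 * lam * w        # L2 penalty term pulls w toward zero
        w -= lr * grad
    return w

# Without the penalty the data below give w = 2; with lam > 0 the weight shrinks:
w_ols = fit([1, 2, 3], [2, 4, 6], lam=0.0)
w_ridge = fit([1, 2, 3], [2, 4, 6], lam=1.0)
```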
Hidden layers process and transform input data through weighted connections and activation functions.
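A minimal forward pass through one hidden layer: each hidden unit takes a weighted sum of the inputs plus a bias, applies `tanh`, and the output combines the hidden activations. The hand-picked weights below (an illustrative assumption) make the network compute XOR, which no single linear layer can.

```python
import math

def forward(x, W1, b1, W2, b2):
    """Two-layer network: hidden layer = tanh(W1 x + b1), output = W2 . hidden + b2."""
    hidden = [math.tanh(sum(w * xi for w, xi in zip(row, x)) + b)
              for row, b in zip(W1, b1)]
    return sum(w * h for w, h in zip(W2, hidden)) + b2

# Hand-set weights implementing XOR on {0, 1} inputs:
W1, b1 = [[10, 10], [10, 10]], [-5, -15]
W2, b2 = [0.5, -0.5], 0.0
out = forward([1, 0], W1, b1, W2, b2)   # close to 1
```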