
Sample Quiz 2

CSCI-567 – Machine Learning

Fall 2024

1. Instruction tuning is often formulated as a supervised learning problem in which the model is fine-tuned on input-output pairs. If y_i is the expected output, x_i is the input, and z_i is the instruction, which of the following is the correct loss function? (A short illustrative sketch follows the answer choices.)

a) L(θ) = − log P_θ(x_i | y_i, z_i)

b) L(θ) = − log P_θ(y_i | x_i, z_i)

c) L(θ) = − log P_θ(z_i | x_i, y_i)

d) L(θ) = − log P_θ(z_i | x_i)
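For reference, a minimal sketch of how a loss of the form −log P_θ(y_i | x_i, z_i) is typically computed with a causal language model: the instruction and input are fed as context, and cross-entropy is taken only over the output tokens. The `model` object returning a `.logits` tensor and the helper name are assumptions for illustration, not course-provided code.

```python
# Illustrative sketch only: assumes a causal LM whose forward pass returns an
# object with a .logits tensor of shape (batch, seq_len, vocab_size).
import torch
import torch.nn.functional as F

def instruction_tuning_loss(model, context_ids, output_ids):
    """context_ids: tokens of [instruction z_i ; input x_i]; output_ids: tokens of y_i."""
    input_ids = torch.cat([context_ids, output_ids], dim=-1)  # full sequence fed to the LM
    logits = model(input_ids).logits                          # (1, seq_len, vocab)
    # Next-token prediction: position t predicts token t+1.
    shift_logits = logits[:, :-1, :]
    shift_labels = input_ids[:, 1:].clone()
    # Mask out the context so the loss covers only the output tokens y_i.
    shift_labels[:, : context_ids.size(-1) - 1] = -100
    return F.cross_entropy(
        shift_logits.reshape(-1, shift_logits.size(-1)),
        shift_labels.reshape(-1),
        ignore_index=-100,
    )
```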

2. In Gaussian Mixture Models (GMMs), each component is a Gaussian distribution characterized by its mean and covariance. When fitting a GMM to data, what is the role of the covariance matrices, and how do they affect the shape of the clusters? (See the sketch after the answer choices.)

a) The covariance matrices determine the orientation and shape of each cluster, allowing for elliptical clusters.

b) The covariance matrices are always identity matrices, resulting in spherical clusters.

c) The covariance matrices affect only the size but not the orientation of the clusters.

d) The covariance matrices are used to normalize the data before clustering.
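A small illustrative sketch of the idea behind this question, assuming scikit-learn is available and using synthetic toy data: with `covariance_type="full"` each component learns a full covariance matrix (orientation and shape, hence elliptical clusters), while `"spherical"` forces a single variance per component.

```python
import numpy as np
from sklearn.mixture import GaussianMixture

rng = np.random.default_rng(0)
# Two elongated, rotated Gaussian blobs (synthetic toy data).
A = rng.normal(size=(300, 2)) @ np.array([[3.0, 0.0], [2.0, 0.5]])
B = rng.normal(size=(300, 2)) @ np.array([[0.5, 1.5], [0.0, 2.0]]) + np.array([8.0, 8.0])
X = np.vstack([A, B])

# 'full' lets each component learn a full covariance matrix (tilted ellipses);
# 'spherical' restricts each component to one variance, i.e. circular clusters.
full = GaussianMixture(n_components=2, covariance_type="full", random_state=0).fit(X)
sph = GaussianMixture(n_components=2, covariance_type="spherical", random_state=0).fit(X)

print(full.covariances_.shape)  # (2, 2, 2): one 2x2 covariance matrix per component
print(sph.covariances_.shape)   # (2,): a single variance per component
```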

3. Which of the following best explains how boosting improves model performance? (A brief sketch follows the answer choices.)

a) By training all weak learners independently and averaging their outputs.

b) By using a single strong learner trained on the entire dataset.

c) By sequentially training weak learners where each learner focuses on correcting the errors of the previous ones.

d) By randomly selecting subsets of data and features to train each weak learner.
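As a concrete illustration (assuming scikit-learn; the dataset here is synthetic), AdaBoost fits weak learners one after another, re-weighting the examples the previous learners got wrong, so each new stump focuses on correcting earlier errors. `staged_predict` shows accuracy improving as more stumps are added.

```python
from sklearn.datasets import make_classification
from sklearn.ensemble import AdaBoostClassifier
from sklearn.model_selection import train_test_split

X, y = make_classification(n_samples=1000, n_features=20, random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)

# Default base learner is a depth-1 decision stump; 100 of them are fit in sequence,
# each trained on examples re-weighted toward the previous learners' mistakes.
clf = AdaBoostClassifier(n_estimators=100, random_state=0).fit(X_tr, y_tr)
print("test accuracy:", clf.score(X_te, y_te))

# Accuracy after 1, 10, and 100 boosting rounds.
for i, y_hat in enumerate(clf.staged_predict(X_te), start=1):
    if i in (1, 10, 100):
        print(f"after {i:3d} stumps: {(y_hat == y_te).mean():.3f}")
```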

4. Decision trees: Which one of the following statements is false? (A brief sketch follows the answer choices.)

a) Decision trees and AdaBoost both maintain the explainability of the model.

b) Decision trees can be applied to both classification and regression tasks.

c) Decision trees work with both real-valued and categorical data.

d) A decision tree is a linear model.
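To make the non-linearity point concrete, here is a tiny sketch (assuming scikit-learn; the XOR-style data is made up for illustration): a decision tree's axis-aligned splits can fit a pattern that no linear classifier can separate.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.tree import DecisionTreeClassifier

rng = np.random.default_rng(0)
X = rng.uniform(-1, 1, size=(500, 2))
y = (X[:, 0] * X[:, 1] > 0).astype(int)  # XOR-style labels: not linearly separable

# The tree carves the plane into quadrants and fits the data almost perfectly;
# the linear model cannot do better than roughly chance on this pattern.
print("tree  :", DecisionTreeClassifier(random_state=0).fit(X, y).score(X, y))
print("linear:", LogisticRegression().fit(X, y).score(X, y))
```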

5. The Naive Bayes classifier makes a strong assumption about the features in the dataset. Which of the following best describes this assumption? (A short sketch follows the answer choices.)

a) The features are linearly dependent on each other.

b) The features are conditionally independent given the class label.

c) The features follow a uniform distribution regardless of the class label.

d) The features have equal variance across all class labels.
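A minimal sketch of the conditional-independence assumption behind Naive Bayes, using made-up Gaussian class-conditional parameters (the means, standard deviations, and priors below are purely illustrative): given the class label, the class-conditional likelihood factorizes over features, so the log-likelihood is just a sum of per-feature terms.

```python
import numpy as np
from scipy.stats import norm

# Hypothetical per-class, per-feature Gaussian parameters (illustrative values only).
means = {0: np.array([0.0, 2.0]), 1: np.array([3.0, -1.0])}
stds = {0: np.array([1.0, 0.5]), 1: np.array([1.5, 1.0])}
priors = {0: 0.6, 1: 0.4}

def log_posterior_unnormalized(x):
    # log P(y) + sum_j log P(x_j | y): summing over features is exactly the
    # "conditionally independent given the class label" assumption.
    return {c: np.log(priors[c]) + norm.logpdf(x, means[c], stds[c]).sum()
            for c in priors}

print(log_posterior_unnormalized(np.array([2.5, -0.5])))
```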



