Top 350+ Solved Machine Learning (ML) MCQ Questions Answer

From 256 to 270 of 422

Q. The K-means algorithm:

a. requires the dimension of the feature space to be no bigger than the number of samples

b. has the smallest value of the objective function when k = 1

c. minimizes the within class variance for a given number of clusters

d. converges to the global optimum if and only if the initial means are chosen as some of the samples themselves

  • c. minimizes the within class variance for a given number of clusters
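To see why (c) is right, here is a minimal k-means sketch (not from the source; toy data and deterministic initialization are assumptions for the demo) that computes the within-cluster variance the algorithm minimizes for a fixed k:

```python
import numpy as np

def within_cluster_variance(X, labels, centers):
    """Sum of squared distances from each point to its assigned center --
    the objective k-means minimizes for a fixed number of clusters k."""
    return sum(float(np.sum((X[labels == j] - c) ** 2))
               for j, c in enumerate(centers))

def kmeans(X, k, n_iter=20):
    # deterministic init for the demo: first k points as initial centers
    centers = X[:k].copy()
    for _ in range(n_iter):
        # assignment step: each point goes to its nearest center
        labels = np.argmin(((X[:, None, :] - centers[None, :, :]) ** 2).sum(-1),
                           axis=1)
        # update step: each center moves to the mean of its cluster
        # (assumes no cluster goes empty, which holds for this toy data)
        centers = np.array([X[labels == j].mean(axis=0) for j in range(k)])
    return labels, centers

# two well-separated blobs
X = np.array([[0.0, 0.0], [0.1, 0.2], [0.2, 0.1],
              [5.0, 5.0], [5.1, 5.2], [5.2, 5.1]])
labels, centers = kmeans(X, k=2)
print(within_cluster_variance(X, labels, centers))
```

Note also why (b) and (d) fail: the objective is *smallest* when k equals the number of points (every point is its own center), and k-means only guarantees a local optimum regardless of how the initial means are chosen.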

Q. Is hierarchical clustering slower than non-hierarchical clustering?

a. true

b. false

c. depends on data

d. cannot say

  • a. true

Q. High entropy means that the partitions in classification are

a. pure

b. not pure

c. useful

d. useless

  • b. not pure
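A quick illustration (not part of the source quiz): entropy is 0 for a pure partition and grows as classes mix, so high entropy signals impure partitions.

```python
import math
from collections import Counter

def entropy(labels):
    """Shannon entropy of a class-label partition, in bits.
    0 for a pure partition; maximal when classes are evenly mixed."""
    n = len(labels)
    return -sum((c / n) * math.log2(c / n) for c in Counter(labels).values())

pure = ["a"] * 8              # one class only -> pure partition
mixed = ["a"] * 4 + ["b"] * 4  # evenly split between two classes
print(entropy(pure), entropy(mixed))
```

The pure partition scores 0 bits, the evenly mixed one scores the maximum of 1 bit for two classes.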

Q. The main disadvantage of maximum likelihood methods is that they are _____

a. mathematically less folded

b. mathematically less complex

c. mathematically less capable

d. computationally intense

  • d. computationally intense
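The computational cost comes from having to maximize the likelihood numerically whenever no closed form exists. A small sketch (toy data and the grid search are assumptions for the demo; real implementations use iterative optimizers) for a Gaussian mean, where the closed form happens to be the sample mean:

```python
import math

# Observed data (made up for the demo)
data = [1.2, 0.8, 1.1, 0.9, 1.0]

def log_likelihood(mu, sigma=1.0):
    """Gaussian log-likelihood of the data for a candidate mean mu."""
    return sum(-0.5 * math.log(2 * math.pi * sigma ** 2)
               - (x - mu) ** 2 / (2 * sigma ** 2) for x in data)

# Brute-force search over candidate means -- a stand-in for the numeric
# optimization that makes ML methods computationally intense in general.
candidates = [i / 100 for i in range(0, 201)]  # 0.00 .. 2.00
mle = max(candidates, key=log_likelihood)
print(mle, sum(data) / len(data))  # numeric MLE vs closed-form sample mean
```

Both routes land on the same estimate here; the point is that the numeric route evaluates the likelihood many times, and in higher dimensions that cost grows quickly.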

Q. Which statement is not true?

a. k-means clustering is a linear clustering algorithm.

b. k-means clustering aims to partition n observations into k clusters

c. k-nearest neighbor is the same as k-means

d. k-means is sensitive to outliers

  • c. k-nearest neighbor is the same as k-means

Q. what is Feature scaling done before applying K-Mean algorithm?

a. in distance calculation it will give the same weights for all features

b. you always get the same clusters. if you use or don\t use feature scaling

c. in manhattan distance it is an important step but in euclidian it is not

d. none of these

  • a. in distance calculation it will give the same weights for all features
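A small demonstration (feature names, values, and scaling bounds are invented for the demo): without scaling, the large-magnitude feature dominates the Euclidean distance that k-means relies on.

```python
import math

# Two samples with features on very different scales:
# (income in currency units, age in years)
a = (50_000.0, 30.0)
b = (52_000.0, 60.0)

def euclidean(p, q):
    return math.sqrt(sum((x - y) ** 2 for x, y in zip(p, q)))

# Unscaled: the income axis dominates the distance almost entirely,
# even though the age difference (30 years) is relatively much larger.
raw = euclidean(a, b)

def scale(p, lo=(0.0, 0.0), hi=(100_000.0, 100.0)):
    """Min-max scale each feature to [0, 1] (bounds assumed for the demo)."""
    return tuple((x - l) / (h - l) for x, l, h in zip(p, lo, hi))

# Scaled: both features now contribute on comparable terms.
scaled = euclidean(scale(a), scale(b))
print(raw, scaled)
```

Scaling matters for any distance metric (Manhattan included), which is why option (c) is wrong as well.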

Q. With Bayes' theorem, the probability of hypothesis H, specified by P(H), is referred to as

a. a conditional probability

b. an a priori probability

c. a bidirectional probability

d. a posterior probability

  • b. an a priori probability

Q. What is the naïve assumption in a Naïve Bayes classifier?

a. all the classes are independent of each other

b. all the features of a class are independent of each other

c. the most probable feature for a class is the most important feature to be considered for classification

d. all the features of a class are conditionally dependent on each other

  • b. all the features of a class are independent of each other
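Because the features are assumed conditionally independent given the class, the class-conditional likelihood factorizes into a product of per-feature probabilities. A tiny categorical sketch (the toy weather data is invented for the demo):

```python
# Naive Bayes on toy data: the "naive" assumption lets us write
# P(x1, x2 | c) = P(x1 | c) * P(x2 | c)
train = [  # (outlook, windy) -> play
    (("sunny", "no"), "yes"),
    (("sunny", "yes"), "no"),
    (("rainy", "yes"), "no"),
    (("sunny", "no"), "yes"),
]
classes = {c for _, c in train}

def p_class(c):
    """Prior P(c) estimated from class frequencies."""
    return sum(1 for _, y in train if y == c) / len(train)

def p_feature(i, v, c):
    """Per-feature conditional P(x_i = v | c) estimated by counting."""
    rows = [x for x, y in train if y == c]
    return sum(1 for x in rows if x[i] == v) / len(rows)

def predict(x):
    # score(c) is proportional to P(c) * product over i of P(x_i | c),
    # i.e. the factorized likelihood the naive assumption allows
    def score(c):
        s = p_class(c)
        for i, v in enumerate(x):
            s *= p_feature(i, v, c)
        return s
    return max(classes, key=score)

print(predict(("sunny", "no")))
```

A production classifier would add smoothing for unseen feature values and work in log space to avoid underflow; the factorized product is the part the "naïve" assumption buys.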