Top 350+ Solved Machine Learning (ML) MCQ Questions & Answers


Q. Having multiple perceptrons can solve the XOR problem: each perceptron can partition off a linearly separable part of the space, and their results can then be combined.

a. true – this always works, and these multiple perceptrons can learn to classify even complex problems

b. false – perceptrons are mathematically incapable of solving linearly inseparable functions, no matter what you do

c. true – perceptrons can do this but are unable to learn to do it – they have to be explicitly hand-coded

d. false – just having a single perceptron is enough

  • c. true – perceptrons can do this but are unable to learn to do it – they have to be explicitly hand-coded
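As a rough illustration of answer (c), the sketch below (weights chosen by hand for illustration, not taken from the source) builds XOR from three threshold perceptrons: an OR unit and a NAND unit each carve off a linear half-space, and an AND unit combines them. The classic perceptron learning rule has no way to discover the hidden-unit weights, which is why they are hand-coded here.

```python
import numpy as np

# Hand-coded threshold units: OR and NAND feed an AND unit to compute XOR.
def step_unit(x, w, b):
    return int(np.dot(w, x) + b > 0)

def xor(x1, x2):
    h1 = step_unit([x1, x2], w=[1.0, 1.0], b=-0.5)    # OR half-space
    h2 = step_unit([x1, x2], w=[-1.0, -1.0], b=1.5)   # NAND half-space
    return step_unit([h1, h2], w=[1.0, 1.0], b=-1.5)  # AND of the two

for a, b in [(0, 0), (0, 1), (1, 0), (1, 1)]:
    print(a, b, "->", xor(a, b))   # prints 0, 1, 1, 0
```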

Q. Which one of the following is not a major strength of the neural network approach?

a. neural network learning algorithms are guaranteed to converge to an optimal solution

b. neural networks work well with datasets containing noisy data

c. neural networks can be used for both supervised learning and unsupervised clustering

d. neural networks can be used for applications that require a time element to be included in the data

  • a. neural network learning algorithms are guaranteed to converge to an optimal solution

Q. A network that involves backward links from the output to the input and hidden layers is called

a. self-organizing maps

b. perceptrons

c. recurrent neural network

d. multi-layered perceptron

  • c. recurrent neural network
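A minimal sketch of the feedback the answer refers to (illustrative shapes and random weights, purely hypothetical): the hidden state produced at one step is fed back into the network at the next step.

```python
import numpy as np

# The hidden state h is fed back into the cell at every time step.
rng = np.random.default_rng(0)
W_xh = rng.normal(size=(3, 4))            # input-to-hidden weights
W_hh = rng.normal(size=(4, 4))            # hidden-to-hidden (recurrent) weights
h = np.zeros(4)

sequence = rng.normal(size=(5, 3))        # 5 time steps, 3 features each
for x_t in sequence:
    h = np.tanh(x_t @ W_xh + h @ W_hh)    # previous output re-enters the network
print(h)
```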

Q. In which neural network architecture does weight sharing occur?

a. recurrent neural network

b. convolutional neural network

c. fully connected neural network

d. both a and b

  • d. both a and b
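A minimal sketch (assumed sizes) of what weight sharing means for option (b): the same three-tap kernel is applied at every position of the input. In a recurrent network (option a) the same hidden-to-hidden matrix is likewise reused at every time step, whereas a fully connected layer learns a separate weight for every input position.

```python
import numpy as np

# One shared kernel slides over the whole input (convolutional weight sharing).
x = np.arange(8, dtype=float)
kernel = np.array([1.0, -1.0, 0.5])                    # a single set of weights...
conv_out = [float(np.dot(kernel, x[i:i + 3]))          # ...applied at every window
            for i in range(len(x) - len(kernel) + 1)]
print(conv_out)
```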

Q. When does a neural network model become a deep learning model?

a. when you add more hidden layers and increase the depth of the neural network

b. when there is higher dimensionality of data

c. when the problem is an image recognition problem

d. when there is lower dimensionality of data

  • a. when you add more hidden layers and increase the depth of the neural network
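A minimal sketch of the answer (arbitrary layer sizes and random weights, assumed for illustration): depth is simply the number of stacked hidden layers, so the same forward pass with more weight matrices turns a shallow network into a deep one.

```python
import numpy as np

rng = np.random.default_rng(0)

def forward(x, layer_sizes):
    # Stack one ReLU layer per consecutive pair of sizes.
    for n_in, n_out in zip(layer_sizes[:-1], layer_sizes[1:]):
        W = rng.normal(size=(n_in, n_out))
        x = np.maximum(0.0, x @ W)
    return x

x = rng.normal(size=(1, 8))
shallow = forward(x, [8, 16, 1])           # one hidden layer
deep    = forward(x, [8, 16, 16, 16, 1])   # more hidden layers -> "deep" model
```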

Q. The F-test

a. is an omnibus test

b. considers the reduction in error when moving from the complete model to the reduced model

c. considers the reduction in error when moving from the reduced model to the complete model

d. can only be conceptualized as a reduction in error

  • c. considers the reduction in error when moving from the reduced model to the complete model
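A minimal sketch, with made-up sums of squared errors and degrees of freedom (not from the source), of the reduction-in-error idea in the keyed answer: the F statistic compares the drop in error when moving from the reduced model to the complete model against the error of the complete model.

```python
# Hypothetical SSE and degrees-of-freedom values (assumed for illustration).
sse_reduced, df_reduced   = 120.0, 18   # model with fewer predictors
sse_complete, df_complete = 80.0, 15    # model with all predictors

# F = [(SSE_reduced - SSE_complete) / (df_reduced - df_complete)] / (SSE_complete / df_complete)
F = ((sse_reduced - sse_complete) / (df_reduced - df_complete)) / (sse_complete / df_complete)
print(F)   # 2.5 here, compared against an F(3, 15) critical value
```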