
Supervised Machine Learning: Regression and Classification Quiz Answers

Below are the quiz answers for all weeks of Supervised Machine Learning: Regression and Classification.

Supervised Machine Learning: Regression and Classification Week 01 Quiz Answers

Quiz 1: Supervised vs. unsupervised learning Quiz Answers

Q1. Which are the two common types of supervised learning? (Choose two)

Answer:
Regression
Classification

Q2. Which of these is a type of unsupervised learning?

Answer: Clustering

Quiz 2: Regression Quiz Answers

Q1. For linear regression, the model is f_{w,b}(x) = wx + b.

Which of the following are the inputs, or features, that are fed into the model and with which the model is expected to make a prediction?

Answer: x
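
As a quick illustration of the model, here is a minimal Python/NumPy sketch; the values of w, b, and the input x below are made up for the example:

```python
import numpy as np

def f_wb(x, w, b):
    """Linear regression model: f_{w,b}(x) = w*x + b."""
    return w * x + b

# Illustrative (made-up) parameter values and input feature
w, b = 200.0, 100.0
x = 1.2  # x is the input feature fed into the model

print(f_wb(x, w, b))  # prediction: 340.0
```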

Q2. For linear regression, if you find parameters w and b so that J(w,b) is very close to zero, what can you conclude?

Answer: The selected values of the parameters w and b cause the algorithm to fit the training set really well.
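
For context, the cost here is the squared-error cost J(w,b) = \frac{1}{2m}\sum_{i=1}^{m}(f_{w,b}(x^{(i)}) - y^{(i)})^2. A minimal sketch with a made-up, exactly linear training set shows how the right w and b drive J to zero:

```python
import numpy as np

def compute_cost(x, y, w, b):
    """Squared-error cost J(w,b) = (1/(2m)) * sum((w*x + b - y)^2)."""
    m = x.shape[0]
    return np.sum((w * x + b - y) ** 2) / (2 * m)

# Made-up training set that happens to be exactly linear,
# so the right (w, b) drives J(w,b) to zero.
x = np.array([1.0, 2.0, 3.0])
y = np.array([300.0, 500.0, 700.0])

print(compute_cost(x, y, w=200.0, b=100.0))  # 0.0 -> fits the training set perfectly
print(compute_cost(x, y, w=100.0, b=100.0))  # large -> poor fit
```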

Quiz 3: Train the model with gradient descent Quiz Answers

Q1. Gradient descent is an algorithm for finding values of parameters w and b that minimize the cost function J.

When \frac{\partial J(w,b)}{\partial w} is a negative number (less than zero), what happens to w after one update step?

Answer: w increases.

Q2. For linear regression, what is the update step for parameter b?

Answer: b = b - \alpha \frac{1}{m} \sum\limits_{i=1}^{m} (f_{w,b}(x^{(i)}) - y^{(i)})
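
A minimal sketch of one gradient descent update for linear regression, using made-up data; it also illustrates the previous answer, since the negative \partial J/\partial w pushes w upward:

```python
import numpy as np

def gradient_step(x, y, w, b, alpha):
    """One gradient descent update for linear regression with squared-error cost."""
    m = x.shape[0]
    err = (w * x + b) - y                 # f_{w,b}(x^(i)) - y^(i)
    dj_dw = np.sum(err * x) / m           # partial J / partial w
    dj_db = np.sum(err) / m               # partial J / partial b
    w_new = w - alpha * dj_dw
    b_new = b - alpha * dj_db
    return w_new, b_new, dj_dw

# Made-up data; the starting w is too small, so dJ/dw is negative and w increases.
x = np.array([1.0, 2.0, 3.0])
y = np.array([300.0, 500.0, 700.0])
w, b, alpha = 0.0, 0.0, 0.01

w1, b1, dj_dw = gradient_step(x, y, w, b, alpha)
print(dj_dw < 0, w1 > w)  # True True: negative gradient -> w moves up
```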

Week 2 Quiz Answers

Quiz 1: Multiple Linear Regression Quiz Answers

Q1. In the training set below, what is x_4^{(3)}? Please type in the number below (this is an integer such as 123, no decimal points).

Answer: 125

Q2. Which of the following are the potential benefits of vectorization? Please choose the best option.

Answer:
It makes your code run faster
It can make your code shorter
It allows your code to run more easily on parallel computing hardware
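
A small, illustrative timing comparison of a Python loop against NumPy's vectorized np.dot (the vector size is arbitrary, and exact timings depend on your machine):

```python
import numpy as np
import time

# Made-up vectors just to illustrate the benefit of vectorization.
n = 1_000_000
w = np.random.rand(n)
x = np.random.rand(n)

# Non-vectorized dot product
start = time.time()
total = 0.0
for j in range(n):
    total += w[j] * x[j]
loop_time = time.time() - start

# Vectorized dot product (runs in optimized, possibly parallel, native code)
start = time.time()
total_vec = np.dot(w, x)
vec_time = time.time() - start

print(np.isclose(total, total_vec), loop_time > vec_time)  # same result, much faster
```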

Q3. True/False? To make gradient descent converge about twice as fast, a technique that almost always works is to double the learning rate \alpha.

Answer: False

Quiz 2: Gradient descent in practice Quiz Answers

Q1. Which of the following is a valid step used during feature scaling?

Answer: Subtract the mean (average) from each value and then divide by the (max - min).
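
A minimal sketch of that mean-normalization step, using made-up feature values:

```python
import numpy as np

def mean_normalize(x):
    """Mean normalization: (x - mean) / (max - min), one valid feature-scaling step."""
    return (x - np.mean(x)) / (np.max(x) - np.min(x))

# Made-up feature with a large range (e.g. house sizes in square feet)
sizes = np.array([852.0, 1416.0, 1534.0, 2104.0])
print(mean_normalize(sizes))  # values now fall roughly in [-0.5, 0.5]
```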

Q2. Suppose a friend ran gradient descent three separate times with three choices of the learning rate \alpha and plotted the learning curves for each (cost J for each iteration).

For which case, A or B, was the learning rate \alpha likely too large?

Answer: case B only

Q3. Of the circumstances below, for which is feature scaling particularly helpful?

Answer: Feature scaling is helpful when one feature is much larger (or smaller) than another feature.

Q4. You are helping a grocery store predict its revenue, and have data on its items sold per week and price per item. What could be a useful engineered feature?

Answer: For each product, calculate the number of items sold times the price per item.
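
A minimal sketch of that engineered feature, with made-up weekly numbers:

```python
import numpy as np

# Made-up weekly data: items sold and price per item for each product
items_sold = np.array([120, 430, 87])
price_per_item = np.array([2.50, 1.20, 5.00])

# Engineered feature: revenue contribution per product (items sold * price per item)
revenue_per_product = items_sold * price_per_item
print(revenue_per_product)  # [300. 516. 435.]
```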

Q5. True/False? With polynomial regression, the predicted values f_{w,b}(x) do not necessarily have to be a straight line (or linear) function of the input feature x.

Answer: True
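
A minimal sketch of polynomial regression as linear regression on engineered features x and x^2; the data is made up, and the fit uses NumPy's least-squares solver rather than gradient descent purely for brevity:

```python
import numpy as np

# Made-up nonlinear data: y grows roughly with the square of x
x = np.array([1.0, 2.0, 3.0, 4.0, 5.0])
y = np.array([1.2, 4.1, 9.3, 15.8, 24.9])

# Polynomial regression = linear regression on engineered features x, x^2
X = np.column_stack([x, x ** 2])

# Fit with least squares (a column of ones provides the intercept b)
A = np.column_stack([np.ones_like(x), X])
theta, *_ = np.linalg.lstsq(A, y, rcond=None)
b, w1, w2 = theta
print(b, w1, w2)  # the fitted curve b + w1*x + w2*x^2 is not a straight line in x
```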

Week 3 Quiz Answers

Quiz 1: Classification with Logistic Regression Quiz Answers

Q1. Which is an example of a classification task?

Answer: Based on the size of each tumor, determine if each tumor is malignant (cancerous) or not.

Q2. Recall the sigmoid function is g(z) = \frac{1}{1+e^{-z}}. If z is a large positive number, then:

Answer: g(z) will be near one (1)
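
A minimal sketch of the sigmoid, showing its behavior for large positive and large negative z:

```python
import numpy as np

def sigmoid(z):
    """Sigmoid function g(z) = 1 / (1 + e^(-z))."""
    return 1.0 / (1.0 + np.exp(-z))

print(sigmoid(100.0))   # ~1.0  (large positive z)
print(sigmoid(-100.0))  # ~0.0  (large negative z)
print(sigmoid(0.0))     # 0.5
```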

Q3. A cat photo classification model predicts 1 if it’s a cat, and 0 if it’s not a cat. For a particular photograph, the logistic regression model outputs g(z) (a number between 0 and 1). Which of these would be a reasonable criterion to decide whether to predict if it’s a cat?

Answer: Predict it is a cat if g(z) >= 0.5
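
A minimal sketch of that 0.5 threshold rule (the z values are made up):

```python
import numpy as np

def predict_is_cat(z, threshold=0.5):
    """Predict 1 (cat) when the sigmoid output g(z) meets the 0.5 threshold."""
    g = 1.0 / (1.0 + np.exp(-z))
    return int(g >= threshold)

print(predict_is_cat(2.3))   # 1 -> predict cat (g(z) is about 0.91)
print(predict_is_cat(-1.7))  # 0 -> predict not cat (g(z) is about 0.15)
```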

Q4. True/False? No matter what features you use (including if you use polynomial features), the decision boundary learned by logistic regression will be a linear decision boundary.

Answer: False

Quiz 2: Cost function for logistic regression Quiz Answers

Q1. In this lecture series, “cost” and “loss” have distinct meanings. Which one applies to a single training example?

Answer: Loss
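
A minimal sketch of the distinction: the loss is computed on a single example, and the cost averages the losses over the training set (the model outputs and labels below are made up):

```python
import numpy as np

def logistic_loss(f_x, y):
    """Loss for a single training example (binary cross-entropy)."""
    return -y * np.log(f_x) - (1 - y) * np.log(1 - f_x)

def cost(f_all, y_all):
    """Cost = average of the per-example losses over the whole training set."""
    return np.mean(logistic_loss(f_all, y_all))

# Made-up model outputs and labels for three examples
f_all = np.array([0.9, 0.2, 0.7])
y_all = np.array([1, 0, 1])

print(logistic_loss(f_all[0], y_all[0]))  # loss on one example
print(cost(f_all, y_all))                 # cost over the training set
```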

Quiz 3: Gradient descent for logistic regression

Q1. Which of the following two statements is a more accurate statement about gradient descent for logistic regression?

Answer: The update steps look like the update steps for linear regression, but the definition of f_{\vec{w},b}(\mathbf{x}^{(i)}) is different.
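
A minimal sketch of one gradient descent step for logistic regression on made-up data; the update has the same form as linear regression, only f is now the sigmoid of the linear function:

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def logistic_gradient_step(X, y, w, b, alpha):
    """Same update form as linear regression, but f is sigmoid(X @ w + b)."""
    m = X.shape[0]
    f = sigmoid(X @ w + b)          # the only change vs. linear regression
    err = f - y
    w = w - alpha * (X.T @ err) / m
    b = b - alpha * np.sum(err) / m
    return w, b

# Made-up tiny dataset: 4 examples, 2 features
X = np.array([[0.5, 1.5], [1.0, 1.0], [1.5, 0.5], [3.0, 0.5]])
y = np.array([0, 0, 1, 1])
w, b = np.zeros(2), 0.0

for _ in range(1000):
    w, b = logistic_gradient_step(X, y, w, b, alpha=0.1)
print(w, b)
```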

Quiz 4: The problem of overfitting

Q1. Which of the following can address overfitting?

Answer:
Collect more training data
Select a subset of the more relevant features
Apply regularization

Q2. You fit logistic regression with polynomial features to a dataset, and your model looks like this.

What would you conclude? (Pick one)

Answer: The model has high variance (overfit). Thus, adding data is, by itself, unlikely to help much.

Q3. Suppose you have a regularized linear regression model. If you increase the regularization parameter \lambda, what do you expect to happen to the parameters w_1, w_2, \ldots, w_n?

Answer: This will reduce the size of the parameters w_1, w_2, \ldots, w_n
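
A minimal sketch of that shrinking effect; for brevity it uses the closed-form ridge (regularized least-squares) solution on made-up data rather than gradient descent, and omits the intercept term:

```python
import numpy as np

def ridge_fit(X, y, lam):
    """Regularized (ridge) linear regression via the normal equation.
    A larger lambda penalizes large weights, shrinking w toward zero."""
    n = X.shape[1]
    return np.linalg.solve(X.T @ X + lam * np.eye(n), X.T @ y)

# Made-up data with known underlying weights
rng = np.random.default_rng(0)
X = rng.normal(size=(50, 3))
y = X @ np.array([4.0, -2.0, 7.0]) + rng.normal(scale=0.5, size=50)

for lam in [0.0, 10.0, 1000.0]:
    w = ridge_fit(X, y, lam)
    print(lam, np.round(w, 3))  # the weights shrink as lambda grows
```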

Get All Course Quiz Answers of Machine Learning Specialization

Supervised Machine Learning: Regression and Classification Quiz Answers

Advanced Learning Algorithms Coursera Quiz Answers

Unsupervised Learning, Recommenders, Reinforcement Learning Quiz Answers
