Supervised Machine Learning: Regression and Classification Quiz Answers


Supervised Machine Learning: Regression and Classification Week 01 Quiz Answers

Quiz 1: Supervised vs. unsupervised learning Quiz Answers

Q1. Which are the two common types of supervised learning? (Choose two)

Classification

Regression

Q2. Which of these is a type of unsupervised learning?

Clustering

Quiz 2: Regression Quiz Answers

Q1. For linear regression, the model is f_{w,b}(x) = wx + b.

Which of the following are the inputs, or features, that are fed into the model and with which the model is expected to make a prediction?

x, the input features

Q2. For linear regression, if you find parameters w and b so that J(w,b) is very close to zero, what can you conclude?

The selected values of the parameters ww and bb cause the algorithm to fit the training set really well.
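A near-zero cost can be checked numerically. Below is a minimal sketch (the function name and the toy dataset are illustrative, not from the course): when the parameters match the line the data was generated from, J(w,b) is exactly zero.

```python
import numpy as np

def compute_cost(x, y, w, b):
    """Mean squared error cost: J(w,b) = (1/2m) * sum((w*x + b - y)^2)."""
    m = x.shape[0]
    predictions = w * x + b
    return np.sum((predictions - y) ** 2) / (2 * m)

# Toy training set lying exactly on the line y = 2x + 1
x_train = np.array([1.0, 2.0, 3.0])
y_train = np.array([3.0, 5.0, 7.0])

print(compute_cost(x_train, y_train, w=2.0, b=1.0))  # 0.0: a perfect fit
print(compute_cost(x_train, y_train, w=0.0, b=0.0))  # much larger cost
```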

Quiz 3: Train the model with gradient descent Quiz Answers

Q1. Gradient descent is an algorithm for finding values of parameters w and b that minimize the cost function J.

When \frac{\partial J(w,b)}{\partial w} is a negative number (less than zero), what happens to w after one update step?

w increases (the update w = w - \alpha \frac{\partial J(w,b)}{\partial w} subtracts a negative quantity)

Q2. For linear regression, what is the update step for parameter b?

b = b - \alpha \frac{1}{m} \sum\limits_{i=1}^{m} (f_{w,b}(x^{(i)}) - y^{(i)})
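The two update steps can be sketched together; note the b-update averages the raw errors, while the w-update weights each error by its x^{(i)}. This is a minimal illustration with a made-up dataset; the function name, learning rate, and iteration count are all assumptions:

```python
import numpy as np

def gradient_step(x, y, w, b, alpha):
    """One gradient-descent update for linear regression f(x) = w*x + b."""
    m = x.shape[0]
    err = (w * x + b) - y           # f_{w,b}(x^(i)) - y^(i)
    dj_dw = np.sum(err * x) / m     # dJ/dw: error weighted by x^(i)
    dj_db = np.sum(err) / m         # dJ/db: plain average error, no x factor
    return w - alpha * dj_dw, b - alpha * dj_db

# Toy data lying exactly on y = 2x + 1
x_train = np.array([1.0, 2.0, 3.0])
y_train = np.array([3.0, 5.0, 7.0])

w, b = 0.0, 0.0
for _ in range(10000):
    w, b = gradient_step(x_train, y_train, w, b, alpha=0.01)
# w approaches 2, b approaches 1
```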

Week 2 Quiz Answers

Quiz 1: Multiple Linear Regression Quiz Answers

Q1. In the training set below, what is x_4^{(3)}? Please type in the number below (this is an integer such as 123, no decimal points).


Q2. Which of the following are the potential benefits of vectorization? Please choose the best option.

It makes your code run faster

It can make your code shorter

It allows your code to run more easily on parallel computing hardware
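All three benefits show up in the dot product, the core operation in f = \vec{w} \cdot \vec{x} + b. Both functions below compute the same value; the names and array size are illustrative:

```python
import numpy as np

rng = np.random.default_rng(0)
w = rng.random(100_000)
x = rng.random(100_000)

# Unvectorized: one multiply-add per loop iteration, in interpreted Python
def dot_loop(w, x):
    total = 0.0
    for w_j, x_j in zip(w, x):
        total += w_j * x_j
    return total

# Vectorized: shorter code, and NumPy dispatches to optimized routines
# that can exploit SIMD and parallel hardware
def dot_vec(w, x):
    return np.dot(w, x)
```

Both return the same sum; the vectorized call is typically orders of magnitude faster on large arrays.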

Q3. True/False? To make gradient descent converge about twice as fast, a technique that almost always works is to double the learning rate \alpha.

False

Quiz 2: Gradient descent in practice Quiz Answers

Q1. Which of the following is a valid step used during feature scaling?

Subtract the mean (average) from each value and then divide by the (max – min).
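That step (mean normalization) can be sketched as follows; the `sizes` values are made up for illustration:

```python
import numpy as np

def mean_normalize(x):
    """Mean normalization: subtract the mean, then divide by (max - min)."""
    return (x - x.mean()) / (x.max() - x.min())

sizes = np.array([1000.0, 1500.0, 2000.0])   # hypothetical house sizes (sq ft)
scaled = mean_normalize(sizes)
# scaled == [-0.5, 0.0, 0.5]: centered at zero with a range of 1
```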

Q2. Suppose a friend ran gradient descent three separate times with three choices of the learning rate \alpha and plotted the learning curves for each (cost J for each iteration).

For which case, A or B, was the learning rate \alpha likely too large?

Case B only

Q3. Of the circumstances below, for which is feature scaling particularly helpful?

Feature scaling is helpful when one feature is much larger (or smaller) than another feature.

Q4. You are helping a grocery store predict its revenue, and have data on its items sold per week and price per item. What could be a useful engineered feature?

For each product, calculate the number of items sold times the price per item.

Q5. True/False? With polynomial regression, the predicted values f_{w,b}(x) do not necessarily have to be a straight line (or linear) function of the input feature x.

True
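A small sketch of why: with engineered features x, x², x³, the model remains linear in its parameters yet curves in x. The arrays and weights below are illustrative, not from the course:

```python
import numpy as np

x = np.array([1.0, 2.0, 3.0])

# Engineered polynomial features: columns are x, x^2, x^3
X_poly = np.column_stack([x, x**2, x**3])

# Hypothetical weights that pick out the x^2 feature
w = np.array([0.0, 1.0, 0.0])
b = 0.0

# Linear in (w, b), but a parabola in the original input x
preds = X_poly @ w + b   # [1, 4, 9]
```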

Week 3 Quiz Answers

Quiz 1: Classification with Logistic Regression Quiz Answers

Q1. Which is an example of a classification task?

Based on the size of each tumor, determine if each tumor is malignant (cancerous) or not.

Q2. Recall that the sigmoid function is g(z) = \frac{1}{1+e^{-z}}. If z is a large positive number, then:

g(z) will be near one (1)

Q3. A cat photo classification model predicts 1 if it’s a cat, and 0 if it’s not a cat. For a particular photograph, the logistic regression model outputs g(z) (a number between 0 and 1). Which of these would be a reasonable criteria to decide whether to predict if it’s a cat?

Predict it is a cat if g(z) >= 0.5
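Both the sigmoid's limiting behavior and the 0.5 threshold can be sketched in a few lines (the function names are illustrative):

```python
import numpy as np

def sigmoid(z):
    """g(z) = 1 / (1 + e^(-z))."""
    return 1.0 / (1.0 + np.exp(-z))

def predict_cat(z, threshold=0.5):
    """Predict 1 (cat) when g(z) >= threshold, else 0 (not a cat)."""
    return 1 if sigmoid(z) >= threshold else 0

# For a large positive z, g(z) approaches 1; for large negative z, it
# approaches 0. Since g(0) = 0.5, z >= 0 maps to a "cat" prediction
# at the 0.5 threshold.
print(sigmoid(100.0))     # very close to 1
print(predict_cat(2.0))   # 1
print(predict_cat(-2.0))  # 0
```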

Q4. True/False? No matter what features you use (including if you use polynomial features), the decision boundary learned by logistic regression will be a linear decision boundary.

False

Quiz 2: Cost function for logistic regression Quiz Answers

Q1. In this lecture series, “cost” and “loss” have distinct meanings. Which one applies to a single training example?

The loss

Quiz 3: Gradient descent for logistic regression

Q1. Which of the following two statements is a more accurate statement about gradient descent for logistic regression?

The update steps look like the update steps for linear regression, but the definition of f_{\vec{w},b}(\mathbf{x}^{(i)}) is different.
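A sketch of that point: the update code below is structurally the same as linear regression's, with the sigmoid applied inside f. The toy dataset, learning rate, and step count are assumptions for illustration:

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def gradient_step(X, y, w, b, alpha):
    """One update step. Identical in form to linear regression's updates,
    except f_{w,b}(x) = sigmoid(w . x + b) instead of w . x + b."""
    m = X.shape[0]
    err = sigmoid(X @ w + b) - y          # the only difference is here
    w_new = w - alpha * (X.T @ err) / m
    b_new = b - alpha * np.sum(err) / m
    return w_new, b_new

# Toy one-feature dataset: label 1 when x >= 2
X_train = np.array([[0.0], [1.0], [2.0], [3.0]])
y_train = np.array([0.0, 0.0, 1.0, 1.0])

w, b = np.zeros(1), 0.0
for _ in range(5000):
    w, b = gradient_step(X_train, y_train, w, b, alpha=0.5)

preds = (sigmoid(X_train @ w + b) >= 0.5).astype(float)
# preds matches y_train for this separable toy set
```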

Quiz 4: The problem of overfitting

Q1. Which of the following can address overfitting?

Collect more training data

Select a subset of the more relevant features.

Apply regularization

Q2. You fit logistic regression with polynomial features to a dataset, and your model looks like this.

What would you conclude? (Pick one)

The model has a high variance (overfit). Thus, adding data is, by itself, unlikely to help much.

Q3. Suppose you have a regularized linear regression model. If you increase the regularization parameter \lambda, what do you expect to happen to the parameters w_1, w_2, \dots, w_n?

This will reduce the size of the parameters w_1, w_2, \dots, w_n
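A quick way to see the shrinking effect: for a single feature with no intercept, the regularized least-squares cost (1/2m) \sum (w x_i - y_i)^2 + (\lambda/2m) w^2 has the closed-form minimizer w = \sum x_i y_i / (\sum x_i^2 + \lambda), so a larger \lambda directly shrinks w. The function name and dataset are illustrative:

```python
import numpy as np

def ridge_w(x, y, lam):
    """Closed-form minimizer of (1/2m)*sum((w*x_i - y_i)^2) + (lam/2m)*w^2
    for a single feature with no intercept: w = sum(x*y) / (sum(x*x) + lam)."""
    return np.sum(x * y) / (np.sum(x * x) + lam)

x = np.array([1.0, 2.0, 3.0])
y = np.array([2.0, 4.0, 6.0])    # exactly y = 2x

w0 = ridge_w(x, y, lam=0.0)      # 2.0: the unregularized fit
w1 = ridge_w(x, y, lam=10.0)     # shrunk below 2.0
w2 = ridge_w(x, y, lam=100.0)    # shrunk further still
```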

