Bayesian Statistics: Time Series Analysis Coursera Quiz Answers

Practice Quiz: Objectives of the course

Q1. In this course we will focus on models that assume that (mark all the options that apply):

  • The observations are realizations from spatial processes, where the random variables are spatially related
  • The observations are realizations from time series processes, where the random variables are temporally related
  • The observations are realizations from independent random variables

Q2. In this course we will focus on the following topics

  • Some classes of models for non-stationary time series
  • Models for univariate time series
  • Models for multivariate time series
  • Some classes of models for stationary time series

Q3. Some of the goals of time series analysis that we will illustrate in this course include:

  • Online monitoring
  • Analysis and inference
  • Forecasting
  • Clustering

Q4. In this course we will study models and methods for

  • Equally spaced time series processes
  • Discrete time processes
  • Unequally spaced time series processes
  • Continuous time processes

Q5. In this course you will learn about

  • Nonparametric methods of estimation for time series analysis
  • Normal dynamic linear models for non-stationary univariate time series
  • Bayesian inference and forecasting for some classes of time series models
  • Spatio-temporal models
  • Non-linear dynamic models for non-stationary time series
  • Autoregressive processes

Quiz: Stationarity, the ACF, and the PACF

Q1. Y_t - Y_{t-1} = e_t - 0.8 e_{t-1}

How is this process written using backshift operator notation (B)?

  • (1 - B)Y_t = (1 - 0.8B)e_t
  • None of the above
  • B Y_t = (1 - 0.8B)e_t
  • B(Y_t - Y_{t-1}) = 0.8 B e_t

Q3. If {Y_t} is a strongly stationary time series process with finite first and second moments, the following statements are true:

  • {Y_t} is also weakly or second order stationary
  • {Y_t} is a Gaussian process
  • The variance of Y_t, Var(Y_t), changes over time
  • The expected value of Y_t, E(Y_t), does not depend on t.

Q4. If {Y_t} is weakly or second order stationary with finite first and second moments, the following statements are true:

  • If {Y_t} is also a Gaussian process then {Y_t} is strongly stationary
  • {Y_t} is also strongly stationary
  • None of the above

Q5. Which of the following moving averages can be used to remove a period d = 8 from a time series?

  • (1/16)y_{t-4} + (1/8)(y_{t-3} + y_{t-2} + y_{t-1} + y_t + y_{t+1} + y_{t+2} + y_{t+3}) + (1/16)y_{t+4}
  • (1/8) Σ_{j=-8}^{8} y_{t+j}
  • (1/2)(y_{t-4} + y_{t-3} + y_{t-2} + y_{t-1} + y_t + y_{t+1} + y_{t+2} + y_{t+3} + y_{t+4})
  • (1/8)(y_{t-4} + y_{t-3} + y_{t-2} + y_{t-1} + y_t + y_{t+1} + y_{t+2} + y_{t+3} + y_{t+4})

Q6. Which of the following moving averages can be used to remove a period d = 3 from a time series?

  • (1/2)(y_{t-1} + y_t + y_{t+1})
  • (1/3)(y_{t-1} + y_t + y_{t+1})
  • None of the above
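As a quick check on the two smoothing questions above (not part of the quiz itself), the sketch below applies candidate filters to pure sinusoids with periods 8 and 3: the symmetric weights 1/16, 1/8, ..., 1/8, 1/16 average a period-8 cycle away, and the simple 1/3 average does the same for a period-3 cycle. The series and values are made up for illustration.

```python
import numpy as np

t = np.arange(200)
s8 = np.sin(2 * np.pi * t / 8)   # a pure period-8 seasonal component
s3 = np.sin(2 * np.pi * t / 3)   # a pure period-3 seasonal component

# Symmetric filter for an even period d = 8: half weights (1/16) at the two ends
w8 = np.r_[1 / 16, np.repeat(1 / 8, 7), 1 / 16]   # 9 weights, summing to 1
# Simple 3-point average for an odd period d = 3
w3 = np.repeat(1 / 3, 3)

# Both filters annihilate their target periodic component (output is ~0 up to rounding)
print(np.abs(np.convolve(s8, w8, mode="valid")).max())
print(np.abs(np.convolve(s3, w3, mode="valid")).max())
```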

Quiz: The AR(1) definitions and properties

Q2. Which of the following AR(1) processes are stable and therefore stationary?

  • Y_t = 0.9 Y_{t-1} + ε_t, ε_t ~ i.i.d. N(0, v)
  • Y_t = Y_{t-1} + ε_t, ε_t ~ i.i.d. N(0, v)
  • Y_t = -2 Y_{t-1} + ε_t, ε_t ~ i.i.d. N(0, v)
  • Y_t = -0.8 Y_{t-1} + ε_t, ε_t ~ i.i.d. N(0, v)
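For Q2, recall that an AR(1) Y_t = φ Y_{t-1} + ε_t is stable, and hence stationary, exactly when |φ| < 1. A minimal sketch that runs this check over the four coefficients listed above:

```python
# Stability check for the AR(1) options in Q2: Y_t = phi * Y_{t-1} + eps_t is stable iff |phi| < 1
for phi in (0.9, 1.0, -2.0, -0.8):
    status = "stable, hence stationary" if abs(phi) < 1 else "not stable"
    print(f"phi = {phi:+.1f}: {status}")
```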

Q3. Which of the statements below are true?

  • The ACF coefficients of an AR(1) with AR coefficient φ ∈ (-1, 1) and φ ≠ 0 are zero after lag 1
  • The PACF coefficients of an AR(1) with AR coefficient φ ∈ (-1, 1) and φ ≠ 0 are zero after lag 1
  • The ACF of an AR(1) with coefficient φ = 0.5 decays exponentially in an oscillatory manner
  • The ACF of an AR(1) with AR coefficient φ = 0.8 decays exponentially

Q4. Which of the following corresponds to the autocovariance function at lag h = 2, γ(2), of the autoregressive process Y_t = 0.7 Y_{t-1} + ε_t, ε_t ~ i.i.d. N(0, v), with v = 2?

  • 3.9216
  • 0.49
  • 1.9216
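For a stationary AR(1) with coefficient φ and innovation variance v, γ(0) = v/(1 - φ²) and γ(h) = φ^h γ(0). The small sketch below (the helper name ar1_autocov is just for illustration) plugs in the numbers for Q4; the same formula also gives γ(1) for Q6 further down.

```python
def ar1_autocov(phi, v, h):
    """gamma(h) of a stationary AR(1): gamma(0) = v / (1 - phi**2), gamma(h) = phi**h * gamma(0)."""
    return phi ** h * v / (1 - phi ** 2)

print(round(ar1_autocov(0.7, 2.0, 2), 4))  # Q4: gamma(2) ~ 1.9216
print(round(ar1_autocov(0.6, 2.0, 1), 4))  # Q6 below: gamma(1) = 1.875
```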

Q5. What is the PACF coefficient at lag 1 for the AR(1) process

y_t = -0.7 y_{t-1} + ε_t with ε_t ~ i.i.d. N(0, 1)?

  • 0.7
  • -0.7
  • ≈ 1.96
  • 0

Q6. What is the autocovariance function at lag 1, γ(1), of the AR(1) process

y_t = 0.6 y_{t-1} + ε_t, with ε_t ~ i.i.d. N(0, v) and v = 2?

  • 1.5625
  • 1.875
  • 1
  • 0.6

Q7. Consider an AR(1) process y_t = -0.5 y_{t-1} + ε_t, with ε_t ~ i.i.d. N(0, 1). Which of the following statements are true?

  • The autocovariance function of this process decays exponentially as a function of the lag h and is always negative
  • The autocovariance function of this process decays exponentially as a function of the lag h and is always positive
  • The PACF coefficient at lag 1, φ(1, 1), is equal to -0.5
  • The PACF coefficients for lags greater than 1 are zero
  • The PACF coefficient at lag 1, φ(1, 1), is equal to 0.5
  • The autocovariance function of this process decays exponentially as a function of the lag h, oscillating between negative and positive values
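One way to see the oscillating ACF and the lag-1 PACF cutoff behind Q5 and Q7 is to simulate a long AR(1) path with φ = -0.5 (the process in Q7) and inspect the sample ACF/PACF. The sketch below assumes NumPy and statsmodels are installed and uses an arbitrary seed.

```python
import numpy as np
from statsmodels.tsa.stattools import acf, pacf  # assumes statsmodels is available

# Simulate a long AR(1) path with phi = -0.5 and unit innovation variance
rng = np.random.default_rng(42)
phi, n = -0.5, 50_000
y = np.zeros(n)
for t in range(1, n):
    y[t] = phi * y[t - 1] + rng.normal()

# Sample ACF oscillates in sign and decays like (-0.5)^h; sample PACF cuts off after lag 1
print(np.round(acf(y, nlags=4), 2))   # close to [1, -0.5, 0.25, -0.12, 0.06]
print(np.round(pacf(y, nlags=4), 2))  # close to [1, -0.5, 0, 0, 0]
```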

Week 02: Properties of AR processes

Q1. Consider the following AR(2) process,

Y_t = 0.5 Y_{t-1} + 0.24 Y_{t-2} + ε_t, ε_t ~ N(0, v).

Give the value of one of the reciprocal roots of this process.
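For an AR(2) Y_t = φ1 Y_{t-1} + φ2 Y_{t-2} + ε_t, the reciprocal roots are the solutions of u² - φ1 u - φ2 = 0. A small sketch (the helper name ar2_reciprocal_roots is just for illustration) that computes them for Q1 and, in passing, for the process used in Q4 below:

```python
import numpy as np

def ar2_reciprocal_roots(phi1, phi2):
    """Reciprocal roots of an AR(2): the solutions of u**2 - phi1*u - phi2 = 0."""
    return np.roots([1.0, -phi1, -phi2])

print(ar2_reciprocal_roots(0.5, 0.24))  # 0.8 and -0.3 (Q1: either one is a valid answer)
print(ar2_reciprocal_roots(0.5, 0.36))  # 0.9 and -0.4 (these reappear in Q4 below)
```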

Q2. Assume the reciprocal roots of an AR(2) characteristic polynomial are 0.7 and -0.2.

Which is the corresponding form of the autocorrelation function ρ(h) of this process?

  • ρ(h) = (a + bh)(0.3)^h, h > 0, where a and b are some constants.
  • ρ(h) = a(0.7)^h + b(-0.3)^h, h > 0, where a and b are some constants.
  • ρ(h) = (a + bh)(0.7)^h, h > 0, where a and b are some constants.
  • ρ(h) = (a + bh)((0.3)^h + (0.7)^h), h > 0, where a and b are some constants.

Q3. Assume that an AR(2) process has a pair of complex reciprocal roots with modulus r = 0.95 and period λ = 7.1.

Which of the following options corresponds to the correct form of its autocorrelation function, ρ(h)?

  • ρ(h) = a(0.95)^h cos(7.1h + b), where a and b are some constants.
  • ρ(h) = a(0.95)^h cos(2πh/7.1 + b), where a and b are some constants.
  • ρ(h) = a(0.95)^h, h > 0, where a and b are some constants.
  • ρ(h) = (a + bh)(0.95)^h, where a and b are some constants.
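When an AR(2) has complex reciprocal roots r·e^{±2πi/λ}, the coefficients are φ1 = 2r cos(2π/λ) and φ2 = -r², and the ACF has the damped-cosine form a·r^h·cos(2πh/λ + b). The sketch below goes back and forth between (r, λ) = (0.95, 7.1) and the implied coefficients as a consistency check; the numbers are only those given in Q3.

```python
import numpy as np

# AR(2) coefficients implied by complex reciprocal roots r * exp(+/- 2*pi*i / lam)
r, lam = 0.95, 7.1
phi1 = 2 * r * np.cos(2 * np.pi / lam)   # ~ 1.20
phi2 = -r ** 2                           # -0.9025

# Recover the modulus and period from the reciprocal roots of u^2 - phi1*u - phi2 = 0
root = np.roots([1.0, -phi1, -phi2])[0]
print(round(abs(root), 3), round(2 * np.pi / abs(np.angle(root)), 3))  # 0.95 7.1
```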

Q4. Given the following AR(2) process,

Y_t = 0.5 Y_{t-1} + 0.36 Y_{t-2} + ε_t, ε_t ~ N(0, v).

The h = 3 steps-ahead forecast function f_t(3) has the following form:

  • f_t(3) = c_{1t}(1.1)^3 + c_{2t}(-2.5)^3, for c_{1t} and c_{2t} constants.
  • f_t(3) = (0.9)^3 (c_{1t} + 3 c_{2t}), for c_{1t} and c_{2t} constants.
  • f_t(3) = c_{1t}(3)^{0.9} + c_{2t}(3)^{-0.4}, for c_{1t} and c_{2t} constants.
  • f_t(3) = c_{1t}(0.9)^3 + c_{2t}(-0.4)^3, for c_{1t} and c_{2t} constants.

Week 03: Practice Quiz: The Normal Dynamic Linear Model

Q1. Which of the models below is a Normal Dynamic Linear Model?

  • Observation equation: y_t = a θ_t² + ε_t, ε_t ~ N(0, v); system equation: θ_t = b θ_{t-1} + c θ_{t-1}/(1 + θ_{t-1}²) + ω_t, ω_t ~ N(0, w).
  • Observation equation: y_t = μ_t + ε_t, ε_t ~ N(0, v); system equation: μ_t = μ_{t-1} + ω_t, ω_t ~ N(0, w).
  • Observation equation: y_t = θ_t + ε_t, ε_t ~ N(0, v); system equation: θ_t = b θ_{t-1} + c θ_{t-1}/(1 + θ_{t-1}²) + ω_t, ω_t ~ N(0, w).
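Of the three models in Q1, only the second (y_t = μ_t + ε_t with μ_t = μ_{t-1} + ω_t) is linear in the state with normal errors, i.e. a Normal DLM; the other two involve a quadratic observation term or a non-linear system equation. A minimal simulation sketch of that first-order polynomial NDLM, with made-up variances v and w:

```python
import numpy as np

# Simulate the first-order polynomial NDLM:  y_t = mu_t + eps_t,  mu_t = mu_{t-1} + omega_t
rng = np.random.default_rng(1)
T, v, w = 100, 1.0, 0.1                               # hypothetical observation/system variances
mu = np.cumsum(rng.normal(0.0, np.sqrt(w), size=T))   # random-walk state (system equation)
y = mu + rng.normal(0.0, np.sqrt(v), size=T)          # noisy observations (observation equation)
print(np.round(y[:5], 2))
```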

Q2. Consider the Normal Dynamic Linear Model M: {F_t, G_t, ·, ·}, for t = 1, …, T. Let’s assume F_t is a K × 1 vector. What is the dimension of G_t?

  • T × 1
  • T × T
  • K × K
  • K × 1

Q3. Consider the third-order polynomial Normal Dynamic Linear Model M: {F, G, ·, ·}, where F = (1, 0, 0)' and G = J_3(1), where J_3(1) is the Jordan block given by

J_3(1) =
( 1 1 0 )
( 0 1 1 )
( 0 0 1 )

Given the posterior mean E(θ_t | D_t) = (m_t, b_t, g_t)', which of the following options is the one corresponding to the forecast function f_t(h) (h ≥ 0) of the model?

  • f_t(h) = m_t + h b_t + h(h - 1) g_t / 2
  • f_t(h) = m_t + h b_t
  • f_t(h) = m_t + h b_t + h(h + 1) g_t
  • f_t(h) = m_t + h b_t + h² g_t
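The forecast function of this model is f_t(h) = F' G^h E(θ_t | D_t), and for G = J_3(1) this reduces to the quadratic form m_t + h b_t + h(h - 1) g_t / 2. A short numeric check with hypothetical values for (m_t, b_t, g_t):

```python
import numpy as np

F = np.array([1.0, 0.0, 0.0])
G = np.array([[1.0, 1.0, 0.0],
              [0.0, 1.0, 1.0],
              [0.0, 0.0, 1.0]])     # the Jordan block J_3(1)
m, b, g = 2.0, 0.5, -0.3            # hypothetical posterior mean (m_t, b_t, g_t)'

for h in range(5):
    f_h = F @ np.linalg.matrix_power(G, h) @ np.array([m, b, g])
    print(h, f_h, m + h * b + h * (h - 1) * g / 2)   # the two values agree for every h
```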

Week 04: Quiz: Seasonal Models and Superposition

Q2. Assume monthly data have an annual cycle and so the fundamental period is p = 12. Further assume that we want to fit a model with a linear trend and seasonal component to this dataset. For the seasonal component, assume we only consider the fourth harmonic, i.e., we only consider the Fourier component for the frequency ω = 2π·4/12 = 2π/3. What is the forecast function f_t(h), h ≥ 0, for a DLM with this linear trend and a seasonal component that considers only the fourth harmonic?

  • f_t(h) = a_{t,0} + a_{t,1} h
  • f_t(h) = a_{t,0} + a_{t,1} h + a_{t,3} cos(2πh/3) + a_{t,4} sin(2πh/3)
  • f_t(h) = a_{t,1} cos(2πh/3) + a_{t,2} sin(2πh/3)
  • f_t(h) = a_{t,0} + a_{t,1} h + a_{t,3} cos(2πh/3) + a_{t,4} sin(2πh/3) + a_{t,5}(-1)^h
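The seasonal term in that forecast function comes from the harmonic DLM block for ω = 2π·4/12 = 2π/3, with F = (1, 0)' and G a rotation by ω, so that F' G^h θ_t = θ_{t,1} cos(ωh) + θ_{t,2} sin(ωh); adding the linear-trend block gives the full form above. A small check with a made-up state vector:

```python
import numpy as np

w = 2 * np.pi * 4 / 12                    # fourth harmonic of p = 12: w = 2*pi/3
G = np.array([[np.cos(w),  np.sin(w)],
              [-np.sin(w), np.cos(w)]])   # harmonic (rotation) block of the DLM
F = np.array([1.0, 0.0])
theta = np.array([1.2, -0.7])             # hypothetical seasonal state at time t

for h in range(4):
    via_dlm = F @ np.linalg.matrix_power(G, h) @ theta
    closed_form = theta[0] * np.cos(w * h) + theta[1] * np.sin(w * h)
    print(h, round(via_dlm, 3), round(closed_form, 3))   # identical: the cos/sin term in f_t(h)
```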

Quiz: NDLM, Part II

Q1. Consider a full seasonal Fourier DLM with a fundamental period p = 10. What is the dimension of the state vector θ_t at each time t?

  • None of the above
  • 10
  • 9
  • 11
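A full Fourier representation of a seasonal component with fundamental period p uses ⌊(p - 1)/2⌋ harmonics of two components each, plus one Nyquist component when p is even, for p - 1 state components in total; with p = 10 that is 4·2 + 1 = 9. A tiny sketch of this count (the helper name fourier_state_dim is just for illustration):

```python
def fourier_state_dim(p):
    """State dimension of a full seasonal Fourier DLM with fundamental period p (equals p - 1)."""
    pairs = (p - 1) // 2                 # harmonics represented by (cos, sin) pairs
    nyquist = 1 if p % 2 == 0 else 0     # one extra component at the Nyquist frequency when p is even
    return 2 * pairs + nyquist

print(fourier_state_dim(10))             # 9
```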
Get All Course Quiz Answers of Bayesian Statistics Specialization

Bayesian Statistics: From Concept to Data Analysis Quiz Answers

Bayesian Statistics: Techniques and Models Quiz Answers

Bayesian Statistics: Mixture Models Coursera Quiz Answers

Bayesian Statistics: Time Series Analysis Quiz Answers

Team Networking Funda

We are Team Networking Funda, a group of passionate authors and networking enthusiasts committed to sharing our expertise and experiences in networking and team building. With backgrounds in Data Science, Information Technology, Health, and Business Marketing, we bring diverse perspectives and insights to help you navigate the challenges and opportunities of professional networking and teamwork.
