Topic 1. Covariance Stationarity Conditions
Topic 2. Autocovariance and Autocorrelation Functions
Topic 3. White Noise
Topic 4. Time Series Forecasting
For a time series to be covariance stationary, it must exhibit three properties: a stable (constant) mean, a finite variance, and a stable covariance structure.
What is Covariance Structure?
The covariance structure of a series is the set of its autocovariances at all displacements (lags); covariance stationarity requires these autocovariances to depend only on the displacement between observations, not on time.
Q1. The conditions for a time series to exhibit covariance stationarity are least likely to include:
A. a stable mean.
B. a finite variance.
C. a finite number of observations.
D. autocovariances that do not depend on time.
Explanation: C is correct.
In theory, a time series can be infinite in length and still be covariance stationary. To be covariance stationary, a time series must have a stable mean, a stable covariance structure (i.e., autocovariances depend only on displacement, not on time), and a finite variance.
Autocovariance Function:
Autocorrelation Function (ACF):
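In brief, for a covariance stationary series y_t with mean μ and displacement (lag) τ, these two functions are defined as follows:

```latex
% Autocovariance at displacement \tau:
\gamma(\tau) = \operatorname{Cov}(y_t, y_{t-\tau}) = E\left[(y_t - \mu)(y_{t-\tau} - \mu)\right]

% Autocorrelation at displacement \tau (the autocovariance scaled by the variance):
\rho(\tau) = \frac{\gamma(\tau)}{\gamma(0)}, \qquad \rho(0) = 1, \qquad -1 \le \rho(\tau) \le 1
```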
Q2. As the number of lags or displacements becomes large, autocorrelation functions (ACFs) will approach:
A. −1.
B. 0.
C. 0.5.
D. +1.
Explanation: B is correct.
One feature that all ACFs have in common is that autocorrelations approach zero as the number of lags or displacements gets large.
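To make this concrete, the ACF of a covariance stationary AR(1) process decays geometrically, so the autocorrelations shrink toward zero as the displacement grows:

```latex
% For a stationary AR(1) process y_t = \phi y_{t-1} + \varepsilon_t with |\phi| < 1:
\rho(\tau) = \phi^{\tau} \longrightarrow 0 \quad \text{as } \tau \to \infty
% e.g., with \phi = 0.75: \rho(1) = 0.75, \; \rho(2) = 0.5625, \; \rho(10) \approx 0.0563
```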
Q3. Which of the following statements about white noise is most accurate?
A. All serially uncorrelated processes are white noise.
B. All Gaussian white noise processes are independent white noise.
C. All independent white noise processes are Gaussian white noise.
D. All serially correlated Gaussian processes are independent white noise.
Explanation: B is correct.
If a white noise process is Gaussian (i.e., normally distributed), it follows that the process is independent white noise. However, the reverse is not true; there can be independent white noise processes that are not normally distributed. Only those serially uncorrelated processes that have a zero mean and constant variance are white noise.
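The hierarchy in this question can be summarized compactly:

```latex
% White noise: zero mean, constant (finite) variance, and no serial correlation
\varepsilon_t \sim WN(0, \sigma^2): \quad E[\varepsilon_t] = 0, \quad
\operatorname{Var}(\varepsilon_t) = \sigma^2, \quad \gamma(\tau) = 0 \;\; \forall \tau \neq 0

% Each property below is strictly stronger than the one it implies:
\text{Gaussian white noise} \implies \text{independent white noise} \implies \text{white noise}
```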
Topic 1. Autoregressive Processes
Topic 2. Estimating Autoregressive Parameters using Yule-Walker Equation
Topic 3. Moving Average (MA) Processes
Topic 4. Properties of Moving Average (MA) Processes
Topic 5. Lag Operators
Q1. Which of the following conditions is necessary for an autoregressive (AR) process to be covariance stationary?
A. The value of the lag slope coefficients should add to 1.
B. The value of the lag slope coefficients should all be less than 1.
C. The absolute value of the lag slope coefficients should be less than 1.
D. The sum of the lag slope coefficients should be less than 1.
Explanation: D is correct.
In order for an AR process to be covariance stationary, the sum of the lag slope coefficients must be less than 1.
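As a numerical sketch of this condition (assuming NumPy is available; the function name is_stationary and the example coefficients are illustrative): the full stationarity condition is that all roots of the lag polynomial 1 − φ₁z − … − φ_p z^p lie outside the unit circle, which implies the sum-of-coefficients check cited above.

```python
import numpy as np

def is_stationary(phis):
    """Check covariance stationarity of an AR(p) process
    y_t = phi_1 * y_{t-1} + ... + phi_p * y_{t-p} + e_t
    via the roots of its lag polynomial 1 - phi_1*z - ... - phi_p*z^p."""
    # np.roots expects coefficients from the highest power down:
    # (-phi_p) z^p + ... + (-phi_1) z + 1
    coeffs = [-p for p in reversed(phis)] + [1.0]
    roots = np.roots(coeffs)
    return bool(np.all(np.abs(roots) > 1.0))  # all roots outside the unit circle

print(is_stationary([0.5, 0.3]))  # True:  sum = 0.8 < 1, roots outside unit circle
print(is_stationary([0.9, 0.2]))  # False: sum = 1.1, one root inside unit circle
```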
Q2. Which of the following statements is a key differentiator between a moving average (MA) representation and an autoregressive (AR) process?
A. An MA representation shows evidence of autocorrelation cutoff.
B. An AR process shows evidence of autocorrelation cutoff.
C. An unadjusted MA process shows evidence of gradual autocorrelation decay.
D. An AR process is never covariance stationary.
Explanation: A is correct.
A key difference between an MA representation and an AR process is that the MA process shows autocorrelation cutoff while an AR process shows a gradual decay in autocorrelations.
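A minimal simulation sketch of this contrast, assuming NumPy is available (the coefficient 0.8, series length, and seed are illustrative choices):

```python
import numpy as np

rng = np.random.default_rng(0)
T = 10_000
eps = rng.standard_normal(T)

# MA(1): y_t = e_t + 0.8 * e_{t-1}  -> autocorrelations cut off after lag 1
ma = eps[1:] + 0.8 * eps[:-1]

# AR(1): y_t = 0.8 * y_{t-1} + e_t  -> autocorrelations decay gradually (~0.8^tau)
ar = np.zeros(T)
for t in range(1, T):
    ar[t] = 0.8 * ar[t - 1] + eps[t]

def sample_acf(x, max_lag):
    """Sample autocorrelations rho_hat(1), ..., rho_hat(max_lag)."""
    x = x - x.mean()
    gamma0 = np.dot(x, x) / len(x)
    return [round(float(np.dot(x[k:], x[:-k]) / len(x) / gamma0), 3)
            for k in range(1, max_lag + 1)]

print("MA(1):", sample_acf(ma, 5))  # roughly [0.49, 0, 0, 0, 0]     -- sharp cutoff
print("AR(1):", sample_acf(ar, 5))  # roughly [0.8, 0.64, 0.51, ...] -- gradual decay
```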
Q3. Assume in an autoregressive [AR(1)] process that the coefficient for the lagged observation of the variable being estimated is equal to 0.75. According to the Yule-Walker equation, what is the second-period autocorrelation?
A. 0.375.
B. 0.5625.
C. 0.75.
D. 0.866.
Explanation: B is correct.
The coefficient is equal to 0.75. From the Yule-Walker equation, the autocorrelations of an AR(1) process satisfy ρ_τ = φ^τ, so the first-period autocorrelation is ρ_1 = φ = 0.75 and the second-period autocorrelation is ρ_2 = φ² = 0.75² = 0.5625.
Q4. Which of the following statements is most likely a purpose of the lag operator?
A. A lag operator ensures that the parameter estimates are consistent.
B. An autoregressive (AR) process is covariance stationary only if its lag polynomial is invertible.
C. Lag polynomials can be multiplied.
D. A lag operator ensures that the parameter estimates are unbiased.
Explanation: C is correct.
The lag operator lets time series models be written compactly as lag polynomials, and these lag polynomials can be multiplied together like ordinary polynomials, which makes AR, MA, and ARMA representations easy to manipulate and combine. A lag operator does not by itself ensure consistent or unbiased parameter estimates, and invertibility of the lag polynomial is the condition for rewriting an MA process as an AR process, not the condition for an AR process to be covariance stationary.
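To see what it means for lag polynomials to multiply, write the lag operator L as a backward shift; polynomials in L then behave like ordinary polynomials (the coefficients 0.5 and 0.2 below are illustrative):

```latex
% The lag operator shifts a series back one period:
L y_t = y_{t-1}, \qquad L^2 y_t = y_{t-2}, \qquad L^k y_t = y_{t-k}

% Lag polynomials multiply like ordinary polynomials, e.g.:
(1 - 0.5L)(1 + 0.2L)\, y_t = (1 - 0.3L - 0.1L^2)\, y_t
= y_t - 0.3\, y_{t-1} - 0.1\, y_{t-2}
```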
Topic 1. Autoregressive Moving Average (ARMA) Processes
Topic 2. Application of AR, MA, and ARMA Processes
Topic 3. Sample and Partial Autocorrelations
Topic 4. Testing Autocorrelations
Topic 5. Modeling Seasonality in an ARMA
Q1. Which of the following statements about an autoregressive moving average (ARMA) process is correct?
I. It involves autocorrelations that decay gradually.
II. It combines the lagged unobservable random shock of the MA process with the observed lagged time series of the AR process.
A. I only.
B. II only.
C. Both I and II.
D. Neither I nor II.
Explanation: C is correct.
The ARMA process is important because its autocorrelations decay gradually and because it captures a more robust picture of a variable being estimated by including both lagged random shocks and lagged observations of the variable being estimated. The ARMA model merges the lagged random shocks from the MA process and the lagged time series variables from the AR process.
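For concreteness, the simplest case, an ARMA(1,1), combines both ingredients in a single equation:

```latex
% ARMA(1,1): one lagged observation (AR part) plus one lagged random shock (MA part)
y_t = \phi\, y_{t-1} + \varepsilon_t + \theta\, \varepsilon_{t-1},
\qquad \varepsilon_t \sim WN(0, \sigma^2)
% Covariance stationary when |\phi| < 1, just as for a pure AR(1) process
```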
Q2. Which of the following statements is correct regarding the usefulness of an autoregressive (AR) process and an autoregressive moving average (ARMA) process when modeling seasonal data?
I. They both include lagged terms and, therefore, can better capture a relationship in motion.
II. They both specialize in capturing only the random movements in time series data.
A. I only.
B. II only.
C. Both I and II.
D. Neither I nor II.
Explanation: A is correct.
Both AR models and ARMA models are good at forecasting with seasonal patterns because they both involve lagged observable variables, which are best for capturing a relationship in motion. It is the moving average representation that is best at capturing only random movements.
Q3. To test the hypothesis that the autocorrelations of a time series are jointly equal to zero based on a small sample, an analyst should most appropriately calculate:
A. a Ljung-Box (LB) Q-statistic.
B. a Box-Pierce (BP) Q-statistic.
C. either a Ljung-Box (LB) or a Box-Pierce (BP) Q-statistic.
D. neither a Ljung-Box (LB) nor a Box-Pierce (BP) Q-statistic.
Explanation: A is correct.
The LB Q-statistic is appropriate for testing this hypothesis based on a small sample because it applies a finite-sample correction to the Box-Pierce statistic; the BP Q-statistic is reliable only in large samples. See the formulas below.
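The two statistics differ only in how the squared sample autocorrelations are weighted:

```latex
% Box-Pierce and Ljung-Box Q-statistics for the first m sample autocorrelations:
Q_{BP} = T \sum_{\tau=1}^{m} \hat{\rho}^2(\tau)
\qquad\qquad
Q_{LB} = T(T+2) \sum_{\tau=1}^{m} \frac{\hat{\rho}^2(\tau)}{T - \tau}

% Under H_0 (the first m autocorrelations are jointly zero), both statistics are
% asymptotically chi-squared with m degrees of freedom; the (T+2)/(T-\tau)
% weighting gives the LB statistic better small-sample properties.
```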