Week 4 - Time Series
4.3 - Time Series Concepts
Time Series Processes
Stochastic (Random) Process: \(\{\dots,Y_1,Y_2,\dots,Y_t,Y_{t+1},\dots\} = \{Y_t\}_{t=-\infty}^\infty\) is a sequence of random variables indexed by time.
Stationary Processes
A stochastic process \(\{Y_t\}_{t=1}^\infty\) is strictly stationary if, for any finite integer r and any set of subscripts \(t_1,t_2,\dots,t_r\), the joint distribution of \((Y_t,Y_{t_1},Y_{t_2},\dots,Y_{t_r})\) depends only on \(t_1 - t, t_2-t, \dots, t_r -t\) and not on t itself.
Covariance (Weakly) Stationary Processes \(\{Y_t\}\)
\(E[Y_t] = \mu\) for all t
\(Var(Y_t) = \sigma^2\) for all t
\(Cov(Y_t,Y_{t-j}) = \gamma_j\) depends only on the lag j and not on t. \(\gamma_j\) is called the lag-j autocovariance.
4.4 - Autocorrelation
Under the assumption of covariance stationarity: \(Corr(Y_t,Y_{t-j}) = \rho_j = \frac {Cov(Y_t,Y_{t-j})}{\sqrt{Var(Y_t)\, Var(Y_{t-j})}} = \frac {\gamma_j}{\sigma^2} \)
The autocorrelation function (ACF) is the plot of \(\rho_j\) against j.
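The ACF can be estimated directly from data. A minimal sketch (the helper name `sample_acf` is mine, not from the notes):

```python
import numpy as np

def sample_acf(y, max_lag):
    """Sample autocorrelations rho_1, ..., rho_max_lag of a 1-D series y."""
    y = np.asarray(y, dtype=float)
    dev = y - y.mean()
    denom = np.sum(dev ** 2)
    return np.array([np.sum(dev[j:] * dev[:-j]) / denom
                     for j in range(1, max_lag + 1)])

# For white noise, every rho_j should be close to zero.
rng = np.random.default_rng(0)
print(sample_acf(rng.standard_normal(5000), 5))
```

For a persistent series (e.g. a random walk) the same estimator returns values near 1 at low lags, which is how the ACF plot distinguishes the processes discussed below.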
4.5 - White Noise Processes
Gaussian White Noise Process
\(Y_t \sim iid~ N(0,\sigma^2)\) or \(Y_t \sim GWN(0,\sigma^2)\)
\(E[Y_t] = 0, ~ Var(Y_t)=\sigma^2\)
\(Y_t\) is independent of \(Y_s\) for \(t \neq s\) ⇒ \(cov(Y_t,Y_{t-s})=0\) for \(t \neq s\)
Independent White Noise Process
\(Y_t \sim iid~ (0,\sigma^2)\) or \(Y_t \sim IWN(0,\sigma^2)\)
\(E[Y_t] = 0, ~ Var(Y_t)=\sigma^2\)
\(Y_t\) is independent of \(Y_s\) for \(t \neq s\)
Weak White Noise Process
\(Y_t \sim WN(0,\sigma^2)\)
\(E[Y_t] = 0, ~ Var(Y_t)=\sigma^2\)
\(cov(Y_t,Y_{t-s})=0\) for \(t \neq s\)
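The three defining properties are easy to check by simulation. A sketch for the Gaussian case (the seed and \(\sigma\) are illustrative):

```python
import numpy as np

# Simulate a Gaussian white noise process Y_t ~ iid N(0, sigma^2)
rng = np.random.default_rng(42)
sigma = 2.0
y = rng.normal(0.0, sigma, 100_000)

print(y.mean())                     # ~ 0
print(y.var())                      # ~ sigma^2 = 4
print(np.cov(y[1:], y[:-1])[0, 1])  # lag-1 autocovariance ~ 0
```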
4.6 - Nonstationary Processes
Deterministically Trending Process
\(Y_t = \beta_0 + \beta_1 t + \epsilon_t, ~~ \epsilon_t \sim WN(0,\sigma^2)\)
\(E[Y_t] = \beta_0 + \beta_1 t\), depends on t.
Subtracting the deterministic trend gives back the stationary white noise:
\[X_t = Y_t - \beta_0 - \beta_1 t = \epsilon_t\]
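In practice \(\beta_0\) and \(\beta_1\) are unknown, so the trend is estimated first (here by OLS) and then removed. A sketch with illustrative parameter values:

```python
import numpy as np

rng = np.random.default_rng(1)
n = 2000
t = np.arange(n)
beta0, beta1, sigma = 2.0, 0.5, 1.0          # illustrative true values
y = beta0 + beta1 * t + rng.normal(0, sigma, n)

# Estimate the linear trend by least squares and subtract it.
b1, b0 = np.polyfit(t, y, 1)                 # returns [slope, intercept]
x = y - (b0 + b1 * t)                        # detrended series ~ WN(0, sigma^2)
```

The residual series `x` has mean zero by construction and variance close to \(\sigma^2\), i.e. it behaves like the white noise \(\epsilon_t\).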
Random Walk
\(Y_t = Y_{t-1} + \epsilon_t, ~~ \epsilon_t \sim WN(0,\sigma_\epsilon^2), ~~ Y_0\) is fixed.
Then: \(Y_t = Y_0 + \sum\limits_{j=1}^t \epsilon_j\) ⇒ \(Var(Y_t) = \sigma_\epsilon^2 \times t\) depends on t.
First differencing removes the stochastic trend and gives back the white noise:
\[\Delta Y_t = Y_t - Y_{t-1} = \epsilon_t\]
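Both facts can be seen by simulating many random-walk paths: the variance across paths at time t grows like \(\sigma_\epsilon^2 \times t\), while the first differences are white noise again (parameter choices are illustrative):

```python
import numpy as np

rng = np.random.default_rng(2)
sigma_eps = 1.0
n_paths, n_steps = 5000, 100

# Y_t = Y_0 + sum of shocks; take Y_0 = 0, so Y_t = cumulative sum of eps.
eps = rng.normal(0, sigma_eps, (n_paths, n_steps))
paths = eps.cumsum(axis=1)

var_t = paths.var(axis=0)       # across paths: Var(Y_t) ~ sigma_eps^2 * t
diffs = np.diff(paths, axis=1)  # Delta Y_t = eps_t, white noise again

print(var_t[9], var_t[99])      # roughly 10 and 100
```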
4.7 - Moving Average Processes
MA(1) Model
\(Y_t = \mu + \epsilon_t + \theta \epsilon_{t-1}, ~~ -\infty \lt \theta \lt \infty, ~~ \epsilon_t \sim iid~ N(0,\sigma^2)\)
⇒ MA(1) is covariance stationary.
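The stationarity claim can be verified directly: each moment of the MA(1) is free of t:
\[E[Y_t] = \mu\]
\[Var(Y_t) = Var(\epsilon_t + \theta \epsilon_{t-1}) = (1+\theta^2)\sigma^2\]
\[\gamma_1 = Cov(Y_t, Y_{t-1}) = \theta \sigma^2, ~~ \rho_1 = \frac{\theta}{1+\theta^2}\]
\[\gamma_j = 0 ~~ \text{for} ~ j \geq 2\]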
Example
\(r_t \sim iid ~ N(\mu_r,\sigma_r^2)\)
Consider the two-month return \(r_t(2) = r_t + r_{t-1}\).
Consecutive two-month returns overlap by one month:
\[r_t(2) = r_t + r_{t-1}\]
\[r_{t-1}(2) = r_{t-1} + r_{t-2}\]
\[r_{t-2}(2) = r_{t-2} + r_{t-3}\]
⇒ Then \(\{r_t(2)\}\) follows an MA(1) process.
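To see why, compute the autocovariances: adjacent two-month returns share exactly one monthly return, while returns two or more months apart share none:
\[Var(r_t(2)) = 2\sigma_r^2\]
\[Cov(r_t(2), r_{t-1}(2)) = Cov(r_t + r_{t-1}, ~ r_{t-1} + r_{t-2}) = Var(r_{t-1}) = \sigma_r^2\]
\[Cov(r_t(2), r_{t-j}(2)) = 0 ~~ \text{for} ~ j \geq 2\]
So \(\rho_1 = 1/2\) and \(\rho_j = 0\) for \(j \geq 2\): the ACF cuts off after lag 1, exactly the MA(1) pattern.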
4.8 - Autoregressive Processes Part 1
\(Y_t - \mu = \phi (Y_{t-1} - \mu) + \epsilon_t, ~~ -1 \lt \phi \lt 1, ~~ \epsilon_t \sim iid ~N(0,\sigma_\epsilon^2)\)
⇒ AR(1) is covariance stationary provided \(-1 \lt \phi \lt 1\).
4.9 - Autoregressive Processes Part 2
\[E[Y_t] = \mu\]
\[Var(Y_t) = \sigma^2 = \frac {\sigma_\epsilon^2}{1- \phi^2}\]
\[Cov(Y_t,Y_{t-1}) = \gamma_1 = \sigma^2 \phi\]
\[Corr(Y_t,Y_{t-1}) = \rho_1 = \frac{\gamma_1}{\sigma^2} = \phi\]
\[Cov(Y_t,Y_{t-j}) = \gamma_j = \sigma^2 \phi^j\]
\[Corr(Y_t,Y_{t-j}) = \rho_j = \frac{\gamma_j}{\sigma^2} = \phi^j\]
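A simulation sketch checking these AR(1) moments against their sample counterparts (\(\phi\), \(\mu\), and the burn-in length are illustrative choices):

```python
import numpy as np

rng = np.random.default_rng(3)
phi, mu, sigma_eps = 0.7, 0.0, 1.0        # illustrative values
n, burn = 100_000, 500

# Simulate Y_t - mu = phi (Y_{t-1} - mu) + eps_t
eps = rng.normal(0, sigma_eps, n + burn)
y = np.empty(n + burn)
y[0] = mu
for t in range(1, n + burn):
    y[t] = mu + phi * (y[t - 1] - mu) + eps[t]
y = y[burn:]                              # drop burn-in so the series is ~stationary

var_theory = sigma_eps**2 / (1 - phi**2)    # ~ 1.96
rho1 = np.corrcoef(y[1:], y[:-1])[0, 1]     # ~ phi   = 0.7
rho2 = np.corrcoef(y[2:], y[:-2])[0, 1]     # ~ phi^2 = 0.49
```

The geometric decay \(\rho_j = \phi^j\) is the signature of the AR(1) in an ACF plot, in contrast to the MA(1), whose ACF cuts off after lag 1.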