From f98a3d67dc4787cbd9018b5101d7d67f9a353f9a Mon Sep 17 00:00:00 2001
From: Isabel Slingerland
Date: Thu, 28 Nov 2024 10:08:12 +0100
Subject: [PATCH] added reference links

---
 book/time_series/acf.md       | 7 +++++--
 book/time_series/modelling.md | 4 ++--
 book/time_series/noise.ipynb  | 4 ++--
 3 files changed, 9 insertions(+), 6 deletions(-)

diff --git a/book/time_series/acf.md b/book/time_series/acf.md
index 3cd7a62..f33e1fb 100644
--- a/book/time_series/acf.md
+++ b/book/time_series/acf.md
@@ -3,7 +3,7 @@
 
 Before we can look into the modelling of a stochastic process using an Autoregressive (AR) model, we first need to introduce the autocovariance function (ACF) for a stationary time series, and describe the relationship between ACF and a power spectral density (PSD).
 
-As in the Chapter on #TODO (add reference to obs theory), the variance component is often determined based on the precision of an observation (at a given epoch), and the covariance components quantitatively indicate the statistical dependence (or independence) between observations. In this case, dependence is inherently introduced by the physical processes that produce the signal (of which our time series is a sample), and in fact our time series methods seek to (mathematically) account for this.
+As in [Observation theory](../observation_theory/01_Introduction.md), the variance component is often determined based on the precision of an observation (at a given epoch), and the covariance components quantitatively indicate the statistical dependence (or independence) between observations. In this case, dependence is inherently introduced by the physical processes that produce the signal (of which our time series is a sample), and in fact our time series methods seek to (mathematically) account for this.
 
 ## Autocovariance and autocorrelation
@@ -51,7 +51,10 @@ Prove that $Cov(S_t, S_{t-\tau}) =Cov(S_t, S_{t+\tau})$:
 :class: tip, dropdown
 
 From the definition of covariance, we know that
-$$ Cov(a,b) = Cov(b,a)$$
+
+$$
+Cov(a,b) = Cov(b,a)
+$$
 
 Hence, we have that
 
diff --git a/book/time_series/modelling.md b/book/time_series/modelling.md
index c391c6f..2886812 100644
--- a/book/time_series/modelling.md
+++ b/book/time_series/modelling.md
@@ -27,11 +27,11 @@ Recall that the BLUE of $\mathrm{x}$ is:
 
 $$\hat{X}=(\mathrm{A}^T\Sigma_{Y}^{-1}\mathrm{A})^{-1}\mathrm{A}^T\Sigma_{Y}^{-1}Y,\hspace{10px}\Sigma_{\hat{X}}=(\mathrm{A}^T\Sigma_{Y}^{-1}\mathrm{A})^{-1}$$
 
-The BLUE of $Y$ and $\epsilon$ is
+The BLUE of $Y$ is:
 
 $$\hat{Y}=\mathrm{A}\hat{X},\hspace{10px}\Sigma_{\hat{Y}}=\mathrm{A}\Sigma_{\hat{X}}\mathrm{A}^T$$
 
-and
+and $\epsilon$ is:
 
 $$\hat{\epsilon}=Y-\hat{Y},\hspace{10px}\Sigma_{\hat{\epsilon}}=\Sigma_{Y}-\Sigma_{\hat{Y}}$$
 
diff --git a/book/time_series/noise.ipynb b/book/time_series/noise.ipynb
index ba321f6..d8296c5 100644
--- a/book/time_series/noise.ipynb
+++ b/book/time_series/noise.ipynb
@@ -24,7 +24,7 @@
     "\n",
     "* **Signal** - the meaningful information that we want to detect: deterministic characteristics by means of mathematical expressions to capture for example trend, seasonality and offsets.\n",
     "\n",
-    "* **Noise** - random and undesired fluctuation that interferes with the signal: stochastic process are needed to describe this. Parts of the time-correlated noise needs to be accounted for in prediction, see later {ref}`AR`. \n",
+    "* **Noise** - random and undesired fluctuations that interfere with the signal: stochastic processes are needed to describe this. Parts of the time-correlated noise need to be accounted for in predictions, see later {ref}`AR`. \n",
     "\n",
     "The example in {numref}`signal_noise` shows that the *signal* can be described by $\cos(2\pi t f) + \sin(2\pi t f)$. The stochastic model (assuming independent normally distributed observations) would be a scaled identity matrix with variance equal to 1 (middle panel) and 9 (bottom panel), respectively. The signal of interest has been entirely hidden in the background noise in the bottom panel. Techniques from signal processing can be used to detect the frequency.\n",
     "\n",
@@ -61,7 +61,7 @@
     "Most notable, all observations are uncorrelated (off-diagonal elements of the covariance matrix are equal to 0). When we compute the PSD, the resulting density will be flat over the entire range of frequencies. In other words, a white noise process has equal energy over all frequencies, just like white light. We will show this in the interactive plot at the bottom of this page.\n",
     "\n",
     "### Colored noise\n",
-    "In time series it is not guarantied that the individual observations are uncorrelated. At the bottom of this page you will find an interactive plot. You can select four different types of noise: white, pink, red and blue. The noise processes are plotted in combination with the PSD. The PSD #TODO(add ref to psd)is a measure of the power of the signal at different frequencies. The white noise process has a flat PSD, while the other noise processes have a different shape. The pink noise process has a PSD that decreases with frequency, the red noise process has a PSD that decreases quadratically with frequency, and the blue noise process has a PSD that increases with frequency. \n",
+    "In time series it is not guaranteed that the individual observations are uncorrelated. At the bottom of this page you will find an interactive plot. You can select four different types of noise: white, pink, red and blue. The noise processes are plotted in combination with the PSD. The [PSD](../signal/spectral_est.md#power-spectral-density-psd) is a measure of the power of the signal at different frequencies. The white noise process has a flat PSD, while the other noise processes have a different shape. The pink noise process has a PSD that decreases with frequency, the red noise process has a PSD that decreases quadratically with frequency, and the blue noise process has a PSD that increases with frequency. \n",
     "\n"
    ]
   },
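
Reviewer note (not part of the patch): the white vs. colored noise claim in the noise.ipynb text can be checked numerically. The sketch below is illustrative only; the `periodogram` helper, the seed, and the series length are my own choices and are a stand-in for the interactive PSD plot the chapter refers to, not the book's actual estimator.

```python
import numpy as np

rng = np.random.default_rng(42)
n = 4096
white = rng.standard_normal(n)  # uncorrelated samples: flat PSD
red = np.cumsum(white)          # random walk (red noise): PSD falls off ~1/f^2

def periodogram(x):
    """Simple one-sided periodogram as a rough PSD estimate."""
    x = x - x.mean()
    spec = np.abs(np.fft.rfft(x)) ** 2 / len(x)
    return spec[1:]  # drop the zero-frequency bin

psd_white = periodogram(white)
psd_red = periodogram(red)

half = len(psd_white) // 2
# White noise: comparable power in the low- and high-frequency halves.
ratio_white = psd_white[:half].sum() / psd_white[half:].sum()
# Red noise: power heavily concentrated at low frequencies.
ratio_red = psd_red[:half].sum() / psd_red[half:].sum()
```

For the white series the low/high power ratio stays near 1, while for the red series it is orders of magnitude larger, matching the flat vs. quadratically decreasing PSDs described in the Colored noise section.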