> [!example] Learning goals
> 1. What is Reynolds decomposition?
> 2. What is the difference between the ensemble average, spatial average, and time average?
> 3. Under which conditions can we use time and/or space averages?
> 4. What is ergodicity?
## Means and perturbations
Reynolds decomposition is the technique of splitting an arbitrary variable $a$ that varies in time and space as
$$
a(x, y, z, t) = \overline{a}(x, y, z, t) + a^\prime(x, y, z, t)
$$
^eq-mean-ensemble
where $\overline{a}$ is the mean and $a^\prime$ is the perturbation, i.e., the deviation from the mean. The idea behind this decomposition is that the fluctuations due to turbulence are contained in the perturbation, while the slower variations are captured by the mean. The validity of this separation depends on the existence of a spectral gap.
Theoretically, the mean $\overline{a}$ is computed from an infinite number of realizations of the same situation. It then represents the slower variations, while $a^\prime$ quantifies the chaotic, turbulence-driven fluctuations. This average is the **ensemble average**; it remains a function of space and time and can therefore be computed in any transient and spatially heterogeneous system.
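
As a minimal numerical sketch of this idea (assuming NumPy and an invented synthetic signal, not anything prescribed by the text), the ensemble average can be pictured as an average over a realization axis:
```python
import numpy as np

rng = np.random.default_rng(seed=0)

# Hypothetical synthetic ensemble: every realization shares the same slow
# variation but carries its own turbulent-like fluctuations.
n_realizations, nt = 1000, 200
t = np.linspace(0.0, 3600.0, nt)                   # one hour of "observations"
slow = 2.0 + np.sin(2.0 * np.pi * t / 3600.0)      # slow, deterministic part
a = slow + 0.3 * rng.standard_normal((n_realizations, nt))

a_mean = a.mean(axis=0)    # ensemble average: still a function of time
a_prime = a - a_mean       # perturbation of each individual realization

# By construction, averaging the perturbations over the ensemble gives zero.
print(np.allclose(a_prime.mean(axis=0), 0.0))      # True
```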
In practice, the ensemble mean can rarely be used. In field observations we have only one realization, and in laboratory experiments or computer simulations producing a sufficiently large ensemble is often unattainable. Therefore, we often resort to a **time average** and/or a **horizontal spatial average** to separate variables into means and perturbations. In doing so, we rely on the assumption of ergodicity.
> [!definition] Ergodicity
> *At any given point the time average of a variable equals the horizontal spatial average as well as the ensemble average.*

Often, we need to resort to a less strict requirement and assume this equivalence only in time or only in space.
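
A rough illustration of ergodicity, again as a hedged sketch with synthetic, statistically stationary and horizontally homogeneous data (all names and numbers are invented): under these conditions the three averages converge to the same value.
```python
import numpy as np

rng = np.random.default_rng(seed=1)

# Synthetic field w(realization, x, t): a constant true mean of 2.0 plus
# stationary, homogeneous fluctuations, so ergodicity should (approximately) hold.
n_ens, nx, nt = 500, 128, 1024
w = 2.0 + 0.5 * rng.standard_normal((n_ens, nx, nt))

ensemble_avg = w[:, 0, 0].mean()   # average over realizations at one (x, t)
time_avg = w[0, 0, :].mean()       # average over time for one realization at one x
space_avg = w[0, :, 0].mean()      # average over x for one realization at one t

# All three estimates scatter closely around the true mean of 2.0.
print(ensemble_avg, time_avg, space_avg)
```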
## Rules to remember
In the computation of averages, we require that the averaging operator fulfills the following rules
> [!example] Averaging rules
> In the rules, we assume $u, w$ to be variables, and $c$ a constant.
> 1. The average of a constant is the constant
> $\overline{c} = c$
> 1. The average of an average is the average
> $\overline{\overline{u}} = \overline{u}$
> 1. The average of a constant times a variable is the constant times the average
> $\overline{c\,u} = c\,\overline{u}$
> 1. The average of a sum is the sum of the averages
> $\overline{u + w} = \overline{u} + \overline{w}$
> ^eq-sum-avg
> 1. The average of the product of an average and a variable is the product of the averages
> $\overline{\overline{u}\,w} = \overline{u}\,\overline{w}$
> 1. The average of a (partial) derivative is the (partial) derivative of the average
> $\overline{\dfrac{\partial u}{\partial t}} = \dfrac{\partial \overline{u}}{\partial t}$
^box-rules
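
The linearity behind these rules can be checked numerically. The sketch below (NumPy, with invented sample values) uses a plain sample mean as the averaging operator and tests rules 3, 4, and 5:
```python
import numpy as np

rng = np.random.default_rng(seed=2)

# An ensemble of values of u and w at a single point and time, plus a constant c.
n = 100_000
u = 1.0 + rng.standard_normal(n)
w = -0.5 + rng.standard_normal(n)
c = 3.0

def avg(a):
    """Averaging operator: here simply the ensemble (sample) mean."""
    return np.mean(a)

print(np.isclose(avg(c * u), c * avg(u)))            # rule 3: constant times a variable
print(np.isclose(avg(u + w), avg(u) + avg(w)))       # rule 4: sum of variables
print(np.isclose(avg(avg(u) * w), avg(u) * avg(w)))  # rule 5: average times a variable
# Rules 1 and 2 follow the same pattern; rule 6 additionally needs a
# time-dependent signal so that the derivative can be formed.
```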
From the rules above, we can derive two essential properties for the analysis of turbulence
1. The average of the perturbations is zero:
$\overline{u^\prime} = 0$
2. The average of the product of two variables is the product of the averages plus the covariance of their perturbations:
$\overline{u\,w} = \overline{u}\,\overline{w} + \overline{u^\prime w^\prime}$ ^eq-flux-rule
In particular, the [[Reynolds decomposition#^eq-flux-rule|latter rule]] deserves attention, because it is in this rule that the turbulent flux arises.
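
To see the flux rule at work numerically, here is a hedged sketch with synthetic, deliberately correlated fluctuations (variable names and magnitudes are invented, not taken from the text):
```python
import numpy as np

rng = np.random.default_rng(seed=3)

# Correlated fluctuations: u' and w' share a common random component, so the
# covariance term (the turbulent flux) does not vanish.
n = 200_000
common = rng.standard_normal(n)
u = 5.0 + common + 0.5 * rng.standard_normal(n)         # e.g. a horizontal velocity
w = 0.1 + 0.8 * common + 0.5 * rng.standard_normal(n)   # e.g. the vertical velocity

u_prime = u - u.mean()
w_prime = w - w.mean()

lhs = np.mean(u * w)                                     # average of the product
rhs = u.mean() * w.mean() + np.mean(u_prime * w_prime)   # product of averages + covariance
print(np.isclose(lhs, rhs))   # True: the identity holds exactly for sample means
```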
## Exercises
> [!question] Exercise
> Apply Reynolds decomposition to the expression below, and discuss the meaning of the terms that remain
> $\overline{w^2}$

> [!success]- Answer
> Application of Reynolds decomposition to $w$ in the expression in the question gives
> $$\begin{align}
> \overline{w^2} &= \overline{ \left( \overline{w} + w^\prime \right)^2 }\\
> &= \overline{\overline{w}^2 + 2\,\overline{w}\,w^\prime + w^{\prime 2}}
> \end{align}$$
> Then the application of [[Reynolds decomposition#^box-rules | rules 3, 4, and 5]] gives
> $$\begin{align}
> \overline{w^2} &= \overline{\overline{w}^2} + \overline{2\,\overline{w}\,w^\prime} + \overline{w^{\prime 2}}\\
> &= \overline{w}^2 + 2\,\overline{w}\,\overline{w^\prime} + \overline{w^{\prime 2}}
> \end{align}$$
> The second term contains the average of a perturbation, and hence vanishes, leading to the final result
> $$\begin{align}
> \overline{w^2} &= \overline{w}^2 + \overline{w^{\prime 2}}
> \end{align}$$
> showing that the mean of the square of a variable is the sum of its squared mean and its variance.
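
This result can also be checked with a quick numerical sketch (NumPy, an invented sample of $w$):
```python
import numpy as np

rng = np.random.default_rng(seed=4)

# Invented sample of vertical velocity w with a nonzero mean and fluctuations.
w = 0.2 + 0.7 * rng.standard_normal(50_000)

w_mean = w.mean()
w_prime = w - w_mean

lhs = np.mean(w**2)
rhs = w_mean**2 + np.mean(w_prime**2)   # squared mean plus the variance
print(np.isclose(lhs, rhs))             # True
```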