# Signals

## Definitions

> *Definition*: a signal is a function of space and time.
>
> * Output can be analog or quantised.
> * Input can be continuous or discrete.
> *Definition*: a signal can be sampled at particular moments $k T_s$ in time, with $k \in \mathbb{Z}$ and $T_s \in \mathbb{R}$ the sampling period. A signal $f: \mathbb{R} \to \mathbb{R}$ sampled with sampling period $T_s$ may be denoted by
>
> $$
> f[k] = f(kT_s), \qquad \forall k \in \mathbb{Z}.
> $$
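To make the notation concrete, the following minimal sketch samples a continuous-time signal at the instants $k T_s$; the particular signal $f$ and the value of $T_s$ are illustrative assumptions, not part of the definition.

```python
import numpy as np

# Continuous-time signal to sample; the choice of f and T_s is an arbitrary
# illustration, not part of the definition.
def f(t):
    return np.sin(2 * np.pi * t)

T_s = 0.1                  # sampling period T_s (assumed value)
k = np.arange(-5, 6)       # a finite range of sample indices k

f_k = f(k * T_s)           # sampled signal f[k] = f(k * T_s)

for ki, fki in zip(k, f_k):
    print(f"f[{ki:+d}] = f({ki * T_s:+.1f}) = {fki:+.3f}")
```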
> *Definition*: signal transformations on a function $x: \mathbb{R} \to \mathbb{R}$ obtaining the function $y: \mathbb{R} \to \mathbb{R}$ are given by
>
> | Signal transformation | Time | Amplitude |
> | :-: | :-: | :-: |
> | Reversal | $y(t) = x(-t)$ | $y(t) = -x(t)$ |
> | Scaling | $y(t) = x(at)$ | $y(t) = ax(t)$ |
> | Shifting | $y(t) = x(t - b)$ | $y(t) = x(t) + b$ |
>
> for all $t \in \mathbb{R}$.

For sampled signals similar definitions hold.

### Symmetry

> *Definition*: consider a signal $f: \mathbb{R} \to \mathbb{R}$ defined on an interval that is symmetric around $t = 0$. We define
>
> * $f$ is *even* if $f(t) = f(-t)$, $\forall t \in \mathbb{R}$.
> * $f$ is *odd* if $f(t) = -f(-t)$, $\forall t \in \mathbb{R}$.

For sampled signals similar definitions hold.

> *Theorem*: every signal can be decomposed into the sum of an even part and an odd part.

??? note "*Proof*:"

    Will be added later.

### Periodicity

> *Definition*: a signal $f: \mathbb{R} \to \mathbb{R}$ is periodic with period $T \in \mathbb{R}$ if and only if
>
> $$
> f(t + T) = f(t), \qquad \forall t \in \mathbb{R}.
> $$

For sampled signals similar definitions hold.

> *Theorem*: the sum of two periodic signals with periods $T_1, T_2 \in \mathbb{R}$ respectively is periodic if and only if
>
> $$
> \frac{T_1}{T_2} \in \mathbb{Q}.
> $$

??? note "*Proof*:"

    Will be added later.

### Signals

> *Definition*: the Heaviside step signal $u: \mathbb{R} \to \mathbb{R}$ is defined by
>
> $$
> u(t) = \begin{cases} 1 &\text{ if } t > 0,\\ 0 &\text{ if } t < 0,\end{cases}
> $$
>
> for all $t \in \mathbb{R}$.

For sampled signals the Heaviside step signal is given by

$$
u[k] = \begin{cases} 1 &\text{ if } k \geq 0, \\ 0 &\text{ if } k < 0, \end{cases}
$$

for all $k \in \mathbb{Z}$.

> *Definition*: the rectangular signal $\text{rect}: \mathbb{R} \to \mathbb{R}$ is defined by
>
> $$
> \text{rect} (t) = \begin{cases} 1 &\text{ if } |t| < \frac{1}{2}, \\ 0 &\text{ if } |t| > \frac{1}{2},\end{cases}
> $$
>
> for all $t \in \mathbb{R}$.

The rect signal can be normalised, obtaining the scaled rectangular signal $D: \mathbb{R} \times \mathbb{R} \to \mathbb{R}$ defined by

$$
D(t, \varepsilon) = \begin{cases} \frac{1}{\varepsilon} &\text{ if } |t| < \frac{\varepsilon}{2},\\ 0 &\text{ if } |t| > \frac{\varepsilon}{2},\end{cases}
$$

for all $t \in \mathbb{R}$. Applying the scaled rectangular signal $D$ to a signal $f: \mathbb{R} \to \mathbb{R}$ and letting $\varepsilon$ tend to zero yields

$$
\lim_{\varepsilon \;\downarrow\; 0} \int_{-\infty}^{\infty} f(t) D(t, \varepsilon)dt = \lim_{\varepsilon \;\downarrow\; 0} \frac{1}{\varepsilon} \int_{-\frac{\varepsilon}{2}}^{\frac{\varepsilon}{2}} f(t) dt = f(0),
$$

using the [mean value theorem for integrals](../../../mathematics/calculus/integration.md#the-mean-value-theorem-for-integrals). This limiting behaviour motivates the following generalized signal.

> *Definition*: the Dirac signal $\delta$ is a generalized signal defined by the properties
>
> $$
> \begin{align*}
> \delta(t - t_0) &= 0 \quad \text{ for } t \neq t_0, \\
> \int_{-\infty}^\infty f(t) \delta(t - t_0) dt &= f(t_0),
> \end{align*}
> $$
>
> for a signal $f: \mathbb{R} \to \mathbb{R}$ continuous at $t_0$.

For sampled signals the $\delta$ signal is given by

$$
\delta[k] = \begin{cases} 1 &\text{ if } k = 0, \\ 0 &\text{ if } k \neq 0.\end{cases}
$$
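The limit above can also be checked numerically. The sketch below approximates $\int f(t) D(t, \varepsilon) dt$, i.e. the average of $f$ over $[-\tfrac{\varepsilon}{2}, \tfrac{\varepsilon}{2}]$, for shrinking $\varepsilon$; the choice of $f$ and the midpoint quadrature are assumptions made purely for illustration.

```python
import numpy as np

# Midpoint-rule approximation of  ∫ f(t) D(t, eps) dt = (1/eps) ∫_{-eps/2}^{eps/2} f(t) dt,
# which should approach f(0) as eps goes to 0.
def f(t):
    return np.cos(t) + t**3          # arbitrary signal, continuous at t = 0

def average_against_D(f, eps, n=100_000):
    dt = eps / n
    t = -eps / 2 + (np.arange(n) + 0.5) * dt   # midpoints of n subintervals
    return f(t).mean()                          # mean value = (1/eps) * integral

for eps in (1.0, 0.1, 0.01):
    print(f"eps = {eps:>5}: {average_against_D(f, eps):.6f}")

print("f(0)  =", f(0))               # the averages converge to f(0) = 1
```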
## Signal sampling

We already established that a signal $f: \mathbb{R} \to \mathbb{R}$ can be sampled with a sampling period $T_s \in \mathbb{R}$, obtaining $f[k] = f(kT_s)$ for all $k \in \mathbb{Z}$.

We can also define a *time-continuous* signal $f_s: \mathbb{R} \to \mathbb{R}$ that represents the sampled signal using the Dirac signal, obtaining

$$
f_s(t) = f(t) \sum_{k = - \infty}^\infty \delta(t - k T_s), \qquad \forall t \in \mathbb{R}.
$$

> *Definition*: the sampling signal or impulse train $\delta_{T_s}: \mathbb{R} \to \mathbb{R}$ is defined as
>
> $$
> \delta_{T_s}(t) = \sum_{k = - \infty}^\infty \delta(t - k T_s)
> $$
>
> for all $t \in \mathbb{R}$, with sampling period $T_s \in \mathbb{R}$.

Integration then works out as expected, since

$$
\int_{-\infty}^\infty f(t) \delta_{T_s}(t) dt = \sum_{k = -\infty}^\infty \int_{-\infty}^\infty f(t) \delta(t - k T_s) dt = \sum_{k = -\infty}^\infty f[k],
$$

by the definition of the Dirac signal.

## Convolutions

> *Definition*: let $f,g: \mathbb{R} \to \mathbb{R}$ be two continuous signals, then the convolution product is defined as
>
> $$
> f(t) * g(t) = \int_{-\infty}^\infty f(u)g(t-u)du
> $$
>
> for all $t \in \mathbb{R}$.
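As a rough numerical illustration of this definition, the convolution integral can be approximated by a Riemann sum on a truncated grid; the signals (here $\text{rect}$ convolved with itself), the integration limits and the step size are illustrative assumptions.

```python
import numpy as np

def rect(t):
    # rectangular signal: 1 for |t| < 1/2, 0 otherwise
    return np.where(np.abs(t) < 0.5, 1.0, 0.0)

def convolve_at(f, g, t, du=1e-3, limit=5.0):
    # Riemann-sum approximation of (f * g)(t) = ∫ f(u) g(t - u) du,
    # with the integral truncated to [-limit, limit)
    u = np.arange(-limit, limit, du)
    return np.sum(f(u) * g(t - u)) * du

for t in (0.0, 0.5, 1.0, 1.5):
    print(t, convolve_at(rect, rect, t))
# (rect * rect)(t) is the triangle max(1 - |t|, 0): roughly 1.0, 0.5, 0.0, 0.0
```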
> *Proposition*: the convolution product is commutative, distributive and associative.

??? note "*Proof*:"

    Will be added later.

> *Theorem*: let $f: \mathbb{R} \to \mathbb{R}$ be a signal, then for the convolution product between $f$ and the Dirac signal $\delta$ with some $t_0 \in \mathbb{R}$ we have
>
> $$
> f(t) * \delta(t - t_0) = f(t - t_0)
> $$
>
> for all $t \in \mathbb{R}$.

??? note "*Proof*:"

    Let $f: \mathbb{R} \to \mathbb{R}$ be a signal and $t_0 \in \mathbb{R}$, then using the definition of the Dirac signal

    $$
    f(t) * \delta(t - t_0) = \int_{-\infty}^\infty f(u) \delta(t - t_0 - u)du = f(t - t_0),
    $$

    for all $t \in \mathbb{R}$.

In particular $f(t) * \delta(t) = f(t)$ for all $t \in \mathbb{R}$; $\delta$ is the unity of the convolution.

The average value of a signal $f: \mathbb{R} \to \mathbb{R}$ over an interval of length $\varepsilon \in \mathbb{R}$ may be given by

$$
f(t) * D(t, \varepsilon) = \frac{1}{\varepsilon} \int_{t - \frac{\varepsilon}{2}}^{t + \frac{\varepsilon}{2}} f(u)du.
$$

For sampled/discrete signals we have a similar definition of the convolution product, given by

$$
f[k] * g[k] = \sum_{m = -\infty}^{\infty} f[m]g[k - m],
$$

for all $k \in \mathbb{Z}$.

## Correlations

> *Definition*: let $f,g: \mathbb{R} \to \mathbb{R}$ be two continuous signals, then the cross-correlation is defined as
>
> $$
> f(t) \star g(t) = \int_{-\infty}^\infty f(u) g(t + u)du
> $$
>
> for all $t \in \mathbb{R}$.

In particular the auto-correlation of a continuous signal $f: \mathbb{R} \to \mathbb{R}$, given by $f(t) \star f(t)$ for all $t \in \mathbb{R}$, is useful as it can detect periodicity. This is proved in the section [Fourier series](fourier-series.md).

For sampled/discrete signals a similar definition exists, given by

$$
f[k] \star g[k] = \sum_{m = -\infty}^{\infty} f[m]g[k + m],
$$

for all $k \in \mathbb{Z}$.
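As an illustration of how the auto-correlation can reveal periodicity, the sketch below implements the discrete cross-correlation sum directly (treating the signals as zero outside their finite support) and applies it to a noisy periodic signal; the signal, its period and the noise level are arbitrary choices for demonstration.

```python
import numpy as np

def cross_correlate(f, g, lags):
    # discrete cross-correlation (f ⋆ g)[k] = Σ_m f[m] g[k + m],
    # with f and g of equal length and taken as zero outside their support
    n = len(f)
    out = []
    for k in lags:
        m = np.arange(max(0, -k), min(n, n - k))   # indices where both f[m] and g[m + k] exist
        out.append(np.sum(f[m] * g[m + k]))
    return np.array(out)

rng = np.random.default_rng(0)
k = np.arange(400)
f = np.sin(2 * np.pi * k / 25) + 0.5 * rng.standard_normal(k.size)  # noisy signal, period 25

acf = cross_correlate(f, f, np.arange(0, 100))      # auto-correlation for lags 0..99

# large positive values at multiples of the period (lags 25, 50),
# strongly negative values near odd half-periods (lags 12, 37)
for lag in (12, 25, 37, 50):
    print(lag, round(float(acf[lag]), 1))
```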