In probability theory and statistics, a cross-covariance matrix is a matrix whose element in the i, j position is the covariance between the i-th element of a random vector and the j-th element of another random vector. A random vector is a random variable with multiple dimensions. Each element of the vector is a scalar random variable. Each element has either a finite number of observed empirical values or a finite or infinite number of potential values. The potential values are specified by a theoretical joint probability distribution. Intuitively, the cross-covariance matrix generalizes the notion of covariance to multiple dimensions.
The cross-covariance matrix of two random vectors $\mathbf{X}$ and $\mathbf{Y}$ is typically denoted by $\operatorname{K}_{\mathbf{X}\mathbf{Y}}$ or $\Sigma_{\mathbf{X}\mathbf{Y}}$.
Definition
For random vectors $\mathbf{X}$ and $\mathbf{Y}$, each containing random elements whose expected value and variance exist, the cross-covariance matrix of $\mathbf{X}$ and $\mathbf{Y}$ is defined by[1]: 336

$$\operatorname{K}_{\mathbf{X}\mathbf{Y}}=\operatorname{cov}(\mathbf{X},\mathbf{Y})\ {\stackrel{\mathrm{def}}{=}}\ \operatorname{E}[(\mathbf{X}-\mathbf{\mu_{X}})(\mathbf{Y}-\mathbf{\mu_{Y}})^{\mathrm{T}}] \qquad \text{(Eq.1)}$$
where $\mathbf{\mu_{X}}=\operatorname{E}[\mathbf{X}]$ and $\mathbf{\mu_{Y}}=\operatorname{E}[\mathbf{Y}]$ are vectors containing the expected values of $\mathbf{X}$ and $\mathbf{Y}$. The vectors $\mathbf{X}$ and $\mathbf{Y}$ need not have the same dimension, and either might be a scalar value.
The cross-covariance matrix is the matrix whose $(i,j)$ entry is the covariance

$$\operatorname{K}_{X_{i}Y_{j}}=\operatorname{cov}[X_{i},Y_{j}]=\operatorname{E}[(X_{i}-\operatorname{E}[X_{i}])(Y_{j}-\operatorname{E}[Y_{j}])]$$

between the i-th element of $\mathbf{X}$ and the j-th element of $\mathbf{Y}$. This gives the following component-wise definition of the cross-covariance matrix.

$$\operatorname{K}_{\mathbf{X}\mathbf{Y}}={\begin{bmatrix}\operatorname{E}[(X_{1}-\operatorname{E}[X_{1}])(Y_{1}-\operatorname{E}[Y_{1}])]&\operatorname{E}[(X_{1}-\operatorname{E}[X_{1}])(Y_{2}-\operatorname{E}[Y_{2}])]&\cdots &\operatorname{E}[(X_{1}-\operatorname{E}[X_{1}])(Y_{n}-\operatorname{E}[Y_{n}])]\\\operatorname{E}[(X_{2}-\operatorname{E}[X_{2}])(Y_{1}-\operatorname{E}[Y_{1}])]&\operatorname{E}[(X_{2}-\operatorname{E}[X_{2}])(Y_{2}-\operatorname{E}[Y_{2}])]&\cdots &\operatorname{E}[(X_{2}-\operatorname{E}[X_{2}])(Y_{n}-\operatorname{E}[Y_{n}])]\\\vdots &\vdots &\ddots &\vdots \\\operatorname{E}[(X_{m}-\operatorname{E}[X_{m}])(Y_{1}-\operatorname{E}[Y_{1}])]&\operatorname{E}[(X_{m}-\operatorname{E}[X_{m}])(Y_{2}-\operatorname{E}[Y_{2}])]&\cdots &\operatorname{E}[(X_{m}-\operatorname{E}[X_{m}])(Y_{n}-\operatorname{E}[Y_{n}])]\end{bmatrix}}$$
Example
For example, if $\mathbf{X}=\left(X_{1},X_{2},X_{3}\right)^{\mathrm{T}}$ and $\mathbf{Y}=\left(Y_{1},Y_{2}\right)^{\mathrm{T}}$ are random vectors, then $\operatorname{cov}(\mathbf{X},\mathbf{Y})$ is a $3\times 2$ matrix whose $(i,j)$-th entry is $\operatorname{cov}(X_{i},Y_{j})$.
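As an illustration, a cross-covariance matrix of this shape can be estimated from samples with NumPy. The latent-variable construction below (a shared component $z$) is a hypothetical example chosen so that the entries have easy-to-check values; it is a sketch of the definition $\operatorname{E}[(\mathbf{X}-\mathbf{\mu_{X}})(\mathbf{Y}-\mathbf{\mu_{Y}})^{\mathrm{T}}]$ applied to sample data, not part of the original article.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical sample data: n draws of a 3-vector X and a 2-vector Y that
# share a common latent component z, so some pairs (X_i, Y_j) are correlated.
n = 10_000
z = rng.normal(size=n)
X = np.stack([z + rng.normal(size=n),        # cov(X_1, Y_1) should be ~1
              2 * z + rng.normal(size=n),    # cov(X_2, Y_1) should be ~2
              rng.normal(size=n)])           # X_3 independent of Y
Y = np.stack([z + rng.normal(size=n),
              -z + rng.normal(size=n)])

# Sample estimate of K_XY = E[(X - mu_X)(Y - mu_Y)^T]: center each row
# (component) at its sample mean, then average the outer products.
Xc = X - X.mean(axis=1, keepdims=True)       # shape (3, n)
Yc = Y - Y.mean(axis=1, keepdims=True)       # shape (2, n)
K_XY = Xc @ Yc.T / n                         # shape (3, 2)

# Entry (i, j) of K_XY estimates the scalar covariance cov(X_i, Y_j).
print(K_XY.shape)  # (3, 2)
```

With this construction, `K_XY[0, 0]` comes out near 1, `K_XY[1, 0]` near 2, and the third row near zero, matching the covariances built into the sample data.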
Properties
For the cross-covariance matrix, the following basic properties apply:[2]

- $\operatorname{cov}(\mathbf{X},\mathbf{Y})=\operatorname{E}[\mathbf{X}\mathbf{Y}^{\mathrm{T}}]-\mathbf{\mu_{X}}\mathbf{\mu_{Y}}^{\mathrm{T}}$
- $\operatorname{cov}(\mathbf{X},\mathbf{Y})=\operatorname{cov}(\mathbf{Y},\mathbf{X})^{\mathrm{T}}$
- $\operatorname{cov}(\mathbf{X_{1}}+\mathbf{X_{2}},\mathbf{Y})=\operatorname{cov}(\mathbf{X_{1}},\mathbf{Y})+\operatorname{cov}(\mathbf{X_{2}},\mathbf{Y})$
- $\operatorname{cov}(A\mathbf{X}+\mathbf{a},B^{\mathrm{T}}\mathbf{Y}+\mathbf{b})=A\,\operatorname{cov}(\mathbf{X},\mathbf{Y})\,B$
- If $\mathbf{X}$ and $\mathbf{Y}$ are independent (or somewhat less restrictedly, if every random variable in $\mathbf{X}$ is uncorrelated with every random variable in $\mathbf{Y}$), then $\operatorname{cov}(\mathbf{X},\mathbf{Y})=0_{p\times q}$
where $\mathbf{X}$, $\mathbf{X_{1}}$ and $\mathbf{X_{2}}$ are random $p\times 1$ vectors, $\mathbf{Y}$ is a random $q\times 1$ vector, $\mathbf{a}$ is a $p\times 1$ vector, $\mathbf{b}$ is a $q\times 1$ vector, $A$ and $B$ are $p\times p$ and $q\times q$ matrices of constants, and $0_{p\times q}$ is a $p\times q$ matrix of zeroes.
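These properties can be checked numerically. The sketch below (hypothetical data; the helper `cross_cov` and the constants `A`, `a`, `B`, `b` are illustrative names, not from the article) verifies the expectation identity, the transpose symmetry, and the affine-transformation rule on sample cross-covariances, where they hold exactly as matrix identities:

```python
import numpy as np

rng = np.random.default_rng(1)

def cross_cov(X, Y):
    """Sample cross-covariance; rows are components, columns are draws."""
    Xc = X - X.mean(axis=1, keepdims=True)
    Yc = Y - Y.mean(axis=1, keepdims=True)
    return Xc @ Yc.T / X.shape[1]

p, q, n = 3, 2, 5_000
X = rng.normal(size=(p, n))
Y = X[:q] + rng.normal(size=(q, n))   # make Y correlated with X

# cov(X, Y) = E[X Y^T] - mu_X mu_Y^T
lhs = cross_cov(X, Y)
rhs = (X @ Y.T) / n - np.outer(X.mean(axis=1), Y.mean(axis=1))
assert np.allclose(lhs, rhs)

# cov(X, Y) = cov(Y, X)^T
assert np.allclose(cross_cov(X, Y), cross_cov(Y, X).T)

# cov(A X + a, B^T Y + b) = A cov(X, Y) B  for constant A (p x p),
# a (p x 1), B (q x q), b (q x 1): centering removes a and b, and the
# constant matrices factor out of the outer products.
A = rng.normal(size=(p, p)); a = rng.normal(size=(p, 1))
B = rng.normal(size=(q, q)); b = rng.normal(size=(q, 1))
assert np.allclose(cross_cov(A @ X + a, B.T @ Y + b),
                   A @ cross_cov(X, Y) @ B)
```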
Definition for complex random vectors
If $\mathbf{Z}$ and $\mathbf{W}$ are complex random vectors, the definition of the cross-covariance matrix is slightly changed. Transposition is replaced by Hermitian transposition:

$$\operatorname{K}_{\mathbf{Z}\mathbf{W}}=\operatorname{cov}(\mathbf{Z},\mathbf{W})\ {\stackrel{\mathrm{def}}{=}}\ \operatorname{E}[(\mathbf{Z}-\mathbf{\mu_{Z}})(\mathbf{W}-\mathbf{\mu_{W}})^{\mathrm{H}}]$$

For complex random vectors, another matrix called the pseudo-cross-covariance matrix is defined as follows:

$$\operatorname{J}_{\mathbf{Z}\mathbf{W}}=\operatorname{cov}(\mathbf{Z},{\overline{\mathbf{W}}})\ {\stackrel{\mathrm{def}}{=}}\ \operatorname{E}[(\mathbf{Z}-\mathbf{\mu_{Z}})(\mathbf{W}-\mathbf{\mu_{W}})^{\mathrm{T}}]$$
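The difference between the two matrices is just the conjugation of the second factor. A minimal sketch with hypothetical complex sample data: for circularly symmetric ("proper") vectors like the ones constructed below, the pseudo-cross-covariance estimate is near zero while the cross-covariance is not.

```python
import numpy as np

rng = np.random.default_rng(2)

# Hypothetical complex random vectors: Z is 2-dimensional with independent
# standard-normal real and imaginary parts; W is Z plus independent noise.
n = 4_000
Z = rng.normal(size=(2, n)) + 1j * rng.normal(size=(2, n))
W = Z + rng.normal(size=(2, n)) + 1j * rng.normal(size=(2, n))

Zc = Z - Z.mean(axis=1, keepdims=True)
Wc = W - W.mean(axis=1, keepdims=True)

# Cross-covariance: Hermitian (conjugate) transpose of the second factor.
K_ZW = Zc @ Wc.conj().T / n
# Pseudo-cross-covariance: plain transpose, no conjugation.
J_ZW = Zc @ Wc.T / n

# Here E[Z_i conj(W_i)] = E[|Z_i|^2] = 2, while E[Z_i W_i] = E[Z_i^2] = 0
# because the real and imaginary parts are independent with equal variance,
# so K_ZW has large diagonal entries and J_ZW is near the zero matrix.
```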
Uncorrelatedness
Two random vectors $\mathbf{X}$ and $\mathbf{Y}$ are called uncorrelated if their cross-covariance matrix $\operatorname{K}_{\mathbf{X}\mathbf{Y}}$ is a zero matrix.[1]: 337

Complex random vectors $\mathbf{Z}$ and $\mathbf{W}$ are called uncorrelated if their cross-covariance matrix and pseudo-cross-covariance matrix are both zero, i.e. if $\operatorname{K}_{\mathbf{Z}\mathbf{W}}=\operatorname{J}_{\mathbf{Z}\mathbf{W}}=0$.
References
1. Gubner, John A. (2006). Probability and Random Processes for Electrical and Computer Engineers. Cambridge University Press. ISBN 978-0-521-86470-1.
2. Taboga, Marco (2010). "Lectures on probability theory and mathematical statistics".