# Complex random variable

In probability theory and statistics, complex random variables are a generalization of real-valued random variables to complex numbers, i.e. the possible values a complex random variable may take are complex numbers.[1] Complex random variables can always be considered as pairs of real random variables: their real and imaginary parts. Therefore, the distribution of one complex random variable may be interpreted as the joint distribution of two real random variables.

Some concepts of real random variables have a straightforward generalization to complex random variables—e.g., the definition of the mean of a complex random variable. Other concepts are unique to complex random variables.

Applications of complex random variables are found in digital signal processing,[2] quadrature amplitude modulation and information theory.

## Definition

A complex random variable ${\displaystyle Z}$  on the probability space ${\displaystyle (\Omega ,{\mathcal {F}},P)}$  is a function ${\displaystyle Z\colon \Omega \rightarrow \mathbb {C} }$  such that both its real part ${\displaystyle \Re {(Z)}}$  and its imaginary part ${\displaystyle \Im {(Z)}}$  are real random variables on ${\displaystyle (\Omega ,{\mathcal {F}},P)}$ .

## Examples

### Simple example

Consider a random variable that may take only the three complex values ${\displaystyle 1+i,1-i,2}$  with probabilities as specified in the table. This is a simple example of a complex random variable.

| Probability ${\displaystyle P(z)}$ | Value ${\displaystyle z}$ |
|---|---|
| ${\displaystyle {\frac {1}{4}}}$ | ${\displaystyle 1+i}$ |
| ${\displaystyle {\frac {1}{4}}}$ | ${\displaystyle 1-i}$ |
| ${\displaystyle {\frac {1}{2}}}$ | ${\displaystyle 2}$ |

The expectation of this random variable may be simply calculated: ${\displaystyle \operatorname {E} [Z]={\frac {1}{4}}(1+i)+{\frac {1}{4}}(1-i)+{\frac {1}{2}}2={\frac {3}{2}}.}$
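The calculation above can be reproduced as a minimal numerical sketch (assuming Python with NumPy; the code is illustrative, not part of the original article):

```python
import numpy as np

# Values and probabilities from the table above.
vals = np.array([1 + 1j, 1 - 1j, 2 + 0j])
probs = np.array([0.25, 0.25, 0.5])

# E[Z] is the probability-weighted sum of the values.
expectation = np.sum(vals * probs)
print(expectation)  # (1.5+0j)
```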

### Uniform distribution

Another example of a complex random variable is the uniform distribution over the filled unit circle, i.e. the set ${\displaystyle \{z\in \mathbb {C} \mid |z|\leq 1\}}$ . This random variable is an example of a complex random variable for which the probability density function is defined; since the disk has area ${\displaystyle \pi }$ , the density equals ${\displaystyle {\tfrac {1}{\pi }}}$ on the disk and ${\displaystyle 0}$ outside it. The density function is shown as the yellow disk and dark blue base in the following figure.
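This distribution can be simulated by rejection sampling (a sketch assuming Python with NumPy; by symmetry the mean is 0, and a short polar-coordinate integral gives ${\displaystyle \operatorname {E} [|Z|^{2}]={\tfrac {1}{2}}}$ ):

```python
import numpy as np

rng = np.random.default_rng(0)
# Rejection sampling: draw points in the square [-1, 1]^2, keep those in the disk.
xy = rng.uniform(-1, 1, size=(200_000, 2))
inside = xy[np.hypot(xy[:, 0], xy[:, 1]) <= 1]
z = inside[:, 0] + 1j * inside[:, 1]

print(abs(z.mean()))            # close to 0 by symmetry
print(np.mean(np.abs(z) ** 2))  # close to 1/2
```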

### Complex normal distribution

Complex Gaussian random variables are often encountered in applications. They are a straightforward generalization of real Gaussian random variables. The following plot shows an example of the distribution of such a variable.
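A standard circularly symmetric complex Gaussian can be built from two independent real standard normals (a sketch assuming Python with NumPy; the ${\displaystyle 1/{\sqrt {2}}}$ normalization is chosen so that the variance is 1):

```python
import numpy as np

rng = np.random.default_rng(1)
n = 500_000
# Z = (X + iY)/sqrt(2) with X, Y independent standard normals, so Var[Z] = 1.
z = (rng.standard_normal(n) + 1j * rng.standard_normal(n)) / np.sqrt(2)

print(abs(z.mean()))            # close to 0
print(np.mean(np.abs(z) ** 2))  # close to 1 (the variance)
```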

## Cumulative distribution function

The generalization of the cumulative distribution function from real to complex random variables is not obvious because expressions of the form ${\displaystyle P(Z\leq 1+3i)}$  make no sense. However, expressions of the form ${\displaystyle P(\Re {(Z)}\leq 1,\Im {(Z)}\leq 3)}$  do make sense. Therefore, we define the cumulative distribution ${\displaystyle F_{Z}:\mathbb {C} \to [0,1]}$  of a complex random variable via the joint distribution of its real and imaginary parts:

${\displaystyle F_{Z}(z)=F_{\Re {(Z)},\Im {(Z)}}(\Re {(z)},\Im {(z)})=P(\Re {(Z)}\leq \Re {(z)},\Im {(Z)}\leq \Im {(z)})}$

(Eq.1)
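For the three-point distribution from the earlier example, Eq.1 can be evaluated directly (a sketch assuming Python with NumPy):

```python
import numpy as np

vals = np.array([1 + 1j, 1 - 1j, 2 + 0j])
probs = np.array([0.25, 0.25, 0.5])

def cdf(z):
    # Eq.1: P(Re(Z) <= Re(z), Im(Z) <= Im(z))
    mask = (vals.real <= z.real) & (vals.imag <= z.imag)
    return probs[mask].sum()

print(cdf(1 + 1j))  # 0.5  (the values 1+i and 1-i qualify)
print(cdf(2 + 0j))  # 0.75 (the values 1-i and 2 qualify)
```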

## Probability density function

The probability density function of a complex random variable is defined as ${\displaystyle f_{Z}(z)=f_{\Re {(Z)},\Im {(Z)}}(\Re {(z)},\Im {(z)})}$ , i.e. the value of the density function at a point ${\displaystyle z\in \mathbb {C} }$  is defined to be equal to the value of the joint density of the real and imaginary parts of the random variable evaluated at the point ${\displaystyle (\Re {(z)},\Im {(z)})}$ .

An equivalent definition is given by ${\displaystyle f_{Z}(z)={\frac {\partial ^{2}}{\partial x\partial y}}P(\Re {(Z)}\leq x,\Im {(Z)}\leq y)}$  where ${\displaystyle x=\Re {(z)}}$  and ${\displaystyle y=\Im {(z)}}$ .

As in the real case, the density function may not exist.

## Expectation

### Definition

The expectation of a complex random variable is defined in terms of the expectations of its real and imaginary parts:[3]:p. 112

${\displaystyle \operatorname {E} [Z]=\operatorname {E} [\Re {(Z)}]+i\operatorname {E} [\Im {(Z)}]}$

(Eq.2)

Note that the expectation of a complex random variable does not exist if ${\displaystyle \operatorname {E} [\Re {(Z)}]}$  or ${\displaystyle \operatorname {E} [\Im {(Z)}]}$  does not exist.

If the complex random variable ${\displaystyle Z}$  has a probability density function ${\displaystyle f_{Z}(z)}$ , then the expectation is given by ${\displaystyle \operatorname {E} [Z]=\int _{\mathbb {C} }z\cdot f_{Z}(z)dz}$ .

If the complex random variable ${\displaystyle Z}$  has a probability mass function ${\displaystyle p_{Z}(z)}$ , then the expectation is given by ${\displaystyle \operatorname {E} [Z]=\sum _{z}z\cdot p_{Z}(z)}$ , where the sum runs over the values ${\displaystyle z}$  in the support of ${\displaystyle Z}$ .

### Properties

Whenever the expectation of a complex random variable exists, taking the expectation and complex conjugation commute:

${\displaystyle {\overline {\operatorname {E} [Z]}}=\operatorname {E} [{\overline {Z}}].}$

The expected value operator ${\displaystyle \operatorname {E} [\cdot ]}$  is linear in the sense that

${\displaystyle \operatorname {E} [aZ+bW]=a\operatorname {E} [Z]+b\operatorname {E} [W]}$

for any complex coefficients ${\displaystyle a,b}$  even if ${\displaystyle Z}$  and ${\displaystyle W}$  are not independent.
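Both properties can be checked numerically on sampled data (a sketch assuming Python with NumPy; the coefficients and distributions below are illustrative choices):

```python
import numpy as np

rng = np.random.default_rng(2)
n = 100_000
vals = np.array([1 + 1j, 1 - 1j, 2 + 0j])
z = vals[rng.choice(3, p=[0.25, 0.25, 0.5], size=n)]   # the table's distribution
w = rng.standard_normal(n) + 1j * rng.standard_normal(n)
a, b = 2 - 1j, 0.5j

# Linearity: E[aZ + bW] = a E[Z] + b E[W] (holds exactly for sample means).
print(abs(np.mean(a * z + b * w) - (a * np.mean(z) + b * np.mean(w))))

# Conjugation commutes with expectation.
print(abs(np.conj(np.mean(z)) - np.mean(np.conj(z))))
```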

## Variance and pseudo-variance

### Definition variance

The variance is defined as:[3]:p. 117

${\displaystyle \operatorname {Var} [Z]=\operatorname {E} [|Z-\operatorname {E} [Z]|^{2}]=\operatorname {E} [|Z|^{2}]-|\operatorname {E} [Z]|^{2}}$

(Eq.3)

### Properties

The variance is always a nonnegative real number. It is equal to the sum of the variances of the real and imaginary parts of the complex random variable:

${\displaystyle \operatorname {Var} [Z]=\operatorname {Var} [\Re {(Z)}]+\operatorname {Var} [\Im {(Z)}].}$

The variance of a linear combination of complex random variables may be calculated using the following formula:

${\displaystyle \operatorname {Var} \left[\sum _{k=1}^{N}a_{k}Z_{k}\right]=\sum _{i=1}^{N}\sum _{j=1}^{N}a_{i}{\overline {a_{j}}}\operatorname {Cov} [Z_{i},Z_{j}].}$
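The formula can be verified by Monte Carlo for a pair of correlated complex Gaussians (a sketch assuming Python with NumPy; the construction of the two variables is an illustrative choice, and the identity holds exactly for sample moments):

```python
import numpy as np

rng = np.random.default_rng(3)
n = 400_000
x = rng.standard_normal(n) + 1j * rng.standard_normal(n)
z1 = x
z2 = 0.5 * x + rng.standard_normal(n) + 1j * rng.standard_normal(n)
a1, a2 = 1 + 1j, 2 - 1j

def cov(u, v):
    # sample version of Cov[U, V] = E[(U - E[U]) conj(V - E[V])]
    return np.mean((u - u.mean()) * np.conj(v - v.mean()))

lhs = np.var(a1 * z1 + a2 * z2)   # numpy's var of a complex array is E|.|^2-based
rhs = sum(ai * np.conj(aj) * cov(zi, zj)
          for ai, zi in [(a1, z1), (a2, z2)]
          for aj, zj in [(a1, z1), (a2, z2)]).real
print(lhs, rhs)  # nearly equal
```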

### Definition pseudo-variance

The pseudo-variance is a special case of the pseudo-covariance and is given by

${\displaystyle \operatorname {J} _{ZZ}=\operatorname {E} [(Z-\operatorname {E} [Z])^{2}]=\operatorname {E} [Z^{2}]-(\operatorname {E} [Z])^{2}}$

(Eq.4)

Unlike the variance of ${\displaystyle Z}$ , which is always real and nonnegative, the pseudo-variance of ${\displaystyle Z}$  is in general complex.
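The pseudo-variance distinguishes distributions that the variance alone cannot. A sketch (assuming Python with NumPy; both variables below have unit variance, but only the second is circularly symmetric):

```python
import numpy as np

rng = np.random.default_rng(4)
n = 500_000
x = rng.standard_normal(n)
z_real = x + 0j                     # Z concentrated on the real axis
z_circ = (rng.standard_normal(n) + 1j * rng.standard_normal(n)) / np.sqrt(2)

def pseudo_var(z):
    # sample version of E[(Z - E[Z])^2], Eq.4
    return np.mean((z - z.mean()) ** 2)

print(pseudo_var(z_real))       # close to 1: equals the variance here
print(abs(pseudo_var(z_circ)))  # close to 0
```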

## Covariance and pseudo-covariance

### Definition

The covariance between two complex random variables ${\displaystyle Z,W}$  is defined as[3]:p. 119

${\displaystyle \operatorname {K} _{ZW}=\operatorname {Cov} [Z,W]=\operatorname {E} [(Z-\operatorname {E} [Z]){\overline {(W-\operatorname {E} [W])}}]=\operatorname {E} [Z{\overline {W}}]-\operatorname {E} [Z]\operatorname {E} [{\overline {W}}]}$

(Eq.5)

Notice the complex conjugation of the second factor in the definition. In contrast to real random variables, we also define a pseudo-covariance (also called complementary variance):

${\displaystyle \operatorname {J} _{ZW}=\operatorname {Cov} [Z,{\overline {W}}]=\operatorname {E} [(Z-\operatorname {E} [Z])(W-\operatorname {E} [W])]=\operatorname {E} [ZW]-\operatorname {E} [Z]\operatorname {E} [W]}$

(Eq.6)

The second-order statistics are fully characterized by the covariance and the pseudo-covariance.
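The following sketch shows why both quantities are needed (assuming Python with NumPy; taking ${\displaystyle W={\overline {Z}}}$  is an illustrative choice for which the covariance nearly vanishes while the pseudo-covariance does not):

```python
import numpy as np

rng = np.random.default_rng(5)
n = 500_000
z = (rng.standard_normal(n) + 1j * rng.standard_normal(n)) / np.sqrt(2)
w = np.conj(z)   # illustrative choice: W is the conjugate of Z

zc, wc = z - z.mean(), w - w.mean()
K = np.mean(zc * np.conj(wc))   # sample covariance, Eq.5
J = np.mean(zc * wc)            # sample pseudo-covariance, Eq.6
print(abs(K))  # close to 0
print(abs(J))  # close to 1
```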

### Properties

The covariance has the following properties:

• ${\displaystyle \operatorname {Cov} [Z,W]={\overline {\operatorname {Cov} [W,Z]}}}$  (Conjugate symmetry)
• ${\displaystyle \operatorname {Cov} [\alpha Z,W]=\alpha \operatorname {Cov} [Z,W]}$  (Sesquilinearity)
• ${\displaystyle \operatorname {Cov} [Z,\alpha W]={\overline {\alpha }}\operatorname {Cov} [Z,W]}$
• ${\displaystyle \operatorname {Cov} [Z_{1}+Z_{2},W]=\operatorname {Cov} [Z_{1},W]+\operatorname {Cov} [Z_{2},W]}$
• ${\displaystyle \operatorname {Cov} [Z,W_{1}+W_{2}]=\operatorname {Cov} [Z,W_{1}]+\operatorname {Cov} [Z,W_{2}]}$
• ${\displaystyle \operatorname {Cov} [Z,Z]={\operatorname {Var} [Z]}}$

### Uncorrelatedness

Two complex random variables ${\displaystyle Z}$  and ${\displaystyle W}$  are called uncorrelated if

${\displaystyle \operatorname {K} _{ZW}=\operatorname {J} _{ZW}=0.}$

### Orthogonality

Two complex random variables ${\displaystyle Z}$  and ${\displaystyle W}$  are called orthogonal if

${\displaystyle \operatorname {E} [Z{\overline {W}}]=0}$ .

## Circular symmetry

Circular symmetry of complex random variables is a common assumption used in the field of wireless communication. A typical example of a circularly symmetric complex random variable is the complex Gaussian random variable with zero mean and zero pseudo-variance.

### Definition

A complex random variable ${\displaystyle Z}$  is circularly symmetric if, for any deterministic ${\displaystyle \phi \in [-\pi ,\pi ]}$ , the distribution of ${\displaystyle e^{\mathrm {i} \phi }Z}$  equals the distribution of ${\displaystyle Z}$ .

### Properties

By definition, a circularly symmetric complex random variable has

${\displaystyle \operatorname {E} [Z]=\operatorname {E} [e^{\mathrm {i} \phi }Z]=e^{\mathrm {i} \phi }\operatorname {E} [Z]}$  for any ${\displaystyle \phi }$ .

Thus the expectation of a circularly symmetric complex random variable can only be either zero or undefined.

${\displaystyle \operatorname {E} [ZZ]=\operatorname {E} [e^{\mathrm {i} \phi }Ze^{\mathrm {i} \phi }Z]=e^{2\mathrm {i} \phi }\operatorname {E} [ZZ]}$  for any ${\displaystyle \phi }$ .

Thus the pseudo-variance of a circularly symmetric complex random variable can only be zero.

If ${\displaystyle Z}$  and ${\displaystyle e^{\mathrm {i} \phi }Z}$  have the same distribution, the phase of ${\displaystyle Z}$  must be uniformly distributed over ${\displaystyle [-\pi ,\pi ]}$  and independent of the amplitude of ${\displaystyle Z}$ .[4]
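Both consequences can be observed on samples of a circularly symmetric Gaussian (a sketch assuming Python with NumPy):

```python
import numpy as np

rng = np.random.default_rng(6)
n = 500_000
z = (rng.standard_normal(n) + 1j * rng.standard_normal(n)) / np.sqrt(2)

# The phase of Z should be approximately uniform on [-pi, pi] ...
hist, _ = np.histogram(np.angle(z), bins=8, range=(-np.pi, np.pi))
print(hist / n)  # every bin close to 1/8

# ... and the sample pseudo-variance should be close to zero.
print(abs(np.mean(z ** 2)))
```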

## Proper complex random variables

The concept of proper random variables is unique to complex random variables and has no corresponding concept for real random variables.

### Definition

A complex random variable ${\displaystyle Z}$  is called proper if the following three conditions are all satisfied:

• ${\displaystyle \operatorname {E} [Z]=0}$
• ${\displaystyle \operatorname {Var} [Z]<\infty }$
• ${\displaystyle \operatorname {E} [Z^{2}]=0}$

This definition is equivalent to the following characterization: a complex random variable ${\displaystyle Z}$  is proper if and only if:

• ${\displaystyle \operatorname {E} [Z]=0}$
• ${\displaystyle \operatorname {E} [\Re {(Z)}^{2}]=\operatorname {E} [\Im {(Z)}^{2}]<\infty }$
• ${\displaystyle \operatorname {E} [\Re {(Z)}\Im {(Z)}]=0}$

### Covariance matrix of the real and imaginary parts

For a general complex random variable, the pair ${\displaystyle (\Re {(Z)},\Im {(Z)})}$  has the covariance matrix

${\displaystyle {\begin{bmatrix}\operatorname {Var} [\Re {(Z)}]&\operatorname {Cov} [\Re {(Z)},\Im {(Z)}]\\\operatorname {Cov} [\Re {(Z)},\Im {(Z)}]&\operatorname {Var} [\Im {(Z)}]\end{bmatrix}}.}$

However, for a proper complex random variable, the covariance matrix of the pair ${\displaystyle (\Re {(Z)},\Im {(Z)})}$  has the following simple form:

${\displaystyle {\begin{bmatrix}{\frac {1}{2}}\operatorname {Var} [Z]&0\\0&{\frac {1}{2}}\operatorname {Var} [Z]\end{bmatrix}}}$ .
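This simple form can be checked numerically for a proper variable, e.g. a circularly symmetric Gaussian with unit variance (a sketch assuming Python with NumPy):

```python
import numpy as np

rng = np.random.default_rng(7)
n = 500_000
# Proper by construction: zero mean, finite variance, zero pseudo-variance.
z = (rng.standard_normal(n) + 1j * rng.standard_normal(n)) / np.sqrt(2)

C = np.cov(z.real, z.imag)                  # 2x2 covariance matrix of the parts
var_z = np.mean(np.abs(z - z.mean()) ** 2)  # sample variance of Z
print(C)          # close to [[0.5, 0], [0, 0.5]], i.e. (Var[Z]/2) * identity
print(var_z / 2)  # close to 0.5
```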

### Theorem

Every circularly symmetric complex random variable with finite variance is proper.

## Cauchy–Schwarz inequality

The Cauchy–Schwarz inequality for complex random variables, which can be derived using the triangle inequality and Hölder's inequality, is

${\displaystyle \left|\operatorname {E} \left[Z{\overline {W}}\right]\right|^{2}\leq \left|\operatorname {E} \left[\left|Z{\overline {W}}\right|\right]\right|^{2}\leq \operatorname {E} \left[|Z|^{2}\right]\operatorname {E} \left[|W|^{2}\right]}$ .
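The chain of inequalities holds for sample moments as well, which gives a quick numerical check (a sketch assuming Python with NumPy; the two correlated variables are illustrative choices):

```python
import numpy as np

rng = np.random.default_rng(8)
n = 100_000
z = rng.standard_normal(n) + 1j * rng.standard_normal(n)
w = 0.3 * z + rng.standard_normal(n) + 1j * rng.standard_normal(n)

lhs = abs(np.mean(z * np.conj(w))) ** 2                   # |E[Z conj(W)]|^2
mid = np.mean(np.abs(z * np.conj(w))) ** 2                # (E[|Z conj(W)|])^2
rhs = np.mean(np.abs(z) ** 2) * np.mean(np.abs(w) ** 2)   # E[|Z|^2] E[|W|^2]
print(lhs <= mid <= rhs)  # True
```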

## Characteristic function

The characteristic function of a complex random variable is a function ${\displaystyle \mathbb {C} \to \mathbb {C} }$  defined by

${\displaystyle \varphi _{Z}(\omega )=\operatorname {E} \left[e^{i\Re {({\overline {\omega }}Z)}}\right]=\operatorname {E} \left[e^{i(\Re {(\omega )}\Re {(Z)}+\Im {(\omega )}\Im {(Z)})}\right].}$
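For a circularly symmetric standard complex Gaussian with unit variance, the characteristic function has the closed form ${\displaystyle \varphi _{Z}(\omega )=e^{-|\omega |^{2}/4}}$ , which the empirical characteristic function can be checked against (a sketch assuming Python with NumPy):

```python
import numpy as np

rng = np.random.default_rng(9)
n = 500_000
z = (rng.standard_normal(n) + 1j * rng.standard_normal(n)) / np.sqrt(2)

def ecf(omega):
    # empirical characteristic function E[exp(i Re(conj(omega) Z))]
    return np.mean(np.exp(1j * (np.conj(omega) * z).real))

# Compare against the closed form exp(-|omega|^2 / 4) for this distribution.
for omega in [1 + 0j, 1 + 1j, 2j]:
    print(abs(ecf(omega) - np.exp(-abs(omega) ** 2 / 4)))  # small sampling error
```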

## References

1. ^ Eriksson, Jan; Ollila, Esa; Koivunen, Visa (2009). "Statistics for complex random variables revisited".