When $x$ is a zero-mean real random variable, then (approximately, for large $N$)
$$\sum_{n=1}^N x_n x_n^T \approx N \sigma^2_x\,\text,$$
where $\sigma^2_x$ is the variance of $x$.
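As a quick numerical illustration (a minimal sketch of my own; the sample size and standard deviation below are arbitrary), this can be checked in Matlab:

% Check that the sum of squares of zero-mean real samples is about N*sigma_x^2.
N = 1e6;
sigma_x = 2;                    % example standard deviation (assumed here)
x = sigma_x * randn(N,1);       % zero-mean real Gaussian samples
sum(x.^2) / (N * sigma_x^2)     % ~1, i.e. sum(x.^2) is approximately N*sigma_x^2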
I'm considering complex normal distributions where the real and imaginary parts are uncorrelated. The Wikipedia article https://en.wikipedia.org/wiki/Complex_normal_distribution gives the form of the density for the complex normal distribution.
I'm confused because the exponent of the real Gaussian density has a factor of 2 in the denominator, but that factor is absent in the complex case.
Does this mean that, if $x$ is a complex-valued random variable, the variance becomes half, i.e., $$\sum_{n=1}^N x_n x_n^H = N \sigma^2_x/2\,\text,$$ where the variance is $\sigma^2_x/2$ because the variance gets split equally between the real and imaginary components?
I have this doubt because, when implementing, if I need to generate complex noise of variance 1, I would do (in Matlab)
noise = sqrt(1/2) * (randn(N,1) + 1j*randn(N,1))
since each component (real and imaginary) needs to have variance 1/2, so that their sum is 1.
So, mathematically, the variance $\sigma^2$ is halved. Is my understanding correct?
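For reference, here is a quick sanity check of those variances (my own sketch; N and the sample size are arbitrary):

% Estimate the component and total variances of the generated complex noise.
N = 1e6;
noise = sqrt(1/2) * (randn(N,1) + 1j*randn(N,1));
mean(abs(noise).^2)    % ~1   : total variance E[|x|^2]
var(real(noise))       % ~0.5 : variance of the real part
var(imag(noise))       % ~0.5 : variance of the imaginary part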
UPDATE, based on valuable information provided in the comments: I am considering the circularly symmetric complex normal distribution, where the real and imaginary parts are completely uncorrelated.
Answer
I will focus on the reason for the factor $1/2$ and leave aside the estimation aspects.
The exact understanding should be: if a scalar Gaussian random variable (rv) is circularly symmetric, its real and imaginary parts must be uncorrelated (hence independent, since they are assumed jointly Gaussian) and identically distributed with zero mean. Thus, your Matlab code is correct for an rv $\sim CN(0,1)$.
The story behind this is that a complex random variable (rv) is simply a vector of two real random variables; a vector of $n$ complex rvs is really a vector of $2n$ real rvs.
You are talking about the case $n=1$: one complex Gaussian rv $Z = Z_r + jZ_i$, or equivalently a vector of two real Gaussian rvs $[Z_r,Z_i]^T$. Just as a real Gaussian rv is described by its variance, the real vector $[Z_r,Z_i]^T$ must be described by its covariance matrix $$E\left[[Z_r,Z_i]^T [Z_r,Z_i]\right] = E\begin{bmatrix} Z_r^2 & Z_rZ_i \\ Z_iZ_r & Z_i^2 \end{bmatrix}\,\text.$$
Now look at the variance $E[ZZ^H]=E[Z_r^2+Z_i^2]$ and the pseudo-variance $E[ZZ^T]=E[Z_r^2-Z_i^2+j2Z_rZ_i]$: the covariance matrix of the real vector (or complex scalar) is fully determined by the variance and pseudo-variance together. Thus, you need both the variance and the pseudo-variance to characterize a complex Gaussian rv (with the prior condition that its real and imaginary parts are jointly Gaussian).
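For concreteness, here is a small numerical sketch (my own illustration, with arbitrary example parameters, deliberately non-circular) that estimates the covariance matrix, the variance, and the pseudo-variance from samples:

% Estimate covariance matrix, variance and pseudo-variance from samples.
N  = 1e6;
Zr = randn(N,1);            % real part, variance 1 (example)
Zi = 0.5*randn(N,1);        % imaginary part, variance 0.25 (example, non-circular)
Z  = Zr + 1j*Zi;
cov([Zr, Zi])               % 2x2 covariance matrix of [Zr, Zi]
mean(abs(Z).^2)             % variance E[Z Z^H] = E[Zr^2] + E[Zi^2] ~ 1.25
mean(Z.^2)                  % pseudo-variance E[Z Z^T] = E[Zr^2] - E[Zi^2] + 2j*E[Zr*Zi] ~ 0.75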
Now we use the circular symmetry property: $e^{j\phi}Z$ has the same probability distribution as $Z$ for all real $\phi$. This leads to $E[e^{j\phi}Z(e^{j\phi}Z)^T] = E[e^{j2\phi}ZZ^T] = E[ZZ^T]$ for all $\phi$, hence $E[ZZ^T] = 0$, and the variance $E[ZZ^H]$ alone suffices to describe $Z$. Note that $E[ZZ^T] = E[Z_r^2-Z_i^2+j2Z_rZ_i] = 0$ implies $E[Z_r^2] = E[Z_i^2]$ and $E[Z_rZ_i]=0$: the real and imaginary parts are uncorrelated, hence independent (because they are jointly Gaussian), and have the same variance. This is the reason for the factor $1/2$.
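To see this numerically (again a sketch of my own, with arbitrary parameters), compare the pseudo-variance of circularly symmetric noise with that of a non-circular example:

% Pseudo-variance check: ~0 for circularly symmetric noise, nonzero otherwise.
N = 1e6;
Zc = sqrt(1/2) * (randn(N,1) + 1j*randn(N,1));   % circularly symmetric, CN(0,1)
mean(Zc.^2)                                      % pseudo-variance ~ 0
mean(abs(Zc).^2)                                 % variance ~ 1
Znc = randn(N,1) + 1j*0.1*randn(N,1);            % unequal component variances (non-circular)
mean(Znc.^2)                                     % pseudo-variance ~ 0.99, clearly nonzero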
To sum up, your code is correct because you are generating samples of a circularly symmetric complex Gaussian rv. The assumption that the real and imaginary parts are jointly Gaussian must be used. If the rv is not circularly symmetric (or, equivalently, for a general real random vector with two elements), you must also compute the pseudo-variance.
For more details, and to understand the formula in the Wikipedia article, you can read R. Gallager's note "Circularly Symmetric Gaussian Random Vectors".