Cross-correlation


In statistics, the term cross-correlation is sometimes used to refer to the covariance cov(X, Y) between two random vectors X and Y, in order to distinguish that concept from the "covariance" of a random vector X, which is understood to be the matrix of covariances between the scalar components of X.

In signal processing, the cross-correlation (or sometimes "cross-covariance") is a measure of similarity of two signals, commonly used to find features in an unknown signal by comparing it to a known one. It is a function of the relative time between the signals, is sometimes called the sliding dot product, and has applications in pattern recognition and cryptanalysis.

For discrete functions f[n] and g[n] the cross-correlation is defined as

(f \star g)[n] \stackrel{\mathrm{def}}{=} \sum_j f^*[j]g[n+j]

where the sum is over the appropriate values of the integer j and a superscript asterisk indicates the complex conjugate. For continuous functions f(x) and g(x) the cross-correlation is defined as

(f \star g)(x) \stackrel{\mathrm{def}}{=} \int f^*(t) g(x+t)\,dt

where the integral is over the appropriate values of t.
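As an illustration, here is a minimal sketch (ours, not part of the article) that evaluates the discrete definition directly at every lag where two finite sequences overlap. The function name cross_correlation is our own; note that NumPy's built-in correlate conjugates its second argument, which is why the arguments are swapped in the comparison.

    import numpy as np

    def cross_correlation(f, g):
        """Evaluate (f ⋆ g)[n] = sum_j conj(f[j]) g[n+j] for every lag n
        at which the finite sequences overlap: n = -(len(f)-1), ..., len(g)-1."""
        f, g = np.asarray(f), np.asarray(g)
        return np.array([
            sum(np.conj(f[j]) * g[n + j]
                for j in range(len(f)) if 0 <= n + j < len(g))
            for n in range(-(len(f) - 1), len(g))
        ])

    f = np.array([1 + 1j, 2, 3])
    g = np.array([4, 5 - 2j, 6, 7])
    # NumPy's correlate conjugates its *second* argument, so f goes second:
    assert np.allclose(cross_correlation(f, g), np.correlate(g, f, mode="full"))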

The cross-correlation is similar in nature to the convolution of two functions. Whereas convolution involves reversing a signal, then shifting it and multiplying by another signal, correlation only involves shifting it and multiplying (no reversing).

In an autocorrelation, which is the cross-correlation of a signal with itself, there will always be a peak at a lag of zero. Both facts are illustrated in the sketch below.
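The following sketch (ours) checks both statements numerically in NumPy conventions: correlating equals convolving with the first signal conjugated and time-reversed, and the autocorrelation of a real signal peaks at zero lag.

    import numpy as np

    f = np.array([1.0, 2.0, 3.0, 2.0])
    g = np.array([0.0, 1.0, 0.5, 0.25])

    # f ⋆ g equals the convolution of conj(f(-t)) with g:
    corr = np.correlate(g, f, mode="full")                # f ⋆ g at all lags
    conv = np.convolve(np.conj(f)[::-1], g, mode="full")  # reversed, conjugated f
    assert np.allclose(corr, conv)

    # The autocorrelation of a signal peaks at zero lag
    # (index len(f) - 1 in the "full" output):
    acorr = np.correlate(f, f, mode="full")
    assert np.argmax(acorr) == len(f) - 1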

If X and Y are two independent random variables with probability distributions f and g, respectively, then the probability distribution of the difference Y − X is given by the cross-correlation f \star g. In contrast, the convolution f * g gives the probability distribution of the sum X + Y.
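For concreteness, a small numerical check (our sketch): take two independent discrete random variables, each supported on {0, 1, 2}, and compare the two operations.

    import numpy as np

    f = np.array([0.2, 0.5, 0.3])   # pmf of X on {0, 1, 2}
    g = np.array([0.6, 0.3, 0.1])   # pmf of Y on {0, 1, 2}

    sum_pmf  = np.convolve(f, g)                # pmf of X + Y on {0, ..., 4}
    diff_pmf = np.correlate(g, f, mode="full")  # pmf of Y - X on {-2, ..., 2}

    assert np.isclose(sum_pmf.sum(), 1.0) and np.isclose(diff_pmf.sum(), 1.0)
    # P(Y - X = -2) = P(X = 2) P(Y = 0):
    assert np.isclose(diff_pmf[0], f[2] * g[0])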


Explanation

For example, consider two real-valued functions f and g that differ only by a shift along the x-axis. One can calculate the cross-correlation to figure out how much g must be shifted along the x-axis to make it identical to f. The formula essentially slides the g function along the x-axis, calculating the integral of the product of the two functions for each possible amount of sliding. When the functions match, the value of (f\star g) is maximized. This is because when lumps (positive areas) are aligned, they contribute to making the integral larger. Likewise, when troughs (negative areas) align, they also make a positive contribution to the integral, because the product of two negative numbers is positive.
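A minimal sketch of this idea (ours): g is a copy of f delayed by 3 samples, and the lag at which the cross-correlation peaks recovers that shift.

    import numpy as np

    f = np.array([0.0, 1.0, 3.0, 1.0, 0.0, 0.0, 0.0, 0.0])
    g = np.roll(f, 3)                             # g[x] = f[x - 3]

    corr = np.correlate(g, f, mode="full")        # f ⋆ g at every lag
    lags = np.arange(-(len(f) - 1), len(g))
    print(lags[np.argmax(corr)])                  # prints 3, the shift of g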

With complex valued functions f and g, taking the conjugate of f ensures that aligned lumps (or aligned troughs) with imaginary components will contribute positively to the integral.

In econometrics, lagged cross-correlation is sometimes referred to as cross-autocorrelation (Campbell, Lo, and MacKinlay 1996).

Properties

  • The cross-correlation is related to the convolution by:
f\star g = (t \mapsto f^*(-t))*g

so that if f is Hermitian, i.e. f^*(-t) = f(t) (in particular, if f is real-valued and even),

f\star g = f*g

Also: (f\star g)\star(f\star g)=(f\star f)\star (g\star g)

\mathcal{F}[f\star g]=(\mathcal{F}[f])^* \cdot (\mathcal{F}[g])

where \mathcal{F} denotes the Fourier transform, and an asterisk again indicates the complex conjugate. Coupled with fast Fourier transform algorithms, this property is often exploited for the efficient numerical computation of cross-correlations; a numerical sketch follows this list of properties.

  • The cross-correlation of a function g with a convolution of f and h is the convolution of the cross-correlation of g and f with the kernel h:
g \star (f * h) = (g \star f) * h
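As noted above, the Fourier-transform identity gives a fast algorithm. Here is a minimal sketch (ours) for the circular cross-correlation of two equal-length sequences:

    import numpy as np

    def circular_cross_correlation(f, g):
        """ifft(conj(fft(f)) * fft(g)): the circular cross-correlation,
        computed in O(N log N) rather than O(N^2)."""
        return np.fft.ifft(np.conj(np.fft.fft(f)) * np.fft.fft(g))

    f = np.array([1, 2, 3, 4], dtype=complex)
    g = np.array([4, 3, 2, 1], dtype=complex)

    # Direct evaluation of (f ⋆ g)[n] = sum_j conj(f[j]) g[(n + j) mod N]:
    N = len(f)
    direct = np.array([sum(np.conj(f[j]) * g[(n + j) % N] for j in range(N))
                       for n in range(N)])
    assert np.allclose(circular_cross_correlation(f, g), direct)

To obtain the ordinary (non-circular) cross-correlation of finite sequences this way, both inputs are first zero-padded to at least len(f) + len(g) - 1 points.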

Normalized cross-correlation

For image-processing applications in which the brightness of the image and template can vary due to lighting and exposure conditions, the images can first be normalized. This is typically done at every step by subtracting the mean and dividing by the standard deviation. That is, the cross-correlation of a template t(x,y) with a subimage f(x,y) is

\frac{1}{n}\sum_{x,y}\frac{(f(x,y) - \overline{f})(t(x,y) - \overline{t})}{\sigma_f \sigma_t}

where n is the number of pixels in t(x,y), \overline{f} and \overline{t} are the means, and \sigma_f and \sigma_t are the standard deviations of f and t.

In functional analysis terms, this can be thought of as the dot product of two normalized vectors. That is, if

F(x,y) = f(x,y) - \overline{f}

and

T(x,y) = t(x,y) - \overline{t}

then the above expression is equal to

\left\langle\frac{F}{\|F\|},\frac{T}{\|T\|}\right\rangle

where \langle\cdot,\cdot\rangle is the inner product and \|\cdot\| is the L2 norm.
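The following sketch (ours) computes this quantity for a template and a same-sized subimage; by the inner-product form, the result is unchanged by brightness offsets and positive contrast scalings of either image.

    import numpy as np

    def normalized_cross_correlation(f, t):
        """<F/||F||, T/||T||> for the mean-subtracted images F and T;
        the result always lies in [-1, 1]."""
        F = f - f.mean()
        T = t - t.mean()
        return float((F * T).sum() / (np.linalg.norm(F) * np.linalg.norm(T)))

    t = np.array([[0.0, 1.0],
                  [1.0, 0.0]])
    # A brighter, higher-contrast copy of the template still gives 1.0:
    assert np.isclose(normalized_cross_correlation(3.0 * t + 7.0, t), 1.0)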
