
difference between xcorr and crosscorr?


Iris Ehlert
Feb 4, 2012, 8:22:17 AM

Dear members of mathworks,

I did a cross-correlation of two time series with both xcorr.m and crosscorr.m. The results are completely different, which definitely shouldn't be the case, since the documentation for the two functions describes the same computation.

So I would really, really appreciate it if you could tell me why this happens.

I used the same lags and the same series, so it is really the functions themselves causing the different results.

Cheers,
ie.

ImageAnalyst
Feb 4, 2012, 9:17:17 AM

I don't see crosscorr() - is that in some particular toolbox? Where
did you get it?

Steven_Lord
Feb 7, 2012, 9:42:34 AM

"ImageAnalyst" <imagea...@mailinator.com> wrote in message
news:e40817cd-e78d-4206...@h3g2000yqe.googlegroups.com...
> I don't see crosscorr() - is that in some particular toolbox? Where
> did you get it?

It's in Econometrics Toolbox.

http://www.mathworks.com/help/toolbox/econ/crosscorr.html

while XCORR is in Signal Processing Toolbox.

http://www.mathworks.com/help/toolbox/signal/ref/xcorr.html

One difference I see between them at first glance is what they return and what options they allow. XCORR (Signal Processing Toolbox) returns just the correlations and lags, but lets the user specify a normalization method; CROSSCORR (Econometrics Toolbox) returns correlations, lags, and confidence bounds, but does not let the user specify a normalization method.
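For illustration, the two call patterns look roughly like this (a minimal sketch, not from the original post; the data and the maximum lag of 20 are made up):

% Signal Processing Toolbox: correlations and lags, user-selectable scaling
% ('none', 'biased', 'unbiased', or 'coeff').
x = randn(100,1);  y = randn(100,1);
[c, lags] = xcorr(x, y, 20, 'coeff');

% Econometrics Toolbox: correlations, lags, and confidence bounds, no
% scaling option.
[xcf, lags2, bounds] = crosscorr(x, y, 20);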

--
Steve Lord
sl...@mathworks.com
To contact Technical Support use the Contact Us link on
http://www.mathworks.com

Aino
Oct 17, 2012, 12:07:16 PM

"Steven_Lord" <sl...@mathworks.com> wrote in message <jgrd8p$q21$1...@newscl01ah.mathworks.com>...
Hi all!

I have the same problem. Apparently, if you subtract the mean from the input vectors, both functions give the same result. Also, if you don't normalize the results (i.e., if you comment out line 190 in crosscorr), there will still be a difference between the results.

So which one is the correct one?

-Aino

Rick
Oct 18, 2012, 11:55:12 AM

Iris,

The distinction between the 2 functions is likely due to the intended audience.

The CROSSCORR function in the Econometrics Toolbox is designed for financial/economic users, for whom the standard approach is to remove the mean from each series. For example, see Box and Jenkins. (In fact, the usual way to compute the sample variance of any series is to first subtract the mean.)

In economic applications, the usual approach is also to normalize by the product of the sample standard deviations of each series x(t) and y(t).

For a sample auto-correlation function (ACF), as computed by the Econometrics Toolbox function AUTOCORR, this has the added effect of producing an ACF whose value at lag zero is unity, since x(t) = y(t).

However, for CROSSCORR, the scaling does not ensure the XCF at lag zero is one.
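As a concrete illustration of that scaling (a sketch, not part of the original post; the data are made up), the lag-zero value CROSSCORR produces can be written out by hand, and it is just the ordinary Pearson correlation between the two series rather than one:

x = randn(100,1);  y = randn(100,1);
T  = numel(x);
c0 = sum((x - mean(x)) .* (y - mean(y))) / T;   % demeaned cross-covariance at lag 0
r0 = c0 / (std(x,1) * std(y,1));                % divide by the (biased) sample std devs
[xcf, lags] = crosscorr(x, y);                  % Econometrics Toolbox
% r0, xcf(lags == 0), and corr(x,y) should all agree to numerical precision.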

Also, as mentioned in another response, for two series x and y, XCORR and CROSSCORR will compute the same XCF over N lags (to within numerical precision) for calls like this:

c = xcorr(x - mean(x), y - mean(y), N, 'coeff');
C = crosscorr(y, x, N);
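A self-contained way to check that claim (a sketch, not part of the original post; the series, names, and N here are made up) is to run both calls on the same random data and look at the largest difference:

x = randn(500,1);  y = randn(500,1);  N = 20;
c = xcorr(x - mean(x), y - mean(y), N, 'coeff');   % Signal Processing Toolbox
C = crosscorr(y, x, N);                            % Econometrics Toolbox
fprintf('largest difference: %g\n', max(abs(c(:) - C(:))));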

As far as which is correct, they're both correct ... it just depends on what you want.

HTH,
-Rick

"Iris Ehlert" wrote in message <jgjbe9$biq$1...@newscl01ah.mathworks.com>...

Ye Gao
Nov 27, 2012, 10:45:19 PM

"Iris Ehlert" wrote in message <jgjbe9$biq$1...@newscl01ah.mathworks.com>...
Other than the superficial difference that "crosscorr" is in the Econometrics Toolbox and "xcorr" is in the Signal Processing Toolbox, I find that there are two important distinctions between "crosscorr" and "xcorr":
1. Means are removed from the input series by "crosscorr", but not by "xcorr".
2. For "crosscorr", the 1st input leads the 2nd input, while for "xcorr" the 1st input lags the 2nd input (one way to check this is shown below).

Cheers,
Ye Gao

Greg Heath
Nov 29, 2012, 1:43:15 PM

"Iris Ehlert" wrote in message <jgjbe9$biq$1...@newscl01ah.mathworks.com>...
If you standardize the inputs using ZSCORE or MAPSTD, you should get the same answer. I think that answer is consistent with an autocorrelation with unity amplitude at zero lag if the input functions are equal.
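One hedged footnote on that (not from the original post; the data are made up): with XCORR's 'coeff' option the overall scale divides out, so standardizing with ZSCORE gives the same curve as simply removing the means:

x = randn(300,1);  y = randn(300,1);  N = 15;
c1 = xcorr(x - mean(x), y - mean(y), N, 'coeff');
c2 = xcorr(zscore(x), zscore(y), N, 'coeff');   % ZSCORE is in Statistics Toolbox
% max(abs(c1 - c2)) should be on the order of eps.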

Hope this helps.

Greg

Subhra Dey
Jul 4, 2016, 10:38:11 AM

"Ye Gao" <gaosa...@ucla.edu> wrote in message <k941cf$kin$1...@newscl01ah.mathworks.com>...
Dear Gao,

Are you sure about your point 2?

Cheers,
Subhra

E. Natasha Stavros
Jul 11, 2017, 1:46:17 PM

Rick,

This was EXTREMELY useful! I have a question, though: you said that which one you select depends on what you are trying to accomplish. Can you provide an example of when you would want to use one instead of the other? Specifically, when would there be a case for subtracting out the mean before cross-correlating?

I have a time series over many years that has seasonal periodicity. As such, I removed the seasonal periodicity using additive adjustment. Now I have one time series that is the anomalies (i.e., seasonality removed) and one that is the moving average of counts. I'd like to do two things:
1. find the lead of the anomaly with the highest correlation to a peak in counts
2. find the persistence of the variable used in the anomaly, i.e., how long before the value of that variable on day 1 no longer affects later values. This is autocorrelation, and using crosscorr I could look to see when the correlation falls below the bounds; however, I'm not sure whether to use the adjusted time series or the raw time series, and I'm not sure whether I want the mean subtracted out.

Thank you for any guidance!!!


"Rick" wrote in message <k5p8p0$it9$1...@newscl01ah.mathworks.com>...

Rick
Jul 12, 2017, 8:15:17 AM

Natasha,

I honestly do not know how to solve your problem, but someone else on this thread may, so give this a little time to settle.

That said, I do have a few comments/observations.

The area I support is computational finance & economics. We deal almost entirely with stochastic processes, and so are ultimately concerned with the degree of "similarity" between one ("autocorr") or two ("crosscorr") time series. As such, we are usually not interested in any measure of "magnitude" or "scale".

In other disciplines, such as signal processing, people are often interested in measures of scale. For example, signal processing folks are often concerned with the energy content or power of a particular signal, and so removing the mean is not the preferred approach. In this sense, you can view auto- and cross-correlation as techniques similar in spirit to convolution, likely followed by subsequent frequency-domain spectral analysis.

The following wikipedia page actually has a decent discussion of these disciplines:

https://en.wikipedia.org/wiki/Cross-correlation

As for your specific problem, I suspect that you're on the right track.

Specifically, item (1) sounds as if your goal is purely to determine the degree of similarity, and so "crosscorr" (or an approach that subtracts the means) is all you need. That said, since both functions will give you a sense of similarity at various shifts, mean removal may not matter.

For item (2), mean removal may not matter either since it sounds as if you're looking for a "decay time", or some such metric. If this really is the case, then you could compute an autocorrelation using the "autocorr" function in the Econometrics Toolbox, or "xcorr" in Signal Processing to effect an auto-correlation as a cross-correlation of a signal with itself.
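For what it's worth, the two routes to a sample ACF look roughly like this (a sketch, not part of the original post; the data and the 20 lags are made up):

x = randn(500,1);
[acf, aclags] = autocorr(x, 20);                 % Econometrics Toolbox (mean removed, lag 0 = 1)
[c, clags] = xcorr(x - mean(x), 20, 'coeff');    % Signal Processing Toolbox equivalent
% acf should match c(clags >= 0) to numerical precision.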

Again, I'd wait to see if someone in the signal processing area has any additional insights, as there may be other techniques or functionality better suited to your problem of which I'm unaware.

Best,
-Rick

"E. Natasha Stavros" <natasha...@jpl.nasa.gov> wrote in message <ok32su$rbm$1...@newscl01ah.mathworks.com>...

Greg Heath
Aug 9, 2017, 9:57:18 AM

"Steven_Lord" <sl...@mathworks.com> wrote in message <jgrd8p$q21$1...@newscl01ah.mathworks.com>...

>> help crosscorr
crosscorr not found.

>> help xcorr
--- help for corr ---

corr Linear or rank correlation.
... blah, blah,blah

Reference page for corr (GEH: This yields doc corr)
Other functions named corr

>> doc xcorr
GEH: This just yields a search list of 11 documentation references and examples

>> help nncorr
Cross correlation between neural network time series

>> doc nncorr
Cross correlation between neural network time series

NNCORR has 4 options

The optional FLAG determines how nncorr normalizes correlations.
'biased' - scales the raw cross-correlation by 1/N.
'unbiased' - scales the raw correlation by 1/(N-abs(k)), where k
is the index into the result.
'coeff' - normalizes the sequence so that the correlations at
zero lag are identically 1.0.
'none' - no scaling (this is the default).

Whew, I'm getting dizzy!

Greg

PS: Don't forget the inverse fft of the power spectrum!
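That inverse-FFT route (the Wiener-Khinchin relation) looks roughly like this (a sketch, not part of the original post; the data and padding are made up):

x  = randn(256,1);
xd = x - mean(x);
nfft  = 2 * numel(xd);                           % zero-pad so the circular result equals the linear one
r_fft = real(ifft(abs(fft(xd, nfft)).^2));       % raw autocorrelation at lags 0, 1, 2, ...
r_dir = xcorr(xd);                               % xcorr returns lags -(N-1) .. (N-1)
% r_fft(1:numel(xd)) should match r_dir(numel(xd):end) to numerical precision.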