Tuesday 23 September 2014

High Resolution Tools for Spectral Analysis - Update

Following on from my initial enthusiasm for the code on the High Resolution Tools for Spectral Analysis page, I have to say that I have been unable to get the code performing as I would like for my intended application to price time series.

My original intent was to use the zero crossing period estimation function, the subject of my last few posts, to get a rough idea of the dominant cycle period and then use the most recent data in a rolling window of this length as input to the high resolution code. This approach, however, ran into problems.
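
For concreteness, a minimal Octave sketch of this intended workflow is below. The zero crossing estimator is reduced here to a crude sign-change count on detrended data, and the commented-out me() call is only a placeholder: I am not reproducing the toolkit's actual function signature.

    N = 500;
    t = (1:N)';
    price = cumsum(0.1 * randn(N, 1)) + sin(2 * pi * t / 20); % synthetic "price" series

    detrended = detrend(price);                  % remove linear trend before counting crossings
    crossings = sum(diff(sign(detrended)) ~= 0); % count the zero crossings
    period = round(2 * N / max(crossings, 1));   % rough dominant cycle period (2 crossings per cycle)

    window = price(end - period + 1 : end);      % rolling window of one dominant cycle length
    % spectrum = me(window - mean(window));      % placeholder call to the toolkit's me.m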

Firstly, windows of just the dominant cycle length (only about 10 to 30 data points) would cause all sorts of errors to be thrown from the toolkit functions as well as from core Octave functions, such as divide by zero warnings and cryptic error messages that even now I don't understand. My best guess is that such short windows simply contain too little data for the algorithm to work, in much the same way that a radix-2 Fast Fourier Transform will fail if its input length is not a power of two. It might be possible to substantially rewrite the relevant functions, but the understanding of the algorithm and of Octave's inner workings that this would require puts it well beyond my pay grade.
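
As an illustration of how one might at least contain these failures in a rolling loop, here is a hedged sketch of my own devising (not anything from the toolkit): skip windows below an arbitrary minimum length and trap the errors rather than letting them halt the run. Both min_len and the me() call are assumptions.

    min_len = 32;                             % arbitrary threshold; 10 to 30 point windows failed for me
    if numel(window) < min_len
      spectrum = NaN;                         % too little data for the algorithm to work
    else
      try
        spectrum = me(window - mean(window)); % placeholder toolkit call
      catch err
        warning('me.m failed on this window: %s', err.message);
        spectrum = NaN;
      end
    end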

My second approach was simply to extend the amount of data available by applying the Octave repmat function to the windowed data, which made all the above errors disappear. This gave very hit and miss results: sometimes the estimates were very accurate, other times just fair to middling, and on occasion way off target. I suspect the problem here is the introduction of signal artifacts via the use of the repmat function, resulting in aliasing of the underlying signal.
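
The sketch below shows the repmat padding I mean, along with the seam discontinuity that I suspect is the source of the artifacts; the me() call is again only a placeholder for the toolkit.

    n_reps = 8;                                 % enough copies to silence the short-window errors
    extended = repmat(window(:), n_reps, 1);    % tile the window end to end
    seam_jump = abs(window(end) - window(1));   % discontinuity introduced at every seam
    % spectrum = me(extended - mean(extended)); % placeholder toolkit call

Unless the window happens to contain a whole number of cycles, each seam introduces a jump, and the tiled series is no longer a faithful extension of the original data.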

As a result, I shall not continue investigating this toolbox for now. Although I only investigated the use of the me.m function (there are other functions available in the toolbox), I feel that my time at the moment can be used more productively.

4 comments:

jim said...

Any particular reason why you didn't just try a higher sampling rate to increase the # of observations in a 30 day period?

Dekalog said...

The sampling rate is completely dependent on the data provider. If one is using end-of-day data only, a lookback period of 30 days will have only 30 data points available. Of course, if one has access to, for example, 5 minute bar data or tick data, then the sampling rate could be increased.

Gumby said...

The high resolution tools you linked to are for zero mean, stationary processes. Did you get it to work for a ranging market? It seems to me this would not be applicable to a trending market unless you preprocessed the data, e.g. by detrending it.

Dekalog said...

Gumby,

I didn't get it to work to my satisfaction in any type of market. I've decided to work on other, different things for now. I may come back to this again in the future.