
Sunday, 22 November 2015

Recent Readings and New Directions.

Since my last post I have been doing a fair bit of online research, and fortunately I have discovered the following papers, which mesh nicely with what I am trying to do with Conditional Restricted Boltzmann Machines to model time series:

Deep Learning Architecture for Univariate Time Series Forecasting
Temporal Autoencoding Restricted Boltzmann Machine
Temporal Autoencoding Improves Generative Models of Time Series
Deep Modelling Complex Couplings Within Financial Markets
Predicting Time Series of Railway Speed Restrictions with Time Dependent Machine Learning Techniques

The big takeaway from these readings is twofold: firstly, to explicitly model the autoregressive components via a denoising autoencoder; and secondly, not to model a univariate time series in isolation, but rather as part of a multivariate time series where the "other" series are either informative measures derived from the univariate series itself (informative indicators?) and/or related time series, e.g. in forex one could use concepts similar to fractional product inefficiency, or across all markets the concept of Intermarket analysis.
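To make that a bit more concrete, below is a minimal NumPy sketch of the idea: build a lagged, multivariate input matrix from a single price series plus two indicators derived from it, then train a small one-hidden-layer denoising autoencoder on the lagged windows. The toy series, indicator choices, network sizes and learning rates are purely illustrative placeholders of my own, not anything taken from the papers above.

```python
# Illustrative sketch only: multivariate, lagged inputs from a univariate series
# plus derived indicators, fed to a tied-weight denoising autoencoder in NumPy.
import numpy as np

rng = np.random.default_rng(0)

# Toy "price" series standing in for real market data.
prices = np.cumsum(rng.normal(size=500)) + 100.0

# Derived series ("informative indicators" taken from the univariate series itself).
returns = np.diff(prices, prepend=prices[0])
sma_10  = np.convolve(prices, np.ones(10) / 10, mode="same")

series = np.column_stack([prices, returns, sma_10])            # shape (T, 3)
series = (series - series.mean(axis=0)) / series.std(axis=0)   # standardise

# Stack n_lags past time steps into each input row (the autoregressive part).
n_lags = 5
X = np.column_stack([series[i:len(series) - n_lags + i] for i in range(n_lags)])

# One-hidden-layer denoising autoencoder: corrupt the input, reconstruct the clean version.
n_in, n_hid = X.shape[1], 20
W  = rng.normal(scale=0.1, size=(n_in, n_hid))
bh = np.zeros(n_hid)
bv = np.zeros(n_in)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

lr, noise = 0.05, 0.2
for epoch in range(50):
    corrupt = X * (rng.random(X.shape) > noise)   # masking noise on the inputs
    h       = sigmoid(corrupt @ W + bh)           # encode the corrupted window
    recon   = h @ W.T + bv                        # decode with tied weights
    err     = recon - X                           # error against the *clean* input
    delta   = err @ W * h * (1 - h)               # backprop through the encoder
    W  -= lr * (corrupt.T @ delta + err.T @ h) / len(X)
    bv -= lr * err.mean(axis=0)
    bh -= lr * delta.mean(axis=0)

print("final reconstruction MSE:",
      np.mean((sigmoid(X @ W + bh) @ W.T + bv - X) ** 2))
```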

For the near future I have therefore set myself the task of adapting my CRBM code to include the denoising autoencoder, and of investigating the multivariate time series approach.
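As a rough placeholder for how that adaptation might look (the parameter names and sizes below are my own illustrative choices, not my existing CRBM code), the idea I take from the temporal-autoencoding papers is to use pre-trained denoising autoencoder weights to initialise the conditional weights that turn the history window into the CRBM's dynamic visible and hidden biases, rather than starting them from random values. Note that B below has the same shape as the autoencoder's W in the sketch above, so it could be seeded from it directly.

```python
# Rough sketch of the CRBM conditioning structure; all names are illustrative.
import numpy as np

rng = np.random.default_rng(1)

n_vis, n_hid, n_hist = 3, 20, 5          # visibles per frame, hiddens, history frames
n_cond = n_vis * n_hist                  # flattened history vector length

# Static RBM weights, initialised randomly as usual.
W_rbm = rng.normal(scale=0.01, size=(n_vis, n_hid))

# Conditional weights from the history to visibles / hiddens. Instead of the
# random initialisation shown here, they would be seeded from the pre-trained
# denoising autoencoder (e.g. B from its W, since the shapes match).
A = rng.normal(scale=0.01, size=(n_cond, n_vis))   # history -> dynamic visible bias
B = rng.normal(scale=0.01, size=(n_cond, n_hid))   # history -> dynamic hidden bias

def dynamic_biases(history):
    """Per-sample visible and hidden bias offsets from a flattened history window."""
    return history @ A, history @ B

history = rng.normal(size=(1, n_cond))
bv_dyn, bh_dyn = dynamic_biases(history)
print(bv_dyn.shape, bh_dyn.shape)        # (1, 3) (1, 20)
```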

1 comment:

  1. Hi,

Great that you are continuing in this direction. I found another paper from Haeusler about temporal RBMs, and also some code:

    http://www.diss.fu-berlin.de/diss/servlets/MCRFileNodeServlet/FUDISS_derivate_000000015130/diss_haeusler.pdf

    https://github.com/chausler/deep

So far my experience with 'neuron based' nets like NN backprop, SDAE, RBM or ELM is that they underperform compared to e.g. SVM, at least for 1 min data. The reason for this is most likely an unclear net configuration (how many neurons in each layer?), which causes unstable results and overfitting. I found some code for ELM that prunes the net of excess neurons during training using a GA, but I haven't tried it yet.

    Good luck with it, Krzysztof
