Monday, 30 December 2024

A "New" Way to Smooth Price

Below is the code for a C++ .oct function, compiled for use in Octave, to smooth price data.

#include "octave/oct.h"
#include "octave/dColVector.h"
#include "octave/dMatrix.h"
#include "GenFact.h"
#include "GramPoly.h"
#include "Weight.h"

DEFUN_DLD ( double_smooth_proj_2_5, args, nargout,
"-*- texinfo -*-\n\
@deftypefn {Function File} {} double_smooth_proj_2_5 (@var{input_vector})\n\
This function takes an input series and smooths it by projecting a 5 bar rolling linear fit\n\
3 bars into the future and using these 3 bars plus the last 3 bars of the rolling input\n\
to fit a FIR filter with a 2.5 bar lag from the last projected point, i.e. a 0.5 bar\n\
lead over the last actual rolling point in the series. This is averaged with the previously\n\
calculated such smoothed point for a theoretical zero-lagged smooth. This smooth is\n\
again smoothed as above for a smooth of the smooth, i.e. a double-smooth of the\n\
original input series. The double_smooth and smooth are the function outputs.\n\
@end deftypefn" )

{
octave_value_list retval_list ;
int nargin = args.length () ;

// check the input arguments
if ( nargin != 1 ) // there must be a single, input price vector
   {
   error ("Invalid arguments. Inputs are a single, input price vector.") ;
   return retval_list ;
   }

if ( args(0).length () < 5 )
   {
   error ("Invalid 1st argument length. Input is a price vector of length >= 5.") ;
   return retval_list ;
   }
// end of input checking

ColumnVector input = args(0).column_vector_value () ;
ColumnVector smooth = args(0).column_vector_value () ;
ColumnVector double_smooth = args(0).column_vector_value () ;

// create the fit coefficients matrix
int p = 5 ;             // the number of points in calculations
int m = ( p - 1 ) / 2 ; // the half-width of the filter window
int n = 1 ;             // the polynomial order of the fit (linear)
int s = 0 ;             // the derivative order (0 = the fit itself, 1 = the slope)

  // create matrix for fit coefficients
  Matrix fit_coefficients_matrix ( 2 * m + 1 , 2 * m + 1 ) ;
  // and assign values in loop using the Weight.h recursive Gram Polynomial C++ headers
  for ( int tt = -m ; tt < (m+1) ; tt ++ )
  {
        for ( int ii = -m ; ii < (m+1) ; ii ++ )
        {
        fit_coefficients_matrix ( ii + m , tt + m ) = Weight( ii , tt , m , n , s ) ;
        }
  }

  // create matrix for slope coefficients
  Matrix slope_coefficients_matrix ( 2 * m + 1 , 2 * m + 1 ) ;
  s = 1 ;
  // and assign values in loop using the Weight.h recursive Gram Polynomial C++ headers
  for ( int tt = -m ; tt < (m+1) ; tt ++ )
  {
        for ( int ii = -m ; ii < (m+1) ; ii ++ )
        {
        slope_coefficients_matrix ( ii + m , tt + m ) = Weight( ii , tt , m , n , s ) ;
        }
  }

  Matrix smooth_coefficients ( 1 , 5 ) ;
  // fill the smooth_coefficients matrix, smooth_coeffs = ( 9/24 ) .* fit_coeffs + ( 7/12 ) .* slope_coeffs + [ 0 ; 1/24 ; 1/8 ; 5/24 ; 1/4 ] ;
  smooth_coefficients( 0 , 0 ) = ( 9.0 / 24.0 ) * fit_coefficients_matrix( 0 , 4 ) + ( 7.0 / 12.0 ) * slope_coefficients_matrix( 0 , 4 ) ;
  smooth_coefficients( 0 , 1 ) = ( 9.0 / 24.0 ) * fit_coefficients_matrix( 1 , 4 ) + ( 7.0 / 12.0 ) * slope_coefficients_matrix( 1 , 4 ) + ( 1.0 / 24.0 ) ;
  smooth_coefficients( 0 , 2 ) = ( 9.0 / 24.0 ) * fit_coefficients_matrix( 2 , 4 ) + ( 7.0 / 12.0 ) * slope_coefficients_matrix( 2 , 4 ) + ( 1.0 / 8.0 ) ;
  smooth_coefficients( 0 , 3 ) = ( 9.0 / 24.0 ) * fit_coefficients_matrix( 3 , 4 ) + ( 7.0 / 12.0 ) * slope_coefficients_matrix( 3 , 4 ) + ( 5.0 / 24.0 ) ;
  smooth_coefficients( 0 , 4 ) = ( 9.0 / 24.0 ) * fit_coefficients_matrix( 4 , 4 ) + ( 7.0 / 12.0 ) * slope_coefficients_matrix( 4 , 4 ) + ( 1.0 / 4.0 ) ;

    for ( octave_idx_type ii (4) ; ii < args(0).length () ; ii++ )
        {

         smooth( ii ) = smooth_coefficients( 0 , 0 ) * input( ii - 4 ) + smooth_coefficients( 0 , 1 ) * input( ii - 3 ) + smooth_coefficients( 0 , 2 ) * input( ii - 2 ) +
                        smooth_coefficients( 0 , 3 ) * input( ii - 1 ) + smooth_coefficients( 0 , 4 ) * input( ii ) ;

         double_smooth( ii ) = smooth_coefficients( 0 , 0 ) * smooth( ii - 4 ) + smooth_coefficients( 0 , 1 ) * smooth( ii - 3 ) + smooth_coefficients( 0 , 2 ) * smooth( ii - 2 ) +
                               smooth_coefficients( 0 , 3 ) * smooth( ii - 1 ) + smooth_coefficients( 0 , 4 ) * smooth( ii ) ;

        }

retval_list( 0 ) = double_smooth ;
retval_list( 1 ) = smooth ;

return retval_list ;

} // end of function
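
To try this out, the function can be compiled and called from the Octave prompt along the following lines. This is only a minimal sketch: it assumes the source above is saved as double_smooth_proj_2_5.cc and that the GenFact.h, GramPoly.h and Weight.h headers from my earlier Savitzky-Golay posts are in the same directory.

mkoctfile double_smooth_proj_2_5.cc   ## compiles to double_smooth_proj_2_5.oct

price = sin( 2 .* pi .* ( 1 : 200 )' ./ 20 ) + 0.1 .* randn( 200 , 1 ) ; ## a noisy sinewave "price"
[ double_smooth , smooth ] = double_smooth_proj_2_5( price ) ;
plot( price , 'c' , smooth , 'g' , double_smooth , 'r' ) ;
legend( 'price' , 'single smooth' , 'double smooth' ) ;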

Rather than describe it, I'll just paste the "help" description below:-

"This function takes an input series and smooths it by projecting a 5 bar rolling linear fit 3 bars into the future and using these 3 bars plus the last 3 bars of the rolling input to fit a FIR filter with a 2.5 bar lag from the last projected point, i.e.  a 0.5 bar lead over the last actual rolling point in the series.  This is averaged with the previously calculated such smoothed point for a theoretical zero-lagged smooth.  This smooth is again smoothed as above for a smooth of the smooth, i.e.  a double-smooth of the original input series.  The double_smooth and smooth are the function outputs."

The above function calls previous code of mine to calculate the Savitzky-Golay filter convolution coefficients it uses internally, but for the benefit of readers I will simply post the final coefficients of the resulting 5 tap FIR filter.

-0.191667  -0.016667   0.200000   0.416667   0.591667

These coefficients are for a "single" smooth. Run the output of a filter using these coefficients through the same filter again to get the "double" smooth.
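
As an illustration, the posted coefficients can be applied directly in Octave with the filter function. This is a minimal sketch only: the warm-up bars will differ slightly from the .oct function above, which simply copies the first few bars of its input.

price = sin( 2 .* pi .* ( 1 : 200 )' ./ 20 ) + 0.1 .* randn( 200 , 1 ) ; ## an example "price" series

coeffs = [ -0.191667 , -0.016667 , 0.200000 , 0.416667 , 0.591667 ] ; ## oldest bar first, as listed above
b = fliplr( coeffs ) ;                                                ## filter() wants the current bar weight first

smooth = filter( b , 1 , price ) ;         ## the "single" smooth
double_smooth = filter( b , 1 , smooth ) ; ## run the smooth through the filter again for the "double" smooth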

Testing on a sinewave looks like this:-

where the cyan, green and blue filters are, respectively, the original FIR filter with its 2.5 bar lag, a 5 bar SMA and Ehlers' super smoother (see code here), applied to the sinewave "price" for comparative purposes. The red filter is my "double_smooth" and the magenta is a Jurik Moving Average, using an Octave adaptation of code that is/was freely available on the TradingView website. The parameters for this Jurik average (length, phase and power) were chosen to match those of the double_smooth as closely as possible, for an apples to apples comparison.

I will not discuss this any further in this post, other than to say I intend to combine it with the ideas contained in my new use for a Kalman filter post.

More in due course.

Friday, 27 September 2024

Discontinuation of Oanda's OrderBook and PositionBook Endpoints via the V20 Framework

Longtime readers of this blog are almost certainly aware that over the last few years I have posted several times about Oanda's OrderBook and PositionBook data and what can be done with it. My first post was back in February 2022 where I posited the idea of using this data as a sentiment indicator, whilst my most recent post, March 2024, talked about substituting the data into standard, volume based indicators. In between these two dates I blogged about using the data as features for machine learning (here and here), different methods of plotting it (here with example trade and here) and an improved, associated optimisation method here.

Researching and posting about this has been interesting, and I was quietly confident that there was some real value to be found in doing this. However, I have recently been unpleasantly surprised and disappointed to learn (by way of my API cronjob download routines suddenly failing) that Oanda has decided to withdraw the ability to download this data via their V20 API Framework. So, at a stroke, all of the above work has become redundant and effectively useless for back testing or future trading purposes.

Did I say I was disappointed? Well, that understates it somewhat! I have written to Oanda to express my displeasure at this recent change and perhaps, fingers crossed, they will reinstate this V20 functionality.

Tuesday, 18 June 2024

Downloading Dukascopy Tick Data with Node Library

As part of my investigations into forex news trading I have found it necessary to obtain forex tick level data for back testing purposes, and below I provide code to achieve this using Dukascopy's Node library, called from Octave via some system calls. A useful YouTube video about the Dukascopy Node library will give readers some background information.

function [ first_days , last_days ] = first_and_last_weekday_of_month( y )
  t1 = datenum( [ y , 1 , 1 , 0 , 0 , 0 ] ) ;
  t2 = datenum( [ y , 12 , 31 , 0 , 0 , 0 ] ) ;
  t  = datevec( t1 : t2 ) ;
  delete_ix = strcmp( 'Saturday' , datestr( t , 'dddd' ) ) ; % find all Saturdays
  t( delete_ix , : ) = [] ;
  delete_ix = strcmp( 'Sunday' , datestr( t , 'dddd' ) ) ; % find all Sundays
  t( delete_ix , : ) = [] ;
  first_day_ix = find( diff( [ 1 ; t( : , 2 ) ] ) > 0 ) ;
  first_days = [ t( 1 , : ) ; t( first_day_ix , : ) ] ;
  last_day_ix = first_day_ix - 1 ; last_day_ix( last_day_ix <= 0 ) = [] ;
  last_days = [ t( last_day_ix , : ) ; t( end , : ) ] ;
endfunction

ii_vec = [ 2020 , 2021 , 2022 , 2023 , 2024 ] ;

for ii = ii_vec

[ first_days , last_days ] = first_and_last_weekday_of_month( ii ) ;

  for jj = 1 : 12
  cd /path/to/folder ;
  command = [ 'npx dukascopy-node -i eurusd -from ' , datestr( first_days( jj , : ) , 29 ) , ' -to ' , datestr( last_days( jj , : ) , 29 ) , ...
               ' --timeframe tick --format csv --retries 5 --directory eurusd --date-format "YYYY-MM-DD HH:mm:ss:SSS" ' ] ;
  system( command ) ;

  cd /path/to/folder/eurusd ;

  ## delete header
  command = [ "sed -i '/timestamp,askPrice,bidPrice/d' eurusd-tick-" , datestr( first_days( jj , : ) , 29 ) , "-" , datestr( last_days( jj , : ) , 29 ) , ".csv" ] ;
  system( command ) ;

  FID = fopen( [ 'eurusd-tick-' , datestr( first_days( jj , : ) , 29 ) , '-' , datestr( last_days( jj , : ) , 29 ) , '.csv' ] , 'r' ) ;
  sizeM = [ 9 , Inf ] ;
  M = fscanf( FID , "%4d-%2d-%2d %2d:%2d:%2d:%3d,%f,%f" , sizeM )' ;
  fclose( FID ) ;

  save( '-binary' , [ 'eurusd-' , num2str( ii ) , '-' , num2str( first_days( jj , 2 ) ) , '.bin' ] , 'M' ) ;
  delete( 'eurusd-tick-*' ) ;

  clear -x ii_vec ii jj first_days last_days first_and_last_weekday_of_month

  endfor ## jj loop

clear -x ii_vec ii first_and_last_weekday_of_month

endfor ## ii_vec

Running the above code results in a folder full of tick data saved in Octave's native binary format, one file per month, with each file's name describing the data contained within.
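
As a quick check, one of these monthly binary files can be loaded back into Octave and unpacked as follows. This is a minimal sketch only, and the file name used is purely illustrative.

load( '-binary' , 'eurusd-2024-1.bin' ) ;   ## loads the matrix M saved by the above loop

## columns 1 to 6 of M are year/month/day/hour/minute/second, column 7 is milliseconds,
## and columns 8 and 9 are the ask and bid prices respectively
timestamps = datenum( [ M( : , 1 : 5 ) , M( : , 6 ) + M( : , 7 ) ./ 1000 ] ) ;
ask = M( : , 8 ) ;
bid = M( : , 9 ) ;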

I hope readers will find this useful.

Friday, 24 May 2024

Using Oanda's API to Place Entry Orders

Since my last post about end of initial testing I have been working on Oanda API functions in Octave to programmatically place entry orders and associated take profit and stop orders for a future possible forex news trading system. The reason for this is simple - it would be next to impossible to manually place a series of entry orders in the last few moments before a news release, so this would have to be done automatically. To this end, I have spent the last few weeks writing a few simple entry functions and testing them in my live trading account with the minimum trading size, i.e. buying and selling 1 Euro in the EURUSD forex pair and observing the subsequent lines at the entry/stop/take profit levels that appear on the live web platform.

The basic schema for this is shown in the following code box, where it can be seen that

body = jsonencode( struct( 'order' , struct( 'units' , num2str( 1 ) , ...
                                              'instrument' , 'EUR_USD' , ...
                                              'timeInForce' , 'FOK' , ...
                                              'type' , 'MARKET' , ...
                                              'trailingStopLossOnFill' , struct( 'distance' , num2str( trail_distance ) , ...
                                                                                  'timeInForce' , 'GTC' , ...
                                                                                  'triggerCondition' , 'MID' ) , ...
                                              'positionFill' , 'DEFAULT' ) ) )

account_header = ['curl -X POST -H "Content-Type: application/json" -H "Authorization: Bearer TOKEN"'] ;

submit_order = [ account_header , ' "https://api-fxtrade.oanda.com/v3/accounts/ACCOUNT/orders"' , ' -d ' , "'" , body , "'" ] ;

[ ~ , ret_JSON ] = system( submit_order , RETURN_OUTPUT = 'TRUE' ) ;

a JSON object containing the order details is created, HTML headers with account information are added, and then the order is dispatched via a system call to the cURL library.

A more complete Octave function example is shown next. This is a buy-on-stop entry function which also sets a stop loss and a take profit target level on being filled, and there is some basic input checking.

function [ ret_JSON ] = buy_stop_entry_with_stoploss_and_takeprofit( cross , no_of_units , entry_price_level , stop_level , take_profit_level )

## some basic checks
if ( entry_price_level <= stop_level )
   error( 'Stop Level is not below Entry Level.' ) ;
endif

if ( entry_price_level >= take_profit_level )
   error( 'Take Profit Level is not above Entry Level.' ) ;
endif

account_header = ['curl -X POST -H "Content-Type: application/json" -H "Authorization: Bearer TOKEN"'] ;

body = jsonencode( struct( 'order' , struct( 'type' , 'STOP' , ...
                                              'instrument' , toupper( cross ) , ...
                                              'units' , num2str( abs( no_of_units ) ) , ...
                                              'price' , num2str( entry_price_level ) , ...
                                              'stopLossOnFill' , struct( 'price' , num2str( stop_level ) , ...
                                                                         'timeInForce' , 'GTC' ) , ...
                                              'takeProfitOnFill' , struct( 'price' , num2str( take_profit_level ) ) , ...
                                              'timeInForce' , 'GTC' , ...
                                              'triggerCondition' , 'MID' , ...
                                              'positionFill' , 'DEFAULT' ) ) ) ;

submit_order = [ account_header , ' -d ' , "'" , body , "'" , ' "https://api-fxtrade.oanda.com/v3/accounts/ACCOUNT/orders"' ] ;

[ ~ , ret_JSON ] = system( submit_order , RETURN_OUTPUT = 'TRUE' ) ;

ret_JSON = jsondecode( ret_JSON ) ;

endfunction
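
As a usage example, a call along the following lines (the price levels are purely illustrative) would place a stop entry to buy 100 units of EURUSD at 1.0900 with a stop loss at 1.0850 and a take profit at 1.0950, returning Oanda's decoded JSON reply:

ret = buy_stop_entry_with_stoploss_and_takeprofit( 'eur_usd' , 100 , 1.0900 , 1.0850 , 1.0950 ) ; ## ret is a struct decoded from Oanda's JSON response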

I won't spend much time explaining the contents of the JSON body, as readers can find more information about this in Oanda's online documentation. However, there is one important thing I would note here, and that is the key/value pair

 'triggerCondition' , 'MID'

The 'default' value for this is the bid/ask price for sells/buys which, in the case of a news trading system, could be problematic because the spread may very well widen prior to a news release and trigger an entry without the underlying price actually having moved to the entry level, or even before the news is released. By setting the trigger condition to 'MID' a trade will only be entered when the mid-price hits the entry level. The trade-off in this choice is summarised thus:

  • if the 'default' value is used, entries on "good" trades will be much closer to the entry level, on average, but at the possible expense of far more false entries and therefore losing trades, versus:
  • if the 'MID' value is used, there will possibly be fewer false entries, but at the expense of a worse entry price for "good" trades.

This is a trade-off that will have to be investigated/tested in due course.

Saturday, 11 May 2024

End of Initial Tests of Trading Forex News Announcements

Following on from my previous post, I have completed the same tests as outlined in that post on other currencies, and the summary results are:

  • USD - an average of 0.38% return per trade
  • EUR - an average of 0.22% return per trade
  • GBP - an average of 0.77% return per trade
  • CHF - an average of 2.05% return per trade
  • JPY - an average of 0.18% return per trade
  • AUD - an average of 1.01% return per trade

Since these seem to be profitable across the board, I deem it worthwhile to continue investigating some sort of news trading system. However, that said, there is a huge caveat regarding fills, recently pointed out by Terberh Strategy in a comment on the previous post: namely, the withdrawal of liquidity immediately prior to these news releases. I am not yet sure how I can account for this in future testing, and it may be that this lack of liquidity could in fact make it almost impossible to accurately test any such system without making some consequential assumptions.
 

Sunday, 28 April 2024

Initial Test of Trading Forex News Announcements

My first test of trading forex news announcements looks at the efficacy of breakouts immediately following a news announcement related to the US dollar; specifically, only the high impact news shown in red on the forexfactory calendar. The intention would be to capture some of the profit available from the big movements resulting from surprise news, or simply from market manipulation around these news events.

Rather than conduct a standard back test of a specific set of entry/exit rules to produce a single equity curve and test metrics, I decided to conduct a Monte Carlo simulation of R multiple returns, given that a news breakout occurred, across the following forex major pairs with USD as one half of the pair, i.e. EUR-USD, GBP-USD, USD-CHF, USD-JPY, AUD-USD, NZD-USD and USD-CAD. I assumed that all the above pairs would collectively constitute one trade in the USD with a 1% risk in total, so each pair was allocated 1/7th of the 1% risk as its R multiple for each and every USD news announcement.

Whether a breakout occurred, the R multiple risk, and whether or not the trade would ultimately have been profitable were simulated independently for each of the above pairs as follows (a minimal Octave sketch of the simulation appears after the list):

  1. On the 10 minute OHLC bar close immediately prior to the time of the news announcement, a simulated buy order and a sell order were placed a distance of 1x the close-to-close variance above and below the bar close, this variance being determined by the output of my kalman_ema function.
  2. The 1/7th of 1% risk R multiple was taken as the distance between these two entry orders, with each entry order having an attached protective stop-loss at the other entry order's level.
  3. On the next 10 minute OHLC bar, assumed to be the bar that should show the reaction to the news, if neither of the entry orders is hit it is assumed that no trade would have taken place. This would be functionally equivalent to cancelling the entry orders 10 minutes after placing them if no news reaction in price occurs.
  4. If the next 10 minute bar hits both entry levels, it is assumed that a whipsaw trade would have occurred and this is booked as a -1R losing trade. In all the simulations that follow, this trade will always be a -1R loss.
  5. If neither of the conditions in 3 or 4 above are met, it follows that one of the entry levels would have been hit and not stopped out on the entry bar. In this case, the maximum favourable excursion (MFE) of the high (for long entries) or low (for short entries) over the next 24 10min OHLC bars (4 hours) is recorded in terms of its R multiple value.
  6. Simultaneously with 5 above, it is recorded whether or not at any time in this forward looking 24 bar period this trade's entry protective stop level would have been breached. 
  7. The simulation now starts: all -1R trades from step 4 are kept as -1R losing trades.
  8. All trades that are flagged as having hit the stop level from step 6 have a random 50% chance of being booked as a -1R loss. This simulates being stopped out before the MFE is reached, or alternatively, completely messing up and missing a take profit opportunity and then riding the trade to a loss.
  9. All trades flagged from step 6 that are not booked as a -1R loss in step 8 have their MFE randomly multiplied by a value on the interval 0 to 1 to simulate being stopped out with a trailing stop. This also applies to trades from step 5 that do not hit stops identified in step 6. During the simulation this averages out to all profitable trades only achieving, on average, 50% of the maximum possible profit.
  10. All the trade results are cumulated into a total R multiple profit/loss across all the forex pairs per news announcement, and then a percentage return equity curve is calculated and plotted, an example of which is shown next with a log-scaled y-axis and a thousand Monte Carlo replications.
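
To make steps 7 to 9 above concrete, the core of one Monte Carlo replication looks something like the following minimal Octave sketch. The inputs traded, whipsaw, hit_stop and mfe_R are hypothetical (announcements x 7 pairs) matrices built from the historical 10 minute bars, and the variable names are illustrative only.

## hypothetical inputs, one row per news announcement and one column per pair:
##   traded   - logical, an entry order was actually hit on the news bar (steps 3 and 5)
##   whipsaw  - logical, both entry levels were hit on the news bar (step 4)
##   hit_stop - logical, the protective stop was breached within the next 24 bars (step 6)
##   mfe_R    - the maximum favourable excursion in R multiples (step 5)

R = zeros( size( mfe_R ) ) ;

## step 7: whipsaw trades are always booked as -1R losses
R( whipsaw ) = -1 ;

## step 8: trades that touched their stop have a random 50% chance of being a -1R loss
ix = find( traded & hit_stop & ~whipsaw ) ;
unlucky = rand( numel( ix ) , 1 ) < 0.5 ;
R( ix( unlucky ) ) = -1 ;

## step 9: all remaining trades bank a random fraction of their MFE
win = traded & ~whipsaw & ( R > -1 ) ;
R( win ) = mfe_R( win ) .* rand( sum( win( : ) ) , 1 ) ;

## step 10: sum the seven 1/7th of 1% risk pair results per announcement and
## compound them into a percentage return equity curve for this replication
equity_curve = cumprod( 1 + ( 0.01 / 7 ) .* sum( R , 2 ) ) ;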

The following chart is the accompanying drawdown chart to the above equity curves chart, expressed on the y-axis as a percentage drawdown from the ongoing equity curve high water mark.
Taking the nth root (for n news announcements) of the average equity curve ending value, the average expected, cumulated R multiple return per news announcement is approximately 0.38R profit per 1R risked.
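
In Octave terms this back-of-the-envelope calculation is just a geometric mean per trade, along these lines (the variable names are illustrative only):

## nth root of the average equity curve ending value, relative to the starting equity,
## over n news announcements gives the average compounded return per announcement;
## at 1% (i.e. 1R) risk per announcement, a 0.38% average return is roughly 0.38R
avg_return_per_announcement = nthroot( mean( ending_equity_values ) ./ starting_equity , n_announcements ) - 1 ;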
 
The above was not intended to be a test of a specific rule set per se, but rather a test of whether or not attempting to trade forex news announcements could be profitable at all. What it shows is that essentially random exits could be profitable for a news breakout system, and so the expectation must be that intelligent exits, whether take profit or stop loss, coupled with breakouts would make a viable trading system.
 
More in due course. 


Friday, 19 April 2024

Trading Forex News

This post, as the title suggests, is about trading forex news releases and, incidentally, includes a small update to the appearance of my PositionBook chart and OrderLevels chart.

I recently came across this forexfactory post which shows how to download the underlying data for the forexfactory calendar, a screenshot of which is shown immediately below, and I thought I would look into the idea of trading around forex news releases. If you look carefully at the screenshot you will see that at 2.30pm (CET) there was a high impact (red folder icon) news release regarding the Canadian dollar (CAD) which came in under expectations.

The following OrderLevels chart and PositionBook chart clearly show the big move that immediately followed this news release (CAD weakness). The OrderLevels chart also shows the accumulation of sell orders (red background colour) that would have been an almost perfect take profit level for the day, whilst the PositionBook chart shows the accumulation of long positions (blue background colour) during the sideways movement that preceded the news release. These two charts both show the above mentioned appearance update resulting from the use of the b2r colormap function.
 
The following "overview" chart shows that the big move in the USDCAD forex pair was definitely the result of this CAD weakness rather than USD strength (see the two rightmost currency strength charts).

Having finally "scratched the itch" of getting my PositionBook charts sorted out, my next project is to investigate the possibility of creating a forex news release trading methodology.
 
More in due course.