
Wednesday, 3 September 2025

Expressing an Indicator in Neural Net Form

Recently I started investigating relative rotation graphs with a view to perhaps implementing a version of them for use on forex currency pairs. The underlying idea of a relative rotation graph is to plot an asset's relative strength against a benchmark, together with the momentum of this relative strength, in a form similar to a polar coordinate plot that rotates around a zero point representing zero relative strength and zero relative strength momentum. After some thought it occurred to me that rather than using relative strength against a benchmark I could use the underlying relative strengths of the individual currencies, as calculated by my currency strength indicator, against each other. Furthermore, these underlying log changes in the individual currencies can be normalised using the ideas of Brownian motion and then averaged together over different lookback periods to create a unique indicator.

This indicator was coded up in the "traditional" way that will be familiar to anybody who has ever tried coding a trading indicator in any language. However, I thought that if the calculation of this indicator could be expressed in the form of a feedforward neural network, then the optimisation opportunities available through backpropagation and regularisation could be used to tweak the indicator in more useful ways than just varying a lookback length and an amount of averaging. After some work I was able to express the indicator in just these two lines of Octave code:

indicator_middle_layer = tanh( full_feature_matrix * input_weights ) ;  ## hidden layer
indicator_nn_output = tanh( indicator_middle_layer * output_weights ) ; ## output layer

Of course, prior to calling these two lines of code, there is some feature engineering to create the input full_feature_matrix, and the input_weights and output_weights matrices, taken together, are mathematically equivalent to the original indicator calculations. Finally, because this is a neural net expression of the indicator, the non-linear tanh activation function is applied to both the hidden middle layer and the output layer of the net.
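
As a rough illustration of this equivalence, the minimal sketch below (with hypothetical lookbacks and random stand-in data, not my actual feature engineering) shows how simple averaging weights over different lookback periods can be packed into input_weights and output_weights so that, for small inputs where tanh is approximately linear, the net reproduces an averaging-type indicator:

## hypothetical sketch: pack moving average weights into the two weight matrices
max_lag = 20 ;
norm_log_changes = randn( 1000 , 1 ) ; ## stand-in for normalised currency log changes
full_feature_matrix = zeros( numel( norm_log_changes ) , max_lag ) ;
for k = 1 : max_lag ## column k holds the series lagged k - 1 bars
  full_feature_matrix( k : end , k ) = norm_log_changes( 1 : end - k + 1 ) ;
endfor
lookbacks = [ 5 , 10 , 20 ] ; ## hypothetical averaging lengths
input_weights = zeros( max_lag , numel( lookbacks ) ) ;
for j = 1 : numel( lookbacks ) ## each hidden node averages over one lookback
  input_weights( 1 : lookbacks( j ) , j ) = 1 / lookbacks( j ) ;
endfor
output_weights = ones( numel( lookbacks ) , 1 ) ./ numel( lookbacks ) ; ## average the averages
indicator_middle_layer = tanh( full_feature_matrix * input_weights ) ;
indicator_nn_output = tanh( indicator_middle_layer * output_weights ) ;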

The following plot shows the original indicator in black and the neural net version of it in blue, over the data shown in this plot of 10 minute bars of the EURUSD forex pair. The red indicator in the plot above is the 1 bar momentum of the neural net indicator.

To judge the quality of this indicator I used the entropy measure (measured over 14,000+ 10 minute bars of EURUSD), the results of which are shown next.

entropy_indicator_original = 0.9485
entropy_indicator_nn_output = 0.9933
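
For readers unfamiliar with the measure, one way such an entropy might be implemented - an illustrative assumption on my part, not necessarily my exact routine - is the normalised Shannon entropy of a histogram of indicator values, where a perfectly uniform spread of values across the indicator's range scores 1.0:

## illustrative sketch only: normalised Shannon entropy of indicator values
n_bins = 20 ;
edges = linspace( -1 , 1 , n_bins + 1 ) ; ## tanh output lies in [ -1 , 1 ]
counts = histc( indicator_nn_output , edges ) ;
p = counts( counts > 0 ) ./ sum( counts ) ; ## bin probabilities, empty bins dropped
entropy_value = -sum( p .* log( p ) ) / log( n_bins ) ## 1.0 == perfectly uniform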

An entropy reading of 0.9933 is probably as good as any trading indicator could hope to achieve (a perfect reading is 1.0), so the next thing was to quickly back test the indicator's performance. Based on the logic of the indicator, the obvious long (short) signals are being above (below) the zero line, or equivalently the sign of the indicator, and for good measure I also tested the sign of the momentum and some simple moving averages thereof.

The following plot shows the equity curves of this quick test, where it is visually clear that the blue equity curves are "the best" when plotted in relation to the black "buy and hold" equivalent equity curve. These represent the equity curves of a 3 bar simple moving average of the 1 bar momentum of both the original formulation of the indicator and the neural net implementation. I would point out that these equity curves represent the theoretical equity resulting from trading the London session after 9:00am BST, stopping at 7:00am EST (usually about noon BST), and then trading again from 9:00am EST until 17:00 EST. This schedule avoids the opening sessions (7:00 to 9:00am) in both London and New York because, from my observations of many OHLC charts such as the one shown above, there are frequently wild swings where price is being pushed to significant points such as volume profile clusters, accumulations of buy/sell orders etc., and in my opinion no indicator can be expected to perform well in that sort of environment. Also avoided are the hours prior to 7:00am BST, i.e. the Asian or overnight session.

Although these equity curves might not, at first glance, be that impressive, especially as they do not account for trading costs etc., my intent in doing these tests was to determine the configuration of a final "decision" output layer to be added to the neural net implementation of the indicator. A 3 bar simple moving average of the 1 bar momentum implies the necessity to include 4 consecutive readings of the indicator output as input to a final "decision" layer. The following simple, hand-drawn sketch shows what I mean:
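
In code terms, the same idea can be expressed as the following minimal Octave sketch, where the fixed decision_weights shown are a hypothetical starting point that exactly replicates a 3 bar SMA of 1 bar momentum (the momentum terms telescope to ( x_t - x_t-3 ) / 3) and could then be trained by backpropagation:

## assemble 4 consecutive indicator readings as inputs to a final decision layer
n = numel( indicator_nn_output ) ;
decision_inputs = [ indicator_nn_output( 4 : n ) , indicator_nn_output( 3 : n - 1 ) , ...
                    indicator_nn_output( 2 : n - 2 ) , indicator_nn_output( 1 : n - 3 ) ] ;
## fixed weights replicating a 3 bar SMA of 1 bar momentum
decision_weights = [ 1 ; 0 ; 0 ; -1 ] ./ 3 ;
decision_output = tanh( decision_inputs * decision_weights ) ;
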
A discussion of this will be the subject of my next post.

Wednesday, 11 June 2025

A Replacement for my PositionBook Charts using Tick Volumes?

At the end of my previous post I said that I would be looking into using tick volume to create a new indicator, and this post is about the work I have done on this idea.

At first I tried creating a more traditional type of indicator using tick volumes separated out into buy and sell volumes, but I quickly felt that this was not a useful investment of my time so I gave up on this idea. Instead, I have come up with a way to plot tick volume levels that is similar to my previously discussed PositionBook chart type, which I was forced to give up on because Oanda have discontinued the API endpoint for downloading the required data. An example of the new, tick volume equivalent is shown below, followed by a brief description of the methodology used to create it.

Starting with the basics, if we imagine a Doji bar, for every up (down) tick there must be a corresponding and opposing down (up) tick for the bar to open and close at the exact same price, and therefore we can split the tick volume for the bar equally between buy and sell tick volumes. Similarly, if we imagine a bar that opens on the low (high) and closes on the high (low), the number of ticks within the entire bar tick range can be ascribed to buy (sell) volume and the remaining balance divided equally between buy and sell.

e.g. a bar opens at the low, closes at the high, with a tick range of 10 and a total tick volume of 50; then:

  • buy tick volume = 10 + ( 50 - 10)/2 = 30
  • sell tick volume = 50 - buy tick volume = 20

This idea can be generalised to the range of a candlestick body being appropriately allocated to buy or sell tick volume, with the remaining balance of the total bar volume being equally allocated to buy and sell. OK, so far so good, and nothing particularly groundbreaking: using the "geometry" of a bar to allocate buy and sell volumes in this way can be found online in the formulation of more than a few indicators.
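
A minimal Octave sketch of this allocation (the function name and tick_size argument are my own illustrative choices) is:

function [ buy_vol , sell_vol ] = split_tick_volume( open_p , close_p , total_vol , tick_size )
  ## candlestick body expressed in ticks
  body_ticks = abs( close_p - open_p ) / tick_size ;
  ## the balance of the bar volume is split equally between buys and sells
  balance = ( total_vol - body_ticks ) / 2 ;
  if ( close_p >= open_p ) ## up bar: the body is ascribed to buy volume
    buy_vol = body_ticks + balance ;
  else ## down bar: the body is ascribed to sell volume
    buy_vol = balance ;
  endif
  sell_vol = total_vol - buy_vol ;
endfunction

## the worked example above: open at low, close at high, 10 tick body, 50 total volume
## [ b , s ] = split_tick_volume( 1.1000 , 1.1010 , 50 , 0.0001 ) returns b = 30 , s = 20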

The next step is to "smear" these buy and sell volumes equally across the whole range of the bar and then take the difference:

e.g. "smeared" buy - "smeared" sell = 30 / 10 - 20 / 10 = 1, thus allocate a tick difference value of +1 for each tick level within the 10 tick range of the bar. 

Of course, over a large bar (e.g. a 10 minute bar) this wouldn't necessarily be very informative, as it is known (see my volume profile bars) that volume is usually unevenly spread across the range of any given bar. The solution is to apply the above methodology to the smallest bar possible, and with Oanda the smallest possible bar download is a 5 second bar. Thus what I have done is apply the above to each 5 second bar within a given 10 minute bar period and then accumulate the buy/sell/tick difference values across the individual tick levels within the 10 minute bar. This gives tick difference values that approximate the differences between each bar's separate buy and sell volume profiles.
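
Continuing the sketch above (variable names again hypothetical, and reusing the split_tick_volume function from earlier), the smearing and accumulation across the 5 second bars within one 10 minute bar might look like:

## five_sec_ohlcv columns assumed to be open / high / low / close / tick volume
levels = bar_low : tick_size : bar_high ; ## tick levels spanning the 10 minute bar
tick_diff = zeros( size( levels ) ) ;
for b = 1 : rows( five_sec_ohlcv )
  [ bv , sv ] = split_tick_volume( five_sec_ohlcv( b , 1 ) , five_sec_ohlcv( b , 4 ) , ...
                                   five_sec_ohlcv( b , 5 ) , tick_size ) ;
  in_bar = ( levels >= five_sec_ohlcv( b , 3 ) ) & ( levels <= five_sec_ohlcv( b , 2 ) ) ;
  ## smear the buy/sell difference evenly over this 5 second bar's tick range
  tick_diff( in_bar ) += ( bv - sv ) / max( sum( in_bar ) , 1 ) ;
endfor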

The final step is to volume normalise the above calculations by using the total 10 minute bar tick volume such that tick differences within bars that have higher total tick volume have a greater weight than those in low tick volume bars. This is simply done by setting the total bar volume as the numerator and the tick difference as the denominator:

e.g. a tick difference of 2 at a tick level within a bar with a total 10 tick volume will get the weight

  •  10 / 2 = 5

whilst the same tick difference in a 50 tick volume bar will get the weight

  • 50 / 2 = 25
Finally, all the above is plotted as the background heatmap to a candlestick chart, but with a slight twist: exponential forgetting is applied along each individual tick level within the y-axis range, such that if an individual tick level has only one price bar spanning it, the colour slowly fades as we move along the x-axis, whilst if the level is revisited then, just as with an exponential moving average, the more recent data is cumulatively weighted more. For the above plot the exponential alpha value is set at the equivalent of a 144 bar exponential moving average, i.e. the number of 10 minute bars in a 24 hour period. Shorter moving average equivalents just increase the speed at which the forgetting takes place, leading to shorter lines extending to the right, but accentuate the differences between levels; e.g. the following is the same plot as above with the equivalent of a 14 bar exponential moving average alpha value.
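
A minimal sketch of this per-level exponential forgetting (matrix and variable names hypothetical) might be:

## heat is a levels-by-bars matrix used as the heatmap background
alpha = 2 / ( 144 + 1 ) ; ## equivalent of a 144 bar exponential moving average
heat = zeros( n_levels , n_bars ) ;
for t = 2 : n_bars
  heat( : , t ) = ( 1 - alpha ) .* heat( : , t - 1 ) ; ## every level fades each bar
  ix = find( weighted_diff( : , t ) != 0 ) ; ## levels with new weighted tick differences
  heat( ix , t ) += alpha .* weighted_diff( ix , t ) ; ## EMA-style accumulation
endfor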

Earlier in this post I alluded to the possibility of this type of tick difference chart being a replacement for my unwillingly and forcefully retired PositionBook chart type. I now discuss the similarities/equivalences between the two chart types:

With the old PositionBook (PB) chart, traders' net positions at any given level and at 20 minute snapshot frequency were explicitly given by API data download and changes between snapshots were inferred by an optimisation routine. With these new TickDifference (TD) charts, traders' net positions are inferred via the methodology described above, i.e. higher normalised tick volumes at different tick levels imply a higher, net trader positioning at these levels, and changes over time in this positioning are approximated by the exponential forgetting factor.

In terms of plotting, both in the PB and TD charts, the intensities of the colours (blue for longs and red for shorts) reflect the relative importances of long/short positioning at different levels: the greater the intensity, the greater the difference between long and short positioning.

I shall now enter into a period of observational study of the usefulness of this chart type because, as the chart is inherently visual, I can't imagine how I could effectively test it in a more traditional, back testing manner. If any reader could suggest how this more traditional approach might be done, I'm all ears.    

 

Saturday, 5 April 2025

Use of HDF5 Format, and Some Charting Improvements

Over the last few weeks/months I have found it necessary to revisit the basic infrastructure of my trading/computing set up due to increasing slowness of the various computing routines I have running.

The first issue I am now addressing is how I store my data on disc. When I first started I opted for csv text files, mostly due to my ignorance of other possibilities at the time and the fact that I could visually inspect the files to manually check things. However, for my data retrieval and use needs this has now become too slow and burdensome, so I have decided to switch over to the HDF5 format and use the hdf5oct package for Octave for my data storage and file I/O needs. This dramatically speeds up data loading and will enable me to consolidate my disparate csv text files into individual HDF5 files, one per tradable instrument, with all the data for that instrument contained in a single file. This data migration is an ongoing process that will continue for a few months, with the associated changes in my workflow, rewriting of some scripts and functions, cronjobs, etc.
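
As an illustration of the intended workflow, here is a minimal sketch assuming the MATLAB-compatible h5create/h5write/h5read interface that the hdf5oct package aims to provide (the file name and dataset layout are hypothetical):

pkg load hdf5oct
data = rand( 1000 , 5 ) ; ## stand-in for 10 minute OHLCV data
h5create( 'eur_usd.h5' , '/ohlc_10m' , size( data ) ) ; ## one file per instrument
h5write( 'eur_usd.h5' , '/ohlc_10m' , data ) ;
recovered = h5read( 'eur_usd.h5' , '/ohlc_10m' ) ; ## much faster than parsing csv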

The next thing I have done is slightly improved the calculation methodology and plotting of my volume profile bars, and the following chart shows the new version volume profile bars for the last three, 10 minute bars on the EUR-USD forex pair for trading on Friday, 4th April 2025,

whilst this second chart shows the equivalent time period with 5 second OHLC candles and the associated tick volumes for each bar.
 
The vertical green lines on this second chart delineate the 5 second bars into the corresponding 10 minute bars in the first chart. I think it is quite easy to visually see the correspondence between the 10 minute volume profile bars and the 5 second OHLC bars. 

Another thing I also plan to do in the forthcoming weeks is to use the tick volume (as shown in the second chart above) to create a new type of indicator, but more on that in due course.

Monday, 30 December 2024

A "New" Way to Smooth Price

Below is code for an Octave compiled C++ .oct function to smooth price data.
#include "octave oct.h"
#include "octave dcolvector.h"
#include "octave dmatrix.h"
#include "GenFact.h"
#include "GramPoly.h"
#include "Weight.h"

DEFUN_DLD ( double_smooth_proj_2_5, args, nargout,
"-*- texinfo -*-\n\
@deftypefn {Function File} {} double_smooth_proj_2_5 (@var{input_vector})\n\
This function takes an input series and smooths it by projecting a 5 bar rolling linear fit\n\
3 bars into the future and using these 3 bars plus the last 3 bars of the rolling input\n\
to fit a FIR filter with a 2.5 bar lag from the last projected point, i.e. a 0.5 bar\n\
lead over the last actual rolling point in the series. This is averaged with the previously\n\
calculated such smoothed point for a theoretical zero-lagged smooth. This smooth is\n\
again smoothed as above for a smooth of the smooth, i.e. a double-smooth of the\n\
original input series. The double_smooth and smooth are the function outputs.\n\
@end deftypefn" )

{
octave_value_list retval_list ;
int nargin = args.length () ;

// check the input arguments
if ( nargin > 1 ) // there must be a single, input price vector
   {
   error ("Invalid arguments. Inputs are a single, input price vector.") ;
   return retval_list ;
   }

if ( args(0).length () < 5 )
   {
   error ("Invalid 1st argument length. Input is a price vector of length >= 5.") ;
   return retval_list ;
   }
// end of input checking

ColumnVector input = args(0).column_vector_value () ;
ColumnVector smooth = args(0).column_vector_value () ;
ColumnVector double_smooth = args(0).column_vector_value () ;

// create the fit coefficients matrix
int p = 5 ;             // the number of points in calculations
int m = ( p - 1 ) / 2 ; // value to be passed to call_GenPoly_routine_from_octfile
int n = 1 ;             // value to be passed to call_GenPoly_routine_from_octfile
int s = 0 ;             // value to be passed to call_GenPoly_routine_from_octfile

  // create matrix for fit coefficients
  Matrix fit_coefficients_matrix ( 2 * m + 1 , 2 * m + 1 ) ;
  // and assign values in loop using the Weight.h recursive Gram Polynomial C++ headers
  for ( int tt = -m ; tt < (m+1) ; tt ++ )
  {
        for ( int ii = -m ; ii < (m+1) ; ii ++ )
        {
        fit_coefficients_matrix ( ii + m , tt + m ) = Weight( ii , tt , m , n , s ) ;
        }
  }

  // create matrix for slope coefficients
  Matrix slope_coefficients_matrix ( 2 * m + 1 , 2 * m + 1 ) ;
  s = 1 ;
  // and assign values in loop using the Weight.h recursive Gram Polynomial C++ headers
  for ( int tt = -m ; tt < (m+1) ; tt ++ )
  {
        for ( int ii = -m ; ii < (m+1) ; ii ++ )
        {
        slope_coefficients_matrix ( ii + m , tt + m ) = Weight( ii , tt , m , n , s ) ;
        }
  }

  Matrix smooth_coefficients ( 1 , 5 ) ;
  // fill the smooth_coefficients matrix, smooth_coeffs = ( 9/24 ) .* fit_coeffs + ( 7/12 ) .* slope_coeffs + [ 0 ; 1/24 ; 1/8 ; 5/24 ; 1/4 ] ;
  smooth_coefficients( 0 , 0 ) = ( 9.0 / 24.0 ) * fit_coefficients_matrix( 0 , 4 ) + ( 7.0 / 12.0 ) * slope_coefficients_matrix( 0 , 4 ) ;
  smooth_coefficients( 0 , 1 ) = ( 9.0 / 24.0 ) * fit_coefficients_matrix( 1 , 4 ) + ( 7.0 / 12.0 ) * slope_coefficients_matrix( 1 , 4 ) + ( 1.0 / 24.0 ) ;
  smooth_coefficients( 0 , 2 ) = ( 9.0 / 24.0 ) * fit_coefficients_matrix( 2 , 4 ) + ( 7.0 / 12.0 ) * slope_coefficients_matrix( 2 , 4 ) + ( 1.0 / 8.0 ) ;
  smooth_coefficients( 0 , 3 ) = ( 9.0 / 24.0 ) * fit_coefficients_matrix( 3 , 4 ) + ( 7.0 / 12.0 ) * slope_coefficients_matrix( 3 , 4 ) + ( 5.0 / 24.0 ) ;
  smooth_coefficients( 0 , 4 ) = ( 9.0 / 24.0 ) * fit_coefficients_matrix( 4 , 4 ) + ( 7.0 / 12.0 ) * slope_coefficients_matrix( 4 , 4 ) + ( 1.0 / 4.0 ) ;

    for ( octave_idx_type ii (4) ; ii < args(0).length () ; ii++ )
        {

         smooth( ii ) = smooth_coefficients( 0 , 0 ) * input( ii - 4 ) + smooth_coefficients( 0 , 1 ) * input( ii - 3 ) + smooth_coefficients( 0 , 2 ) * input( ii - 2 ) +
                        smooth_coefficients( 0 , 3 ) * input( ii - 1 ) + smooth_coefficients( 0 , 4 ) * input( ii ) ;

         double_smooth( ii ) = smooth_coefficients( 0 , 0 ) * smooth( ii - 4 ) + smooth_coefficients( 0 , 1 ) * smooth( ii - 3 ) + smooth_coefficients( 0 , 2 ) * smooth( ii - 2 ) +
                               smooth_coefficients( 0 , 3 ) * smooth( ii - 1 ) + smooth_coefficients( 0 , 4 ) * smooth( ii ) ;

        }

retval_list( 0 ) = double_smooth ;
retval_list( 1 ) = smooth ;

return retval_list ;

} // end of function

Rather than describe it, I'll just paste the "help" description below:-

"This function takes an input series and smooths it by projecting a 5 bar rolling linear fit 3 bars into the future and using these 3 bars plus the last 3 bars of the rolling input to fit a FIR filter with a 2.5 bar lag from the last projected point, i.e.  a 0.5 bar lead over the last actual rolling point in the series.  This is averaged with the previously calculated such smoothed point for a theoretical zero-lagged smooth.  This smooth is again smoothed as above for a smooth of the smooth, i.e.  a double-smooth of the original input series.  The double_smooth and smooth are the function outputs."

The above function calls previous code of mine (the GenFact.h, GramPoly.h and Weight.h headers) to calculate Savitzky-Golay filter convolution coefficients for its internal calculations, but for the benefit of readers I will simply post the final coefficients for a 5 tap FIR filter.

-0.191667  -0.016667   0.200000   0.416667   0.591667

These coefficients are for a "single" smooth. Run the output of a filter using these coefficients through the same filter again to get the "double" smooth.
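
For readers who just want to apply the filter, a minimal sketch using Octave's built-in filter() function follows. Note that filter() applies its first coefficient to the current sample, so the coefficient row above must be reversed, and, unlike the .oct function above (which leaves the first 4 values as the raw input), the first 4 output values here are warm-up values from an assumed zero initial state:

b = fliplr( [ -0.191667 , -0.016667 , 0.200000 , 0.416667 , 0.591667 ] ) ;
smooth = filter( b , 1 , price ) ;         ## the "single" smooth
double_smooth = filter( b , 1 , smooth ) ; ## the "double" smooth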

Testing on a sinewave looks like this:-

where the cyan, green and blue filters are the original FIR filter with 2.5 bar lag, a 5 bar SMA and Ehlers' super smooth (see code here) applied to sinewave "price" for comparative purposes. The red filter is my "double_smooth" and the magenta is a Jurik Moving Average using an Octave adaptation of code that is/was freely available on the Tradingview website. The parameters for this Jurik average (length, phase and power) were chosen to match, as closely as possible, those of the double_smooth for an apples to apples comparison.

I will not discuss this any further in this post other than to say I intend to combine this with the ideas contained in my new use for Kalman filter post.

More in due course.

Friday, 27 September 2024

Discontinuation of Oanda's OrderBook and PositionBook Endpoints via the V20 Framework

Longtime readers of this blog are almost certainly aware that over the last few years I have posted several times about Oanda's OrderBook and PositionBook data and what can be done with it. My first post was back in February 2022 where I posited the idea of using this data as a sentiment indicator, whilst my most recent post, March 2024, talked about substituting the data into standard, volume based indicators. In between these two dates I blogged about using the data as features for machine learning (here and here), different methods of plotting it (here with example trade and here) and an improved, associated optimisation method here.

Researching and posting about this has been interesting and I was quietly confident that there was some real value to be found doing this. However, I have recently been unpleasantly surprised and disappointed to learn (by way of my API cronjob downloading routines suddenly failing) that Oanda has decided to no longer make available the ability to download this data via their V20 API Framework. So, at a stroke, all of the above work has suddenly become redundant and effectively useless for back testing purposes or for future trading purposes. 

Did I say I was disappointed? Well, that understates it somewhat! I have written to Oanda to express my displeasure at this recent change and perhaps, fingers crossed, they will reinstate this V20 functionality.

Tuesday, 18 June 2024

Downloading Dukascopy Tick Data with Node Library

As part of my investigations into forex news trading I have found it necessary to obtain forex tick level data for back testing purposes, and below I provide code to achieve this using Dukascopy's Node library, called from Octave via some system calls. A useful YouTube video about the Dukascopy Node library will give readers some background information.

function [ first_days , last_days ] = first_and_last_weekday_of_month( y )
  t1 = datenum( [ y , 1 , 1 , 0 , 0 , 0 ] ) ;
  t2 = datenum( [ y , 12 , 31 , 0 , 0 , 0 ] ) ;
  t  = datevec( t1 : t2 ) ;
  delete_ix = strcmp( 'Saturday' , cellstr( datestr( t , 'dddd' ) ) ) ; % find all Saturdays; cellstr() gives a per-date logical vector
  t( delete_ix , : ) = [] ;
  delete_ix = strcmp( 'Sunday' , cellstr( datestr( t , 'dddd' ) ) ) ; % find all Sundays
  t( delete_ix , : ) = [] ;
  first_day_ix = find( diff( [ 1 ; t( : , 2 ) ] ) > 0 ) ; % month number changes mark the first weekday of each month
  first_days = [ t( 1 , : ) ; t( first_day_ix , : ) ] ;
  last_day_ix = first_day_ix - 1 ; last_day_ix( last_day_ix <= 0 ) = [] ; % the weekday before each month change
  last_days = [ t( last_day_ix , : ) ; t( end , : ) ] ;
endfunction

ii_vec = [ 2020 , 2021 , 2022 , 2023 , 2024 ] ;

for ii = ii_vec

[ first_days , last_days ] = first_and_last_weekday_of_month( ii ) ;

  for jj = 1 : 12
  cd /path/to/folder ;
  command = [ 'npx dukascopy-node -i eurusd -from ' , datestr( first_days( jj , : ) , 29 ) , ' -to ' , datestr( last_days( jj , : ) , 29 ) , ...
               ' --timeframe tick --format csv --retries 5 --directory eurusd --date-format "YYYY-MM-DD HH:mm:ss:SSS" ' ] ;
  system( command ) ;

  cd /path/to/folder/eurusd ;

  ## delete header
  command = [ "sed -i '/timestamp,askPrice,bidPrice/d' eurusd-tick-" , datestr( first_days( jj , : ) , 29 ) , "-" , datestr( last_days( jj , : ) , 29 ) , ".csv" ] ;
  system( command ) ;

  FID = fopen( [ 'eurusd-tick-' , datestr( first_days( jj , : ) , 29 ) , '-' , datestr( last_days( jj , : ) , 29 ) , '.csv' ] , 'r' ) ;
  sizeM = [ 9 , Inf ] ;
  M = fscanf( FID , "%4d-%2d-%2d %2d:%2d:%2d:%3d,%f,%f" , sizeM )' ;
  fclose( FID ) ;

  save( '-binary' , [ 'eurusd-' , num2str( ii ) , '-' , num2str( first_days( jj , 2 ) ) , '.bin' ] , 'M' ) ;
  delete( 'eurusd-tick-*' ) ;

  clear -x ii_vec ii jj first_days last_days first_and_last_weekday_of_month

  endfor ## jj loop

clear -x ii_vec ii first_and_last_weekday_of_month

endfor ## ii_vec

Running the above code results in a folder full of tick data saved in Octave's native binary format, one file per month, with each file's name describing the data contained within.
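
Loading a saved month back into Octave is then a one-liner, with the column layout of M following directly from the fscanf format string above:

load( '-binary' , 'eurusd-2020-1.bin' ) ; ## restores the matrix M
## M columns: year month day hour minute second millisecond askPrice bidPrice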

I hope readers will find this useful.

Friday, 24 May 2024

Using Oanda's API to Place Entry Orders

Since my last post about end of initial testing I have been working on Oanda API functions in Octave to programmatically place entry orders and associated take profit and stop orders for a future possible forex news trading system. The reason for this is simple - it would be next to impossible to manually place a series of entry orders in the last few moments before a news release, so this would have to be done automatically. To this end, I have spent the last few weeks writing a few simple entry functions and testing them in my live trading account with the minimum trading size, i.e. buying and selling 1 Euro in the EURUSD forex pair and observing the subsequent lines at the entry/stop/take profit levels that appear on the live web platform.

The basic schema for this is shown in the following code box, where it can be seen that

body = jsonencode( struct( 'order' , struct( 'units' , num2str( 1 ) , ...
                                              'instrument' , 'EUR_USD' , ...
                                              'timeInForce' , 'FOK' , ...
                                              'type' , 'MARKET' , ...
                                              'trailingStopLossOnFill' , struct( 'distance' , num2str( trail_distance ) , ...
                                                                                  'timeInForce' , 'GTC' , ...
                                                                                  'triggerCondition' , 'MID' ) , ...
                                              'positionFill' , 'DEFAULT' ) ) )

account_header = ['curl -X POST -H "Content-Type: application/json" -H "Authorization: Bearer TOKEN"'] ;

submit_order = [ account_header , ' "https://api-fxtrade.oanda.com/v3/accounts/ACCOUNT/orders"' , ' -d ' , "'" , body , "'" ] ;

[ ~ , ret_JSON ] = system( submit_order , RETURN_OUTPUT = 'TRUE' ) ;

a JSON object containing the order details is created, HTTP headers with account information are added, and then the order is dispatched via a system call to the cURL executable.

A more complete Octave function example is shown next. This is a buy on a stop entry function which also sets a stop loss and take profit target level on being filled, and there is also some basic input checking.

function [ ret_JSON ] = buy_stop_entry_with_stoploss_and_takeprofit( cross , no_of_units , entry_price_level , stop_level , take_profit_level )

## some basic checks
if ( entry_price_level <= stop_level )
   error( 'Stop Level is not below Entry Level.' ) ;
endif

if ( entry_price_level >= take_profit_level )
   error( 'Take Profit Level is not above Entry Level.' ) ;
endif

account_header = ['curl -X POST -H "Content-Type: application/json" -H "Authorization: Bearer TOKEN"'] ;

body = jsonencode( struct( 'order' , struct( 'type' , 'STOP' , ...
                                              'instrument' , toupper( cross ) , ...
                                              'units' , num2str( abs( no_of_units ) ) , ...
                                              'price' , num2str( entry_price_level ) , ...
                                              'stopLossOnFill' , struct( 'price' , num2str( stop_level ) , ...
                                                                         'timeInForce' , 'GTC' ) , ...
                                              'takeProfitOnFill' , struct( 'price' , num2str( take_profit_level ) ) , ...
                                              'timeInForce' , 'GTC' , ...
                                              'triggerCondition' , 'MID' , ...
                                              'positionFill' , 'DEFAULT' ) ) ) ;

submit_order = [ account_header , ' -d ' , "'" , body , "'" , ' "https://api-fxtrade.oanda.com/v3/accounts/ACCOUNT/orders"' ] ;

[ ~ , ret_JSON ] = system( submit_order , RETURN_OUTPUT = 'TRUE' ) ;

ret_JSON = jsondecode( ret_JSON ) ;

endfunction

I won't spend much time explaining the contents of the JSON body as readers can find more information about this in Oanda's online documentation. However, there is one important thing I would note here, and that is the key/value pair

 'triggerCondition' , 'MID'

The default value for this is the bid/ask price for sells/buys which, in the case of a news trading system, could be problematic because the spread may very well widen prior to a news release and trigger an entry without the underlying price actually having moved to the entry level, or even before the news is released. By setting the trigger condition to 'MID' a trade will only be entered when the mid-price hits the entry level. The trade-off in this choice is summarised thus:

  • if the 'default' value is used, entries on "good" trades will be much closer to the entry level, on average, but at the possible expense of far more false entries and therefore losing trades, versus:
  • if the 'MID' value is used, there will possibly be fewer false entries, but at the expense of a worse entry price for "good" trades.
This is a trade-off that will have to be investigated/tested in due course.

Saturday, 11 May 2024

End of Initial Tests of Trading Forex News Announcements

Following on from my previous post I have completed the same tests as outlined in that post on other currencies and the summary results are:

  • USD - an average of 0.38% return per trade
  • EUR - an average of 0.22% return per trade
  • GBP - an average of 0.77% return per trade
  • CHF - an average of 2.05% return per trade
  • JPY - an average of 0.18% return per trade
  • AUD - an average of 1.01% return per trade

Since these seem to be profitable across the board I deem it worthwhile to continue investigating some sort of news trading system. However, that said, there is a huge caveat regarding fills which has recently been pointed out by Terberh Strategy in a comment on the previous post, namely the withdrawal of liquidity immediately prior to these news releases. I am not yet sure how I can account for this in future testing, and it may be that this lack of liquidity could in fact make it almost impossible to accurately test any such system without making some consequential assumptions.
 

Sunday, 28 April 2024

Initial Test of Trading Forex News Announcements

My first test of trading forex news announcements is to test the efficacy of breakouts immediately following a news announcement related to the US dollar, specifically only the high impact news shown in red on the forexfactory calendar. The intention would be to capture some of the profit available from the big movements resulting from surprise news or simply market manipulation around these news events.

Rather than conduct a standard back test of a specific set of entry/exit rules to produce a single equity curve and test metrics, I decided to conduct a Monte Carlo simulation of R multiple returns, given that a news breakout occurred, across the following major forex pairs with USD as one half of the pair, i.e. EUR-USD, GBP-USD, USD-CHF, USD-JPY, AUD-USD, NZD-USD and USD-CAD. I assumed that all the above pairs would collectively constitute one trade in the USD with a 1% risk in total, so each pair was allocated 1/7th of 1% risk R multiple for each and every USD news announcement.

Whether a breakout occurred or not, the R multiple risk, and whether or not the trade would have been ultimately profitable were independently simulated as follows for each of the above pairs:

  1. On the 10 minute OHLC bar close immediately prior to the time of the news announcement a simulated buy order and a sell order were placed a distance of 1x the close to close variance above and below the bar close, this variance being determined by the output of my kalman_ema function
  2. The 1/7th of 1% risk R multiple was taken as the distance between these two entry orders, with each entry order having an attached protective stop-loss at the other entry order's level.
  3. On the next 10 minute OHLC bar, assumed to be the bar that should show the reaction to the news, if neither of the entry orders are hit it is assumed that no trade would have taken place. This would be functionally equivalent to cancelling the entry orders 10 minutes after placing them if no news reaction in price occurs.
  4. If the next 10 minute bar hits both entry levels, it is assumed that a whipsaw trade would have occurred and this is booked as a -1R losing trade. In all the simulations that follow, this trade will always be a -1R loss.
  5. If neither of the conditions in 3 or 4 above are met, it follows that one of the entry levels would have been hit and not stopped out on the entry bar. In this case, the maximum favourable excursion (MFE) of the high (for long entries) or low (for short entries) over the next 24 10min OHLC bars (4 hours) is recorded in terms of its R multiple value.
  6. Simultaneously with 5 above, it is recorded whether or not at any time in this forward looking 24 bar period this trade's entry protective stop level would have been breached. 
  7. The simulation now starts: all -1R trades from step 4 are kept as -1R losing trades.
  8. All trades that are flagged as having hit the stop level from step 6 have a random 50% chance of being booked as a -1R loss. This simulates being stopped out before the MFE is reached, or alternatively, completely messing up and missing a take profit opportunity and then riding the trade to a loss.
  9. All trades flagged from step 6 that are not booked as a -1R loss in step 8 have their MFE randomly multiplied by a value on the interval 0 to 1 to simulate being stopped out with a trailing stop. This also applies to trades from step 5 that do not hit stops identified in step 6. During the simulation this averages out to all profitable trades only achieving, on average, 50% of the maximum possible profit. (A code sketch of steps 7 to 9 follows the charts below.)
  10. All the trade results are cumulated into a total R multiple profit/loss across all the forex pairs per news announcement and then a percentage return equity curve is calculated and plotted, an example of which is shown next, with a log scaled y-axis and a thousand Monte Carlo replications. 

This following chart is the accompanying drawdown chart to the above equity curves chart, expressed as a percentage drawdown of the on-going, equity curve high water mark on the y-axis.
Taking the nth root of the average equity curve ending value, where n is the number of trades, the average expected, cumulated R multiple return per news announcement is approximately 0.38R profit per 1R risk.
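
As promised, here is a minimal Octave sketch of the randomised exit logic of steps 7 to 9, with hypothetical inputs: whipsaw is a logical vector flagging the -1R whipsaw trades of step 4, hit_stop flags trades whose stop was breached in the forward 24 bar window (step 6), and mfe holds the R multiple maximum favourable excursions of step 5:

n_trades = numel( mfe ) ;
R = zeros( n_trades , 1 ) ;
R( whipsaw ) = -1 ; ## step 7: whipsaw trades are kept as -1R losses
## step 8: a random 50% of stop-flagged trades are booked as -1R losses
stopped = hit_stop & ~whipsaw & ( rand( n_trades , 1 ) < 0.5 ) ;
R( stopped ) = -1 ;
## step 9: all remaining trades keep a random fraction of their MFE
winners = ~whipsaw & ~stopped ;
R( winners ) = mfe( winners ) .* rand( sum( winners ) , 1 ) ;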
 
The above was not intended to be a test of a specific rule set per se, but rather a test of whether or not attempting to trade forex news announcements could be profitable. What the above shows is that essentially random exits could be profitable exits for a news breakout system, and so the assumption must be that intelligent exits, either take profit or stop loss, coupled with breakouts would make a viable trading system.
 
More in due course. 


Friday, 19 April 2024

Trading Forex News

This post, as the title suggests, is about trading forex news releases and, incidentally, includes a small update to the appearance of my PositionBook chart and OrderLevels chart.

I recently came across this forexfactory post which shows how to download the underlying data for the forexfactory calendar, a screenshot of which is shown immediately below, and I thought I would look into the idea of trading around forex news releases. If you look carefully at the screenshot you will see that at 2.30pm (CET) there was a high impact (red folder icon) news release regarding the Canadian dollar (CAD) which came in under expectations. The following OrderLevels chart and PositionBook chart clearly show the big move that immediately followed this news release (CAD weakness). The OrderLevels chart also shows the accumulation of sell orders (red background colour) that would have been an almost perfect take profit level for the day, whilst the PositionBook chart shows the accumulation of long positions (blue background colour) during the sideways movement that preceded the news release. These two charts both show the above mentioned appearance update resulting from the use of the b2r colormap function.
 
The following "overview" chart shows that the big move in the USDCAD forex pair was
definitely the result of this CAD weakness rather than USD strength (see the two rightmost currency strength charts).

Having finally "scratched the itch" of getting my PositionBook charts sorted out, my next project is to investigate the possibility of creating a forex news release trading methodology.
 
More in due course.

Tuesday, 9 April 2024

A "New" Use for Kalman Filter on Price Time Series?

During the course of writing this blog I have visited the idea of using Kalman filters several times, most recently in this February 2023 post. My motivation in these previous posts could best be described as trying to smooth price data with as little lag as possible, i.e. to create a zero-lag indicator. In doing so, the model most often used for the Kalman filter was a physical motion model with position, velocity and acceleration components. Whilst these "worked" in the sense of smoothing the underlying data, such a model is not necessarily a good one to use on financial data because, obviously, financial data is not generated by a physical system, and so I thought I would apply the Kalman filter to something that is ubiquitously used on financial data - the exponential moving average.

Below is an Octave function to calculate a "Kalman_ema" where the prediction part of the filter is just a linear extrapolation of an exponential moving average and then this extrapolated value is used to calculate the projected price, with the measurement being the real price and its ema value.

## Copyright (C) 2024 dekalog
##
## This program is free software: you can redistribute it and/or modify
## it under the terms of the GNU General Public License as published by
## the Free Software Foundation, either version 3 of the License, or
## (at your option) any later version.
##
## This program is distributed in the hope that it will be useful,
## but WITHOUT ANY WARRANTY; without even the implied warranty of
## MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE.  See the
## GNU General Public License for more details.
##
## You should have received a copy of the GNU General Public License
## along with this program.  If not, see <https://www.gnu.org/licenses/>.

## -*- texinfo -*-
## @deftypefn {} {@var{retval} =} kalman_ema (@var{price}, @var{lookback})
##
## @seealso{}
## @end deftypefn

## Author: dekalog 
## Created: 2024-04-08

function [ filter_out , P_out ] = kalman_ema ( price , lookback )

## check price is row vector
if ( size( price , 1 ) > 1 && size( price , 2 ) == 1 )
 price = price' ;
endif

P_out = zeros( numel( price ) , 2 ) ;

ema_price = ema( price , lookback )' ;
alpha = 2 / ( lookback + 1 ) ;

## initial covariance matrix
P = eye( 3 ) ;

## transition matrix
A = [ 0 , ( 1 + alpha ) / alpha , -( 1 / alpha ) ; ...
       0 , 2 , -1 ; ...
       0 , 1 , 0 ] ;

## initial Q
Q = eye( 3 ) ;

## measurement vector
Y = [ price ; ...
       ema_price ; ...
       shift( ema_price , 1 ) ] ;
Y( 3 , 1 ) = Y( 2 , 1 ) ;

## measurement matrix
H = eye( 3 ) ;

## measurement noise covariance
R = eye( 3 ) ;

## container for Kalman filter output
filter_out = zeros( size( Y ) ) ;
errors = ones( 3 , 1 ) ;

for ii = 2 : size( Y , 2 )

  X = A * filter_out( : , ii - 1 ) ;
  P = A * P * A' + Q ;

  errors = alpha .* abs( X - Y( : , ii ) ) + ( 1 - alpha ) .* errors ;

  IM = H * X ; ## Mean of predictive distribution
  IS = ( R + H * P * H' ) ; ## Covariance of predictive mean
  K = P * H' / IS ; ## Computed Kalman gain
  X = X + K * ( Y( : , ii ) - IM ) ; ## Updated state mean
  P = P - K * IS * K' ; ## Updated state covariance

  filter_out( : , ii ) = X ;
  P_out( ii , 1 ) = P( 1 , 1 ) ;
  P_out( ii , 2 ) = P( 2 , 2 ) ;

  ## update Q and R
  Q( 1 , 1 ) = errors( 1 ) ; Q( 2 , 2 ) = errors( 2 ) ; Q( 3 , 3 ) = errors( 3 ) ;
  R = Q ;

endfor

filter_out = filter_out' ;

endfunction
Using this for smoothing either the price or the ema has no utility, but a by-product of the filter is the covariance matrix, from which it is possible to plot bands around the price series. The following chart shows the bands for various ema alpha values, corresponding to various Fibonacci sequence lookback lengths.
This next chart, for purposes of clarity, shows the "Golden Cross" lengths of 50 and 200
and this final chart shows an adaptive look back length which is a function of the instantaneous measured period (see here or here) of the underlying data.
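
In code terms, a minimal usage sketch is shown below; the one standard deviation band construction from the diagonal variances in P_out is my illustrative reading of how the charts above can be produced:

[ filter_out , P_out ] = kalman_ema( price , 50 ) ; ## e.g. a 50 bar lookback
upper_band = price( : ) + sqrt( P_out( : , 1 ) ) ; ## one standard deviation bands
lower_band = price( : ) - sqrt( P_out( : , 1 ) ) ;
plot( price( : ) , 'k' , upper_band , 'r' , lower_band , 'r' ) ;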

Despite the wide range of the input look back lengths for the ema, it can be seen that the covariance bands around price are broadly similar. I can think of many uses for such a data driven but basically parameter insensitive measure of price variance, e.g. entry/exit levels, stop levels, position sizing etc.

More in due course.


Thursday, 21 March 2024

Standard "Volume based" Indicators Replaced with PositionBook Data

In my previous post I suggested three different approaches to using PositionBook data other than directly using this data to create new, unique indicators. This post is about the first of the aforementioned ideas: modifying existing indicators that somehow incorporate volume in their construction.

The indicators I've chosen to look at are the Accumulation and Distribution index, On Balance Volume, Money Flow index, Price Volume Trend and, for comparative purposes, an indicator similar to these utilising PositionBook data. For illustrative purposes these indicators have been calculated on this OHLC data,

which shows a 20 minute bar chart of the EUR_USD forex pair. The chart starts at the New York close of 4 January 2024 and ends at the New York close on 5 January 2024. The green vertical lines span 7am to 9am London time and the red lines are 7am to 9am New York time. This second chart shows the indicators individually max-min scaled from zero to one so that they can be more easily visually compared.

 
As in the OHLC chart, the vertical lines are the London and New York opening sessions. The four "traditional" indicators more or less tell the same story, which is not surprising since their calculations involve bar to bar price changes or open to close intra-bar changes which are then multiplied by bar volume. Effectively they are all just differently scaled versions of the underlying price movement, or alternatively, just accumulated sums of different fractions of the bar volume. The PositionBook data version, called Pos Change Ind, does not use any OHLC information at all but rather uses the accumulated difference(s) between position changes multiplied by volume. For most of the day the general story told by the Pos Change Ind indicator agrees with the other indicators; however during the big run up which started just about 9am New York time there is a significant difference between Pos Change Ind and the others.

In hindsight, by looking at my order levels chart

and volume profile chart
 
it is easy to speculate about what market participants were thinking during this trading day, especially if the following PositionBook chart is taken into account.
 
For the purpose of the following brief "stream of consciousness" narrative imagine it's 7am New York time and looking back at the day's action so far it can be seen that the downward drift of the day seems to have halted with a mini double bottom, and we are now moving up with some new heavy tick volumes having accumulated over the last hour or so, forming a new volume profile point of control (POC). Over the next hour prices continue the new slight drift up with accumulating long positions and at about 8.30am we see a doji bar form on the 10 minute chart at the level of the rolling vwap for the day. Suddenly there is the big down bar, which could conceivably be a shake-out of the recently added longs, targeting the stop orders below, which finishes with an extended lower wick. This seems to be an ideal set-up for a long trade targeting either the old POC, which also happens to be the current high of the day, or the accumulated orders which happen to coincide with the level at which, currently, the greatest proportion of long positions have been entered. 
 
Of course it can be seen, in hindsight, that this was a great set-up for an intra-day trade that could have caught almost the entire high-low range of the day as a profitable trade, dependent of course on exact entry and exit levels. This set-up is a synthesis of observations from the volume profile chart, the order levels chart and the position levels chart, along with the vwap indicator. The Pos Change Ind indicator does not seem to add much value over that provided by the more traditional, volume based indicators in the set-up phase.

This is not necessarily the case for the exit. It can be seen that the Pos Change Ind indicator turns down sharply several bars before all the other indicators, and this movement in the indicator is evident by the close of the bar with the long upper wick which makes the high of the day. This sharp downturn in the indicator shows that there was a mass exit of longs during the formation of this bar, made clearer by the following chart which shows the two components of the Pos Change Indicator, namely the "Outside Change" and the "Inside Change." The outside change shows the total net position changes for the price levels that lie outside the range of the bar and the inside change is the net change for price levels that lie within the bar range. The greater change of the two is obviously the (red) inside change, and looking at the position levels plot we can see why. The previously mentioned level of "greatest proportion of long positions" suddenly loses that distinction - a large number of the longs at this level obviously liquidated their positions. This is important information, which shows that sentiment favouring long positions obviously changed, and it can be surmised that many long position holders were happy to get out of their trade at more or less break even prices. Also noticeable in the position levels chart is the change in blue shade from darker to lighter at the price levels within the range of the large price run-up. This reduction in colour intensity shows that those traders who entered during the run-up also exited near the top of the move. Taken together these observations could have been used as a nice short set-up targeting, for example, the then currently lower level of vwap, which in fact was subsequently hit with the day closing at this level.

As I had previously suspected, there is value in PositionBook data but it is perhaps tricky to operationalise or to easily automate within a trading system. It can be used to indicate a directional bias, or as above to show when traders exit positions. Again, as shown above, it can be used to put a new, useful twist on existing indicators, but in general it appears that use of this data is primarily visual by way of my PositionBook chart and subsequent, subjective evaluation. Whilst I am pleased with the potential insights provided, I would prefer a more structured, algorithmic use of this data, as in the third point of my previous post.
 
More in due course.


Wednesday, 28 February 2024

Indicator(s) Derived from PositionBook Data

Since my last post I have been trying to create new indicators from PositionBook data, but unfortunately I have had no luck in doing so. I have tried differences, ratios, cumulative sums, logs and control charts, all to no avail, and I have decided to discontinue this line of investigation because it doesn't seem to hold much promise. The only other direct uses I can think of for this data are:

I am not yet sure which of the above I will look at next, but whichever it is will be the subject of a future post.

Wednesday, 22 November 2023

Update to PositionBook Chart - Revised Optimisation Method

Just over a year ago I previewed a new chart type which I called a "PositionBook Chart" and gave examples in this post and this one. These first examples were based on an optimisation routine over 6 variables using Octave's fminunc function, an unconstrained minimisation routine. However, I was not 100% convinced that the model I was using for the loss/cost function was realistic, and so since the above posts I have been further testing different models to see if I could come up with a more satisfactory model and optimisation routine. The comparison between the original model and the better, newer model I have selected is indicated in the following animated GIF, which shows the last few day's action in the GBPUSD forex pair. 

The old model is figure(200), with the darker blue "blob" of positions accumulated at the lower, beginning of the chart, and the newer model, figure(900), shows accumulation throughout the uptrend. The reasons I prefer this newer model are:

  • 4 of the 6 variables mentioned above (longs above and below price bar range, and shorts above and below price bar range) are theoretically linked to each other to preserve their mutual relationships and jointly minimised over a single input to the loss/cost function, which has a bounded upper and lower limit. This means I can use Octave's fminbnd function instead of fminunc. The minimisation objective is the minimum absolute change in positions outside the price bar range, which has a real world relevance as compared to the mean squared error of the fminunc cost function.
  • because fminunc is "unconstrained," occasionally it would converge to unrealistic solutions with respect to position changes outside the price bar range. This does not happen with the new routine.
  • once the results of fminbnd are obtained, it is possible to mathematically calculate the position changes within the price bar range exactly, without needing to resort to any optimisation routine. This gives a zero error for the change which is arguably the most important.
  • the results from the new routine seem to be more stable in that indicators I am trying to create from them are noticeably less erratic and confusing than those created from fminunc results.
  • finally, fminbnd over 1 variable is much quicker to converge than fminunc over 6 variables.
The second to last point mentioned above, derived indicators, will be the subject of my next post.

Sunday, 20 August 2023

Currency Strength Revisited

Recently I responded to a Quantitative Finance forum question here, where I invited the questioner to peruse certain posts on this blog. Apparently the posts do not provide enough information to fully answer the question (my bad) and therefore this post provides what I think will suffice as a full and complete reply, although perhaps not scientifically rigorous.

The original question asked was "Is it possible to separate or decouple the two currencies in a trading pair?" and I believe what I have previously described as a "currency strength indicator" does precisely this (blog search term ---> https://dekalogblog.blogspot.com/search?q=currency+strength+indicator). This post outlines the rationale behind my approach.

Take, for example, the GBPUSD forex pair, and further give it a current (imaginary) value of 1.2500. What does this mean? Of course it means 1 GBP will currently buy you 1.25 USD, or alternatively 1 USD will buy you 1/1.25 = 0.8 GBP. Now rather than write GBPUSD let's express GBPUSD as a ratio thus:- GBP/USD, which expresses the idea of "how many USD are there in a GBP?" in the same way that 9/3 shows how many 3s there are in 9. Now let's imagine at some time period later there is a new pair value, a lower case "gbp/usd" where we can write the relationship

                    (1)     ( GBP / USD ) * ( G / U ) = gbp / usd

to show the change over the time period in question. The ( G / U ) term is a multiplicative term to show the change in value from old GBP/USD 1.2500 to say new value gbp/usd of 1.2600, 

e.g.                ( G / U ) == ( gbp / usd ) / ( GBP / USD ) == 1.26 / 1.25 == 1.008

from which it is clear that the forex pair has increased by 0.8% in value over this time period. Now, if we imagine that over this time period the underlying, real value of USD has remained unchanged this is equivalent to setting the value U in ( G / U ) to exactly 1, thereby implying that the 0.8% increase in the forex pair value is entirely attributable to a 0.8% increase in the underlying, real value of GBP, i.e. G == 1.008. Alternatively, we can assume that the value of GBP remains unchanged,

 e.g.                G == 1, which means that U == 1 / 1.008 == 0.9921

which implies that a ( 1 - 0.9921 ) == 0.79% decrease in USD value is responsible for the 0.8% increase in the pair quote.

Of course, given only equation (1) it is impossible to solve for G and U as either can be arbitrarily set to any number greater than zero and then be compensated for by setting the other number such that the constant ( G / U ) will match the required constant to account for the change in the pair value.

However, now let's introduce two other forex pairs (2) and (3) and thus we have:-

                    (1)     ( GBP / USD ) * ( G / U ) = gbp / usd

                    (2)     ( EUR / USD ) * ( E / U ) = eur / usd

                    (3)     ( EUR / GBP ) * ( E / G ) = eur / gbp

We now have three equations and three unknowns, namely G, E and U, and so this system of equations could be laboriously solved by mathematical substitution.

However, in my currency strength indicator I have taken a different approach. Instead of solving mathematically, I have written an error function which takes as arguments a list of G, E, U, ... etc. currency multipliers for all the forex quotes I have access to (approximately 47 various crosses, which themselves are inputs to the error function), and this function is supplied to Octave's fminunc function to simultaneously solve for all G, E, U, ... etc. given all forex market quotes. The initial starting values for all G, E, U, ... etc. are 1, implying no change in values across the market. These starting values consistently converge to the same final values of G, E, U, ... etc. for each separate period's optimisation iterations.
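
To make this concrete, here is a minimal three-pair sketch of the idea (the quote values are invented for illustration, and my actual error function runs over all of the approximately 47 crosses rather than just these three):

## observed pair value changes over the period, new quote / old quote
r_gbpusd = 1.2600 / 1.2500 ;
r_eurusd = 1.0810 / 1.0800 ;
r_eurgbp = 0.8579 / 0.8640 ;
## x = [ G ; E ; U ], all starting at 1, i.e. no change across the market
err_fun = @( x ) ( x( 1 ) / x( 3 ) - r_gbpusd )^2 + ...
                 ( x( 2 ) / x( 3 ) - r_eurusd )^2 + ...
                 ( x( 2 ) / x( 1 ) - r_eurgbp )^2 ;
x = fminunc( err_fun , ones( 3 , 1 ) ) ;
decoupled_log_returns = log( x ) ; ## as per equation (5) below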

Having got all G, E, U, ... etc. what can be done? Well, taking G for example, we can write

                    (4)     GBP * G = gbp

for the underlying, real change in the value of GBP. Dividing each side of (4) by GBP and taking logs we get

                    (5)     log( G ) = log( gbp / GBP )

i.e. the log of the fminunc returned value for the multiplicative constant G is the equivalent of the log return of GBP independent of all other currencies, or, as the original forum question asked, the (change in) value of GBP separated or decoupled from the pair in which it is quoted.

Of course, having the individual log returns of separated or decoupled currencies, there are many things that can be done with them, such as:-

  • create indices for each currency
  • apply technical analysis to these separate indices
  • intermarket currency analysis
  • input to machine learning (ML) models
  • possibly create new and unique currency indicators

Examples of the creation of "alternative price charts" and indices are shown below

where the black line is the actual 10 minute closing prices of GBPUSD over the last week (13th to 18th August) with the corresponding GBP price (blue line) being the "alternative" GBPUSD chart if U is held at 1 in the ( G / U ) term and G allowed to be its derived, optimised value, and the USD price (red line) being the alternative chart if G is held at 1 and U allowed to be its derived, optimised value.

This second chart shows a more "traditional" index like chart

where the starting values are 1 and both the G and U values take their derived values. As can be seen, over the week there was upwards momentum in both the GBP and USD, with the greater momentum being in the GBP resulting in a higher GBPUSD quote at the end of the week. If, in the second chart the blue GBP line had been flat at a value of 1 all week, the upwards momentum in USD would have resulted in a lower week ending quoted value of GBPUSD, as seen in the red USD line in the first chart. Having access to these real, decoupled returns allows one to see through the given, quoted forex prices in the manner of viewing the market as though through X-ray vision. 

I hope readers find this post enlightening, and if you find some other uses for this idea, I would be interested in hearing how you use it.
 

Tuesday, 28 February 2023

Kalman Filter and Sensor Fusion.

In the Spring of 2012 and again in the Spring of 2019 I posted a series of posts about the Kalman filter, which readers can access via the blog archive on the right. In both cases I eventually gave up those particular lines of investigation because of disappointing results. This post is the first in a new series about using the Kalman filter for sensor fusion, which I had known of before but had never really investigated, due to the paucity of clear information about it online. However, my recent discovery of this GitHub repository and its associated online tutorial has inspired me to a third attempt at using Kalman filters. What I am going to attempt to do is use the idea of sensor fusion to fuse the outputs of several functions I have coded in the past, which each extract the dominant cycle from a time series, to hopefully obtain a better representation of the "true underlying cycle."

The first step in this process is to determine the measurement noise covariance or, in Kalman Filter terms, the "R" covariance matrix. To do this, I have used the average of two of the outputs from the above mentioned functions to create a new cycle and similarly used two extracted trends (price minus these cycles) averaged to get a new trend. The new cycle and new trend are simply added to each other to create a new price series which is almost identical to the original price series. The screenshot below shows a typical cycle extract,

where the red cycle is the average of the other two extracted cycles, and this following screenshot shows the new trend in red plus the new price alongside the old price (blue and black respectively).
Having created a time series thus with known trend and cycle, it is a simple matter to run my cycle extractor functions on this new price, compare the outputs with the known cyclic component of price and calculate the variance of the errors to get the R covariance matrices for 14 different currency crosses.
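
In code terms this amounts to something like the following minimal sketch, where the extractor function names are placeholders for my actual dominant cycle functions:

## the variance of each extractor's error against the known cycle gives R's diagonal
cycle_1 = cycle_extractor_1( new_price ) ; ## placeholder extractor names
cycle_2 = cycle_extractor_2( new_price ) ;
R = diag( [ var( cycle_1 - known_cycle ) , var( cycle_2 - known_cycle ) ] ) ;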

More in due course.

 

Friday, 18 November 2022

PositionBook Chart Example Trade

As a quick follow up to my previous post I thought I'd show an example of how one could possibly use my new PositionBook chart as a trade set-up. Below is the USD_CHF forex pair for the last two days

showing the nice run-up yesterday and then the narrow range of Friday's Asian session.

The tentative set-up idea is to look for such a narrow range and use the colour of the PositionBook chart in this range (blue for a long) to catch or anticipate a breakout. The take profit target would be the resistance suggested by the horizontal yellow bar in the open orders chart (overhead sell orders) more or less at Thursday's high.

I decided to take a really small punt on this idea but took a small loss of 0.0046 GBP, as indicated in the above Oanda trade app. I entered too soon and perhaps should have waited for confirmation (I can see a doji bar on the 5 minute chart just after my stop out) or had the conviction to re-enter the trade after this doji bar. The initial trade idea seems to have been sound as the profit target was eventually hit. This could have been a nice 4/5/6 R-multiple profitable trade.😞

Friday, 11 November 2022

A New PositionBook Chart Type

It has been almost 6 months since I last posted, due to working on a house renovation. However, I have still been thinking about/working on stuff, particularly on analysis of open position ratios. I had tried using this data as features for machine learning, but my thinking has evolved somewhat and I have reduced my ambition/expectation for this type of data.

Before I get into this I'd like to mention Trader Dale (I have no affiliation with him) as I have recently been following his volume profile set-ups, a screenshot of one being shown below.

This shows recent Wednesday action in the EUR_GBP pair on a 30 minute chart. The flexible volume profile set-up Trader Dale describes is called a Volume Accumulation Set-up which occurs immediately prior to a big break (in this case up). The whole premise of this particular set-up is that the volume accumulation area will be future support, off of which price will bounce, as shown by the "hand drawn" lines. Below is shown my version of the above chart
with a bit of extra price action included. The horizontal yellow lines show the support area.

Now here is the same data, but in what I'm calling a PositionBook chart, which uses Oanda's Position Level data downloaded via their API.

The blue (red) horizontal lines show the levels at which traders are net long (short) in terms of positions actually entered/held. The brighter the colours the greater the difference between the longs/shorts. It is obvious that the volume accumulation set-up area is showing a net accumulation of long positions and this is an indication of the direction of the anticipated breakout long before it happens. The Trader Dale set-up presumes an accumulation of longs because of the resultant breakout direction and doesn't seem to provide an opportunity to participate in the breakout itself!

The next chart shows the action of the following day and a bit, where the price does indeed come back down to the "support" area but doesn't result in an immediate bounce off the support level. The following order level chart perhaps shows why there was no bounce: the relative absence of open orders at that level.

The equivalent PositionBook chart, including a bit more price action,
shows that after price fails to bounce off the support level it recovers back into it, and then even more long positions are accumulated (the darker blue shade) at the support level during the London open, again allowing one to position oneself for the ensuing rise during the London morning session. This was followed by another long accumulation during the New York opening session for a further leg up into the London close (the last vertical red line).

The purpose of this post is not to criticise the Trader Dale set-up but rather to highlight the potential value-add of these new PositionBook charts. They seem to hold promise for indicating price direction and I intend to continue investigating/improving them in the coming weeks.

More in due course.