
Friday, 27 August 2021

Another Iterative Improvement of my Volume/Market Profile Charts

Below is a screenshot of this new chart version, of today's (Friday's) price action at a 10 minute bar scale:

Just by looking at the chart it might not be obvious to readers what has changed, so the changes are detailed below.

The first change is in how the volume profile (the horizontal histogram on the left) is calculated. The "old" version of the chart calculates the profile by assuming the "model" that tick volume for each 10 minute bar is normally distributed across the high/low range of the bar, and then the profile histogram is the accumulation of these individual, 10 minute, normally distributed "mini profiles." A more complete description of this is given in my Market Profile Chart in Octave blog post, with code.

The new approach is data-centric rather than model-based. Every 10 minutes, instead of downloading the 10 minute OHLC and tick volume, the last 10 minutes' worth of 5 second OHLC and tick volume is downloaded. The whole tick volume of each 5 second period is assigned to a price level equal to the Typical price (rounded to the nearest pip) of that 5 second period, and the volume profile is then the accumulation of these volume ticks per price level. I think this is a much more accurate reflection of the price levels at which tick volume actually occurred than the old, model-based charts give. This second screenshot is of the old chart over the exact same price data as the first, improved version of the chart.
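As a minimal sketch of this accumulation (the variable names are illustrative and assume the 5 second data has already been downloaded into column vectors):

## assume high_5s, low_5s, close_5s and vol_5s are column vectors of 5 second data,
## and tick_size is one pip (e.g. 0.0001)
typical = ( high_5s .+ low_5s .+ close_5s ) ./ 3 ;              ## typical price of each 5 second bar
typical = floor( typical ./ tick_size .+ 0.5 ) .* tick_size ;   ## rounded to the nearest pip
y_ax = ( min( typical ) : tick_size : max( typical ) )' ;       ## price grid for the session
profile = zeros( size( y_ax ) ) ;
for ii = 1 : numel( typical )
 [ ~ , level_ix ] = min( abs( y_ax .- typical( ii ) ) ) ;       ## nearest grid level
 profile( level_ix ) += vol_5s( ii ) ;                          ## whole 5 second tick volume to that level
endfor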

It can be seen that the two volume profile histograms of the respective charts differ from each other in terms of their overall shape and the number and price levels of peaks (Points of Control) and troughs (Low Volume Nodes).

The second change in the new chart is in how the background heatmap is plotted. The heatmap is a different presentation of the volume profile whereby higher volume price levels are shown by the brighter yellow colours. The old chart only displays the heatmap associated with the latest calculated volume profile histogram, which is projected back in time. This is, of course, a form of lookahead bias when plotting past prices over the latest heatmap. The new chart solves this by plotting a "rolling" version of the heatmap which reflects the volume profile that was in force at the time each 10 minute OHLC candle formed. It is easy to see how the Points of Control and Low Volume Nodes price levels ebb and flow throughout the trading day.
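In code terms the rolling heatmap is nothing more than a cumulative sum of the individual bar profiles; a minimal sketch, assuming the per-bar mini profiles are stacked as the rows of a matrix bar_profiles (an illustrative name):

## row ii of the cumulative sum is the profile in force when bar ii closed
rolling_heatmap = cumsum( bar_profiles , 1 ) ;
rolling_heatmap = rolling_heatmap ./ max( rolling_heatmap , [] , 2 ) ; ## scale each row to 0-1 for the colormap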

The third change, which naturally followed on from the downloading of 5 second data, is in the plotting of the candlesticks. Rather than having a normal, open to close candlestick body, the candlesticks show the "mini volume profiles" of the tick volume within each bar, plotted via Octave's patch function. The white candlestick wicks indicate the usual high/low range, and the open and close levels are shown by grey and black dots respectively. This is more clearly seen in the zoomed in screenshot below.
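For the curious, a rough sketch of how one such mini profile body might be drawn with Octave's patch function follows; all the variable names are illustrative and this is not my exact plotting code:

## bar_levels and bar_profile are column vectors of the price levels spanned by one bar and
## the within-bar tick volume at each level; ix is the bar's position on the x-axis
half_width = 0.4 .* bar_profile ./ max( bar_profile ) ;   ## scale widths to a fraction of the bar spacing
patch( [ ix .- half_width ; flipud( ix .+ half_width ) ] , ...
       [ bar_levels ; flipud( bar_levels ) ] , [ 0.6 0.6 0.6 ] ) ;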

I wanted to plot these types of bars because recently I have watched some trading webcasts, which talked about "P", "b" and "D" shaped bar profiles at "areas of interest." The upshot of these webcasts is that, in general, a "P" bar is bullish, a "b" is bearish and a "D" is "in balance" when they intersect an "area of interest" such as Point of Control, Low Volume Node, support and resistance etc. This is supposed to be indicative of future price direction over the immediate short term. With this new version of chart, I shall be in a position to investigate these claims for myself.

Saturday, 16 May 2020

A Comparison of Charts

Earlier in May I posted about Market Profile with some charts and video. Further work on this has made me realise that my earlier post should more accurately be described as Volume Profile, so apologies to readers for that.

Another, similar type of chart I have seen is described as a TPO chart (TPO standing for 'That Price Occurred', or ticked). It is a simple matter to extend the code in the above linked post to create a TPO chart, and below is the Octave function I have written to produce the backgrounds for both types of plot.
## Copyright (C) 2020 dekalog
## 
## This program is free software: you can redistribute it and/or modify it
## under the terms of the GNU General Public License as published by
## the Free Software Foundation, either version 3 of the License, or
## (at your option) any later version.
## 
## This program is distributed in the hope that it will be useful, but
## WITHOUT ANY WARRANTY; without even the implied warranty of
## MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE.  See the
## GNU General Public License for more details.
## 
## You should have received a copy of the GNU General Public License
## along with this program.  If not, see <https://www.gnu.org/licenses/>.

## -*- texinfo -*- 
## @deftypefn {} {@var{vp_background}, @var{mp_background} =} pcolor_background (@var{y_ax}, @var{high}, @var{low}, @var{vol}, @var{tick_size})
##
## Creates the matrices VP_BACKGROUND and MP_BACKGROUND, to be used by Market Profile
## plotting functions, which need a colour background matrix to be plotted by pcolor.
## 
## @seealso{}
## @end deftypefn

## Author: dekalog 
## Created: 2020-05-13

function [ vp_background , mp_background ] = pcolor_background ( y_ax , high , low , vol , tick_size )

vp_z = zeros( 1 , numel( y_ax ) ) ; mp_z = vp_z ;
vol( vol <= 1 ) = 2 ; ## no single point vol distributions

for ii = 1 : numel( high )

## the volume profile, vp_background
ticks = norminv( linspace(0,1,vol(ii)+2) , (high(ii) + low(ii))/2 , (high(ii) - low(ii))*0.25 ) ;
ticks = floor( ticks( 2 : end - 1 ) ./ tick_size .+ 0.5 ) .* tick_size ;
unique_ticks = unique( ticks ) ;

if ( numel( unique_ticks ) > 1 )
[ N , X ] = hist( ticks , unique( ticks ) ) ;
[ ~ , N_ix ] = max( N ) ; tick_ix = X( N_ix ) ;
[ ~ , centre_tick ] = min( abs( y_ax .- tick_ix ) ) ;
vp_z(1,centre_tick-N_ix+1:centre_tick+(numel(N)-N_ix)) = vp_z(1,centre_tick-N_ix+1:centre_tick+(numel(N)-N_ix)).+ N ;
elseif ( numel( unique_ticks ) == 1 )
[ ~ , centre_tick ] = min( abs( y_ax .- unique_ticks ) ) ;
vp_z( 1 , centre_tick ) = vp_z( 1 , centre_tick ) + vol( ii ) ;
endif

## the market profile, mp_background
[ ~ , ix_high ] = min( abs( y_ax .- high( ii ) ) ) ;
[ ~ , ix_low ] = min( abs( y_ax .- low( ii ) ) ) ;
mp_z( 1 , ix_low : ix_high ) = mp_z( 1 , ix_low : ix_high ) .+ 1 ;

endfor

vp_background = repmat( vp_z , numel( high ) , 1 ) ;
mp_background = repmat( mp_z , numel( high ) , 1 ) ;

endfunction
I have elected to still call the TPO plot a Market Profile plot as, from what I can make out, the tick count of the TPO is intended to be a surrogate for the original, cleared volume of Market Profile.

The above function is intended to provide a matrix input for the pcolor function, which internally scales the matrix to 0-1. Another idea I have had is to multiply the Volume Profile matrix and the Market Profile matrix together to get a normalised Combined Profile matrix. The animated GIF below shows all three.
It can be seen that there are subtle differences between them but that, on the whole, the results are similar.
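By way of illustration, the Combined Profile mentioned above is nothing more than an element-wise product of the two returned matrices, rescaled before plotting:

## element-wise product of the two profile matrices, rescaled to 0-1 before being passed to pcolor
combined_profile = vp_background .* mp_background ;
combined_profile = combined_profile ./ max( combined_profile( : ) ) ;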

More in due course.

Wednesday, 11 June 2025

A Replacement for my PositionBook Charts using Tick Volumes?

At the end of my previous post I said that I would be looking into using tick volume to create a new indicator, and this post is about the work I have done on this idea.

At first I tried creating a more traditional type of indicator using tick volumes separated out into buy and sell volumes, but I quickly felt that this was not a useful investment of my time so I gave up on this idea. Instead, I have come up with a way to plot tick volume levels that is similar to my previously discussed PositionBook chart type, which I was forced to give up on because Oanda have discontinued the API endpoint for downloading the required data. An example of the new, tick volume equivalent is shown below, followed by a brief description of the methodology used to create it.

Starting with the basics, if we imagine a Doji bar, for every up (down) tick there must be a corresponding and opposing down (up) tick for the bar to open and close at the exact same price and therefore we can split the tick volume for the bar equally between buy and sell tick volumes. Similarly, if we imagine a bar that opens on the low (high) and closes on the high (low), the number of ticks within the entire bar tick range can be ascribed to buy (sell) volume and the balance divided between buy and sell.

e.g. a bar opens at the low, closes at the high, with a tick range of 10 and a total tick volume of 50, then:

  • buy tick volume = 10 + ( 50 - 10)/2 = 30
  • sell tick volume = 50 - buy tick volume = 20

This idea can be generalised to the range of a candlestick body being appropriately allocated to buy or sell tick volume, with the remaining balance of the total bar volume being equally allocated to buy and sell. OK, so far so good, and nothing particularly ground breaking. For the want of explaining it in a more precise manner, using the "geometry" of a bar to allocate buy and sell volumes is something that can be found online in the formulation of more than a few indicators.
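As a minimal sketch of this split for a single bar (the scalars open, close, total_vol and tick_size are illustrative names, assumed to be available):

## geometric buy/sell split for a single bar (all quantities in ticks)
body_ticks = round( abs( close - open ) / tick_size ) ;  ## size of the candlestick body in ticks
balance = ( total_vol - body_ticks ) / 2 ;               ## remaining volume shared equally
if ( close >= open )                                     ## up bar: body ascribed to buyers
 buy_vol = body_ticks + balance ;
 sell_vol = balance ;
else                                                     ## down bar: body ascribed to sellers
 sell_vol = body_ticks + balance ;
 buy_vol = balance ;
endif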

The next step is to "smear" these buy and sell volumes equally across the whole range of the bar and then take the difference:

e.g. "smeared" buy - "smeared" sell = 30 / 10 - 20 / 10 = 1, thus allocate a tick difference value of +1 for each tick level within the 10 tick range of the bar. 

Of course, over a large (e.g. 10 minute) bar this wouldn't necessarily be very informative, as it is known (see my volume profile bars) that volume is usually unevenly spread across the range of any given bar. The solution is to apply the above methodology to the smallest bar possible, and with Oanda the smallest possible bar download is a 5 second bar. Thus what I have done is apply the above to each 5 second bar within a given 10 minute bar period and then accumulate the buy/sell tick difference values across the individual tick levels within the 10 minute bar. This gives tick difference values that approximate the differences between each bar's separate buy and sell volume profiles.
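A rough sketch of this accumulation over the 5 second bars of one 10 minute period, assuming the buy/sell splits above have already been computed into vectors buy_vol_5s and sell_vol_5s and that y_ax is the price grid (all names illustrative):

## accumulate the "smeared" tick differences of each 5 second bar onto the price grid y_ax
tick_diff = zeros( size( y_ax ) ) ;
for ii = 1 : numel( buy_vol_5s )
 [ ~ , lo_ix ] = min( abs( y_ax .- low_5s( ii ) ) ) ;
 [ ~ , hi_ix ] = min( abs( y_ax .- high_5s( ii ) ) ) ;
 n_levels = hi_ix - lo_ix + 1 ;
 ## spread buy and sell volumes equally across the bar's range and take the difference
 tick_diff( lo_ix : hi_ix ) += ( buy_vol_5s( ii ) - sell_vol_5s( ii ) ) / n_levels ;
endfor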

The final step is to volume normalise the above calculations by using the total 10 minute bar tick volume such that tick differences within bars that have higher total tick volume have a greater weight than those in low tick volume bars. This is simply done by setting the total bar volume as the numerator and the tick difference as the denominator:

e.g. a tick difference of 2 at a tick level within a bar with a total 10 tick volume will get the weight

  •  10 / 2 = 5

whilst the same tick difference in a 50 tick volume bar will get the weight

  • 50 / 2 = 25
Finally, all the above is plotted as the background heatmap to a candlestick chart, but with a slight twist: exponential forgetting is applied along each individual tick level within the y-axis range. If an individual tick level has only one price bar spanning it, the colour slowly fades as we move along the x-axis, whilst if the level is revisited the more recent data is, just as with an exponential moving average, cumulatively weighted more. For the above plot the exponential alpha value is set at the equivalent of a 144 bar exponential moving average, i.e. the number of 10 minute bars in a 24 hour period. Shorter moving average equivalents increase the speed at which the forgetting takes place, leading to shorter lines extending to the right, but accentuate the differences between levels; e.g. the following is the same plot as above with the equivalent of a 14 bar exponential moving average alpha value.
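In code terms the forgetting is just a per-level exponential moving average update applied column by column along the x-axis; a minimal sketch (heatmap and new_col are illustrative names):

## heatmap is an (n_levels x n_bars) matrix and new_col the normalised tick difference column
## for the latest 10 minute bar (zeros at levels the bar does not span); jj (>= 2) indexes the bars
alpha = 2 / ( 144 + 1 ) ;                                             ## 144 bar EMA equivalent
heatmap( : , jj ) = ( 1 - alpha ) .* heatmap( : , jj - 1 ) .+ alpha .* new_col ;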

Earlier in this post I alluded to the possibility of this type of tick difference chart being a replacement for my unwillingly and forcefully retired PositionBook chart type. I shall now discuss the similarities/equivalences between the two chart types:

With the old PositionBook (PB) chart, traders' net positions at any given level and at 20 minute snapshot frequency were explicitly given by API data download and changes between snapshots were inferred by an optimisation routine. With these new TickDifference (TD) charts, traders' net positions are inferred via the methodology described above, i.e. higher normalised tick volumes at different tick levels imply a higher, net trader positioning at these levels, and changes over time in this positioning are approximated by the exponential forgetting factor.

In terms of plotting, both in the PB and TD charts, the intensities of the colours (blue for longs and red for shorts) reflect the relative importances of long/short positioning at different levels: the greater the intensity, the greater the difference between long and short positioning.

I shall now enter into a period of observational study of the usefulness of this chart type because, as the chart is inherently visual, I can't imagine how I could effectively test it in a more traditional, back testing manner. If any reader could suggest how this more traditional approach might be done, I'm all ears.    

 

Thursday, 21 March 2024

Standard "Volume based" Indicators Replaced with PositionBook Data

In my previous post I suggested three different approaches to using PositionBook data other than directly using this data to create new, unique indicators. This post is about the first of the aforementioned ideas: modifying existing indicators that somehow incorporate volume in their construction.

The indicators I've chosen to look at are the Accumulation and Distribution index, On Balance Volume, Money Flow index, Price Volume Trend and, for comparative purposes, an indicator similar to these utilising PositionBook data. For illustrative purposes these indicators have been calculated on this OHLC data,

which shows a 20 minute bar chart of the EUR_USD forex pair. The chart starts at the New York close of 4 January 2024 and ends at the New York close on 5 January 2024. The green vertical lines span 7am to 9am London time and the red lines are 7am to 9am New York time. This second chart shows the indicators individually max-min scaled from zero to one so that they can be more easily visually compared.

 
As in the OHLC chart, the vertical lines are the London and New York opening sessions. The four "traditional" indicators more or less tell the same story, which is not surprising since their calculations involve bar to bar price changes or open to close intra-bar changes which are then multiplied by bar volume. Effectively they are all just differently scaled versions of the underlying price movement, or alternatively, just accumulated sums of different fractions of the bar volume. The PositionBook data version, called Pos Change Ind, does not use any OHLC information at all but rather uses the accumulated difference(s) between position changes multiplied by volume. For most of the day the general story told by the Pos Change Ind indicator agrees with the other indicators; however during the big run up which started just about 9am New York time there is a significant difference between Pos Change Ind and the others.
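For reference, the "accumulated sums of different fractions of the bar volume" point can be seen directly in standard textbook formulations of three of these indicators; the sketches below are my own vectorised Octave versions and not necessarily the exact implementations plotted above:

## close, high, low and vol are column vectors of bar data
dclose = [ 0 ; diff( close ) ] ;                                               ## bar to bar close change, first element zero
prev_close = [ close( 1 ) ; close( 1 : end - 1 ) ] ;
obv = cumsum( sign( dclose ) .* vol ) ;                                        ## On Balance Volume
clv = ( ( close .- low ) .- ( high .- close ) ) ./ max( high .- low , eps ) ;  ## close location value
ad  = cumsum( clv .* vol ) ;                                                   ## Accumulation/Distribution index
pvt = cumsum( dclose ./ prev_close .* vol ) ;                                  ## Price Volume Trend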

In hindsight, by looking at my order levels chart

and volume profile chart
 
it is easy to speculate about what market participants were thinking during this trading day, especially if the following PositionBook chart is taken into account.
 
For the purpose of the following brief "stream of consciousness" narrative imagine it's 7am New York time and looking back at the day's action so far it can be seen that the downward drift of the day seems to have halted with a mini double bottom, and we are now moving up with some new heavy tick volumes having accumulated over the last hour or so, forming a new volume profile point of control (POC). Over the next hour prices continue the new slight drift up with accumulating long positions and at about 8.30am we see a doji bar form on the 10 minute chart at the level of the rolling vwap for the day. Suddenly there is the big down bar, which could conceivably be a shake-out of the recently added longs, targeting the stop orders below, which finishes with an extended lower wick. This seems to be an ideal set-up for a long trade targeting either the old POC, which also happens to be the current high of the day, or the accumulated orders which happen to coincide with the level at which, currently, the greatest proportion of long positions have been entered. 
 
Of course it can be seen, in hindsight, that this was a great set-up for an intra-day trade that could have caught almost the entire high-low range of the day as a profitable trade, dependent of course on exact entry and exit levels. This set-up is a synthesis of observations from the volume profile chart, the order levels chart and the position levels chart, along with the vwap indicator. The Pos Change Ind indicator does not seem to add much value over that provided by the more traditional, volume based indicators in the set-up phase.

This is not necessarily the case for the exit. It can be seen that the Pos Change Ind indicator turns down sharply several bars before all the other indicators, and this movement in the indicator is evident by the close of the bar with the long upper wick which makes the high of the day. This sharp downturn in the indicator shows that there was a mass exit of longs during the formation of this bar, made clearer by the following chart which shows the two components of the Pos Change Indicator, namely the
 
"Outside Change" and the "Inside Change." The outside change shows the total net position changes for the price levels that lie outside the range of the bar and the inside change is the net change for price levels that lie within the bar range. The greater change of the two is obviously the (red) inside change, and looking at the position levels plot we can see why. The previously mentioned level of "greatest proportion of long positions" suddenly loses that distinction - a large number of the longs at this level obviously liquidated their positions. This is important information, which shows that sentiment favouring long positions obviously changed, and it can be surmised that many long position holders were happy to get out of their trade at more or less break even prices. Also noticeable in the position levels chart is the change in blue shade from darker to lighter at the price levels within the range of the large price run-up. This reduction in colour intensity shows that those traders who entered during the run-up also exited near the top of the move. Taken together these observations could have been used as a nice short set-up targeting, for example, the then currently lower level of vwap, which in fact was subsequently hit with the day closing at this level.

As I had previously suspected, there is value in PositionBook data but it is perhaps tricky to operationalise or to easily automate within a trading system. It can be used to indicate a directional bias, or as above to show when traders exit positions. Again, as shown above, it can be used to put a new, useful twist on existing indicators, but in general it appears that use of this data is primarily visual by way of my PositionBook chart and subsequent, subjective evaluation. Whilst I am pleased with the potential insights provided, I would prefer a more structured, algorithmic use of this data, as in the third point of my previous post.
 
More in due course.


Saturday, 5 April 2025

Use of HDF5 Format, and Some Charting Improvements

Over the last few weeks/months I have found it necessary to revisit the basic infrastructure of my trading/computing set up due to increasing slowness of the various computing routines I have running.

The first issue I am now addressing is how I store my data on disc. When I first started I opted for csv text files, mostly due to my ignorance of other possibilities at the time and the fact that I could visually inspect the files to manually check things. However, for my data retrieval and use needs this is now becoming too slow and burdensome and so I have made the decision to switch over to the HDF5 format and use the hdf5oct package for Octave for my data storage and file I/O needs. This dramatically speeds up data loading and will enable me to consolidate my disparate csv text files into individual tradable instrument HDF5 files, where all the data for the said tradable instrument is contained in a single HDF5 file. This data migration is an ongoing process that will continue for a few months, with the associated changes in my workflow, rewriting of some scripts and functions, cronjobs, etc.
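Purely as an illustration of the direction of travel (the file and variable names below are hypothetical, and the hdf5oct package provides its own read/write functions), Octave's built-in save/load can already round-trip HDF5 where Octave has been built with HDF5 support:

## consolidate a tradable instrument's data into a single HDF5 file (illustrative names)
save( '-hdf5' , 'eur_usd.h5' , 'ohlc_10m' , 'ohlc_5s' ) ;   ## write both datasets to one file
load( 'eur_usd.h5' , 'ohlc_10m' ) ;                         ## later, load only the variable needed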

The next thing I have done is slightly improved the calculation methodology and plotting of my volume profile bars, and the following chart shows the new version volume profile bars for the last three, 10 minute bars on the EUR-USD forex pair for trading on Friday, 4th April 2025,

whilst this second chart shows the equivalent time period with 5 second OHLC candles and the associated tick volumes for each bar.
 
The vertical green lines on this second chart group the 5 second bars into the corresponding 10 minute bars of the first chart. I think it is quite easy to visually see the correspondence between the 10 minute volume profile bars and the 5 second OHLC bars. 

Another thing I also plan to do in the forthcoming weeks is to use the tick volume (as shown in the second chart above) to create a new type of indicator, but more on that in due course.

Monday, 5 July 2021

Market Profile Low Volume Node Chart

As a diversion to my recent work with Matrix Profile I have recently completed work on a new chart type in Octave, namely a Market Profile Low Volume Node (LVN) chart, two slightly different versions of which are shown below.

This first one is derived from a TPO chart, whilst the next
is derived from a Volume profile chart.

The horizontal lines are drawn at levels which are considered to be "lows" in the underlying, but not shown, TPO/Volume profiles. The yellow lines are "stronger lows" than the green lines, and the blue lines are extensions of the previous day's "strong lows" in force at the end of that day's trading.
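For readers wondering how such "lows" might be found, a minimal sketch is to look for local minima in a lightly smoothed profile; the smoothing window and any threshold separating "stronger" from weaker lows are illustrative assumptions rather than my exact implementation:

## profile is a column vector of volume (or TPO counts) per price level on the grid y_ax
profile_s = filter( ones( 5 , 1 ) ./ 5 , 1 , profile ) ;        ## light smoothing
is_trough = profile_s( 2 : end - 1 ) < profile_s( 1 : end - 2 ) & ...
            profile_s( 2 : end - 1 ) < profile_s( 3 : end ) ;   ## lower than both neighbours
lvn_levels = y_ax( find( is_trough ) .+ 1 ) ;                   ## price levels of candidate LVNs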

The point of all this, according to online guru theory, is that price is expected to be "rejected" at LVNs by either bouncing, a la support or resistance, or by price powering through the LVN level, usually on increased volume. The charts show the rolling development of the LVNs as the underlying profiles change throughout the day, hence lines can appear and disappear and change colour. As this is a new avenue of investigation for me I feel it is too soon to make a comment on these lines' efficacy, but it does seem uncanny how price very often seems to react to these levels.

More in due course.

Sunday, 3 May 2020

Market Profile Chart in Octave

In a comment on my previous post, visualising Oanda's orderbook, a reader called Darren suggested that I was over complicating things and should perhaps use a more established methodology, namely Market Profile.

I had heard of Market Profile before Darren mentioned it, but had always assumed that it required access to data that I didn't readily have to hand, i.e. tick level data. With my recent work on Oanda's API in Octave that is no longer necessarily the case. However, downloading, storing and manipulating streams of tick data would be a whole new infrastructure project that I would have to implement in either R or Octave.

Instead of doing this I have done some research into Market Profile and come up with an alternative solution that can use the more readily available tick volume. One of the empirically observed assumptions of Market Profile is that on a "normal" day such volume is normally distributed and creates a "value area" that contains approximately 70% of the market action, which roughly corresponds to action falling within one standard deviation of the mean of said action, and this mean in turn roughly corresponds with what is termed the "point of control" (POC).

If one takes this at face value as being an accurate description of market action, it is possible to recreate the "normal" market profile with the following Octave code:
 ticks = norminv( linspace( 0 , 1 , vol( ii ) + 2 ) , ( high( ii ) + low( ii ) ) / 2 , ( high( ii ) - low( ii ) ) / 6 ) ;
 ticks = floor( ticks( 2 : end - 1 ) ./ tick_size .+ tick_size ) .* tick_size ;
 [ vals , bin_centres ] = hist( ticks , unique( ticks ) ) ;
What this does is create vol(ii)+2 linearly spaced tick values from 0 to 1, where vol(ii) is the tick volume for an aggregated period, i.e. an ohlc bar, and transforms these into normally distributed ticks with a mean of the midpoint of the bar and an assumed standard deviation of one sixth the high-low range, rounded to the nearest whole tick. The hist function then provides the counts of ticks per level (vals) at levels (bin_centres).

Below is a screen shot of recent EUR_USD forex prices at a resolution of 20 minute candlesticks from 17:00 EST on 28th April 2020 to end of week close at 17:00 EST 1st May 2020.
The silhouette chart at the bottom is the usual tick volume per bar and the horizontal histogram is the Market Profile of the 20 minute bars from the first bar to the first vertical green line, calculated as described above. All the visible vertical green lines represent the open at 07:00 BST, whilst the vertical red lines are the 17:00 EST closes. The horizontal blue line is the current POC at 07:00 BST, taking into account only the bars to the left of the first green line, i.e. the Asian overnight session.

Next is a video of the progression through time along the above chart: as time progresses the Market Profile histogram changes and new, blue POC lines are plotted, with the time progression being marked by the advancing green lines. During subsequent Asian sessions the histogram colour is plotted in red, and new POC lines formed in the Asian session are also plotted in red.


For easier viewing, this is a screen shot of the chart as it appears at the end of the video
For comparative purposes this is a screen shot of the same as above, but using 10 minute ohlc bars and 10 minute updates to the Market Profile
Readers should note that the scaling of the silhouette charts and histograms is not the same for both; they are hand scaled by me for visualisation purposes only.

For completeness, here is the Octave script used to produce the above
clear all ;
pkg load statistics ;

## load data
cd /home/dekalog/Documents/octave/oanda_data/20m ;
oanda_files_20m = glob( "*_ohlc_20m" ) ;
ix = 7 ;##input( 'Tradable? ' ) ;
data = dlmread( oanda_files_20m{ ix } ) ;
data( 1 : 146835 , : ) = [] ;

tick_size = 0.0001 ;

open = data( : , 18 ) ; high = data( : , 19 ) ; low = data( : , 20 ) ; close = data( : , 21 ) ; vol = data( : , 22 ) ;
## Create grid for day
max_high = max( high ) + 0.001 ; min_low = min( low ) - 0.001 ; grid = ( min_low : tick_size : max_high + 0.0001 ) ;
grid_ix = floor( grid ./ tick_size .+ tick_size ) .* tick_size ; 
market_profile = [ grid_ix ; zeros( 1 , size( grid_ix , 2 ) ) ] ;
asian_market_profile = [ grid_ix ; zeros( 1 , size( grid_ix , 2 ) ) ] ;
 
figure( 20 ) ; 
candle( high , low , close , open ) ; 
vline( 27 , 'g' ) ; vline( 72 , 'r' ) ; vline( 99 , 'g' ) ; vline( 144 , 'r' ) ; vline( 174 , 'g' ) ;
xlim( [ 0 size( open , 1 ) ] ) ;
ylim( [ grid_ix(1) grid_ix(end) ] ) ;
hold on ; plot( ( vol .* 0.0000004 ) .+ grid_ix( 1 ) , 'b' , 'linewidth' , 2 ) ; 
area( ( vol .* 0.0000004 ) .+ grid_ix( 1 ) , 'facecolor' , 'b' ) ; hold off ;

for ii = 1 : 27
 ticks = norminv( linspace( 0 , 1 , vol( ii ) + 2 ) , ( high( ii ) + low( ii ) ) / 2 , ( high( ii ) - low( ii ) ) / 6 ) ;
 ticks = floor( ticks( 2 : end - 1 ) ./ tick_size .+ tick_size ) .* tick_size ;
 [ vals , bin_centres ] = hist( ticks , unique( ticks ) ) ;
 vals_ix = find( ismember( grid_ix , bin_centres ) ) ;
 market_profile( 2 , vals_ix ) += vals ;
endfor

[ max_mp_val_old , max_mp_ix ] = max( market_profile( 2 , : ) ) ;

hold on ; figure( 20 ) ; H = barh( market_profile( 1 , : ) , market_profile( 2 , : ).*0.005 , 'c' ) ; hold off ;
hline( market_profile( 1 , max_mp_ix ) , 'b' ) ;

for ii = 28 : 72
 ticks = norminv( linspace( 0 , 1 , vol( ii ) + 2 ) , ( high( ii ) + low( ii ) ) / 2 , ( high( ii ) - low( ii ) ) / 6 ) ;
 ticks = floor( ticks( 2 : end - 1 ) ./ tick_size .+ tick_size ) .* tick_size ;
 vals = hist( ticks , unique( ticks ) ) ;
 vals_ix = find( ismember( grid_ix , unique( ticks ) ) ) ;
 market_profile( 2 , vals_ix ) += vals ;
 [ max_mp_val , max_mp_ix ] = max( market_profile( 2 , : ) ) ;
 hold on ; figure( 20 ) ; barh( market_profile( 1 , : ) , market_profile( 2 , : ).*0.005 , 'c' ) ; hold off ;
 vline( ii , 'g' ) ; 
 if ( max_mp_val > max_mp_val_old )
  hline( market_profile( 1 , max_mp_ix ) , 'b' ) ;
  max_mp_val_old = max_mp_val ;
 endif
 pause(0.01) ;
endfor

for ii = 73 : 99
 ticks = norminv( linspace( 0 , 1 , vol( ii ) + 2 ) , ( high( ii ) + low( ii ) ) / 2 , ( high( ii ) - low( ii ) ) / 6 ) ;
 ticks = floor( ticks( 2 : end - 1 ) ./ tick_size .+ tick_size ) .* tick_size ;
 vals = hist( ticks , unique( ticks ) ) ;
 vals_ix = find( ismember( grid_ix , unique( ticks ) ) ) ;
 market_profile( 2 , vals_ix ) += vals ;
 asian_market_profile( 2 , vals_ix ) += vals ;
 [ max_mp_val , max_mp_ix ] = max( market_profile( 2 , : ) ) ;
 hold on ; figure( 20 ) ; barh( market_profile( 1 , : ) , market_profile( 2 , : ).*0.005 , 'c' ) ; 
 figure( 20 ) ; barh( asian_market_profile( 1 , : ) , asian_market_profile( 2 , : ).*0.005 , 'r' ) ;
 hold off ;
 vline( ii , 'g' ) ; 
 if ( max_mp_val > max_mp_val_old )
  hline( market_profile( 1 , max_mp_ix ) , 'b' ) ;
  max_mp_val_old = max_mp_val ;
 endif
 pause(0.01) ;
endfor

[ ~ , max_mp_ix ] = max( asian_market_profile( 2 , : ) ) ;
hline( asian_market_profile( 1 , max_mp_ix ) , 'r' ) ;

for ii = 100 : 144
 ticks = norminv( linspace( 0 , 1 , vol( ii ) + 2 ) , ( high( ii ) + low( ii ) ) / 2 , ( high( ii ) - low( ii ) ) / 6 ) ;
 ticks = floor( ticks( 2 : end - 1 ) ./ tick_size .+ tick_size ) .* tick_size ;
 vals = hist( ticks , unique( ticks ) ) ;
 vals_ix = find( ismember( grid_ix , unique( ticks ) ) ) ;
 market_profile( 2 , vals_ix ) += vals ;
 [ max_mp_val , max_mp_ix ] = max( market_profile( 2 , : ) ) ;
 hold on ; figure( 20 ) ; barh( market_profile( 1 , : ) , market_profile( 2 , : ).*0.005 , 'c' ) ;
 figure( 20 ) ; barh( asian_market_profile( 1 , : ) , asian_market_profile( 2 , : ).*0.005 , 'r' ) ; 
 hold off ;
 vline( ii , 'g' ) ; 
 if ( max_mp_val > max_mp_val_old )
  hline( market_profile( 1 , max_mp_ix ) , 'b' ) ;
  max_mp_val_old = max_mp_val ;
 endif
 pause(0.01) ;
endfor

[ max_mp_val , max_mp_ix ] = max( market_profile( 2 , 101 : end ) ) ;
max_mp_val_old = max_mp_val ;
hline( market_profile( 1 , max_mp_ix + 100 ) , 'b' ) ;

for ii = 145 : 174
 ticks = norminv( linspace( 0 , 1 , vol( ii ) + 2 ) , ( high( ii ) + low( ii ) ) / 2 , ( high( ii ) - low( ii ) ) / 6 ) ;
 ticks = floor( ticks( 2 : end - 1 ) ./ tick_size .+ tick_size ) .* tick_size ;
 vals = hist( ticks , unique( ticks ) ) ;
 vals_ix = find( ismember( grid_ix , unique( ticks ) ) ) ;
 market_profile( 2 , vals_ix ) += vals ;
 asian_market_profile( 2 , vals_ix ) += vals ;
 [ max_mp_val , max_mp_ix ] = max( market_profile( 2 , 101 : end ) ) ;
 hold on ; figure( 20 ) ; barh( market_profile( 1 , : ) , market_profile( 2 , : ).*0.005 , 'c' ) ; 
 figure( 20 ) ; barh( asian_market_profile( 1 , : ) , asian_market_profile( 2 , : ).*0.005 , 'r' ) ;
 hold off ;
 vline( ii , 'g' ) ; 
 if ( max_mp_val > max_mp_val_old )
  hline( market_profile( 1 , max_mp_ix + 100 ) , 'b' ) ;
  max_mp_val_old = max_mp_val ;
 endif
 pause(0.01) ;
endfor

[ ~ , max_mp_ix ] = max( asian_market_profile( 2 , 101 : end ) ) ;
hline( asian_market_profile( 1 , max_mp_ix + 100 ) , 'r' ) ;

for ii = 175 : size( open , 1 )
 ticks = norminv( linspace( 0 , 1 , vol( ii ) + 2 ) , ( high( ii ) + low( ii ) ) / 2 , ( high( ii ) - low( ii ) ) / 6 ) ;
 ticks = floor( ticks( 2 : end - 1 ) ./ tick_size .+ tick_size ) .* tick_size ;
 vals = hist( ticks , unique( ticks ) ) ;
 vals_ix = find( ismember( grid_ix , unique( ticks ) ) ) ;
 market_profile( 2 , vals_ix ) += vals ;
 [ max_mp_val , max_mp_ix ] = max( market_profile( 2 , 101 : end ) ) ;
 hold on ; figure( 20 ) ; barh( market_profile( 1 , : ) , market_profile( 2 , : ).*0.005 , 'c' ) ; 
 figure( 20 ) ; barh( asian_market_profile( 1 , : ) , asian_market_profile( 2 , : ).*0.005 , 'r' ) ;
 hold off ;
 vline( ii , 'g' ) ; 
 if ( max_mp_val > max_mp_val_old )
  hline( market_profile( 1 , max_mp_ix + 100 ) , 'b' ) ;
  max_mp_val_old = max_mp_val ;
 endif
 pause(0.01) ;
endfor
As just noted above for the scaling of the charts/video, readers should also be aware that this script contains a lot of magic numbers that are unique to the data and scaling being used; therefore, it is not a plug-and-play script.

My thanks to the reader Darren, who suggested that I look into Market Profile. More in due course.

Wednesday, 26 May 2021

Update on Recent Matrix Profile Work

Since my previous post, on Matrix Profile (MP), I have been doing a lot of online reading about MP and going back to various source papers and code that are available at the UCR Matrix Profile page. I have been doing this because, despite my initial enthusiasm, the R tsmp package didn't turn out to be suitable for what I wanted to do, or perhaps more correctly I couldn't hack it to get the sort of results I wanted, hence my need to go to "first principles" and code from the UCR page.

Readers may recall that my motivation was to look for time series motifs that form "initial balance (IB)" set ups of Market Profile charts. The rationale for this is that different IBs are precursors to specific market tendencies which may provide a clue or an edge in subsequent market action. A typical scenario from the literature on Market Profile might be "an Open Test Drive can often indicate one of the day's extremes." If this is actually true, one could go long/short with a high confidence stop at the identified extreme. Below is a screenshot of some typical IB profiles:

where each letter typically represents a 30 minute period of market action. The problem is that Market Profile charts, to me at least, are inherently visual and therefore do not easily lend themselves to an algorithmic treatment, which makes it difficult to back test in a robust fashion. This is why I have been trying to use MP.

The first challenge I faced was how to preprocess price action data such as OHLC and volume such that I could use MP. In the end I resorted to using the mid-price, the high-low range and (tick) volume as proxies for market direction, market volatility and market participation. Because IBs occur over market opens, I felt it was important to use the volatility and participation proxies as these are important markers for the sentiment of subsequent price action. This choice necessitated using a multivariate form of MP, and I used the basic MP STAMP code that is available at Matrix Profile VI: Meaningful Multidimensional Motif Discovery, with some slight tweaks for my use case.

Having the above tools in hand, what should they be used for? I decided that Cluster analysis is what is needed, i.e. cluster using the motifs that MP could discover. For this purpose, I used the approach outlined in section 3.9 of the paper "The Swiss Army Knife of Time Series Data Mining." The reasoning behind this choice is that if, for example, an "Open Test Drive IB" is a real thing, it should occur frequently enough that time series sub-sequences of it can be clustered or associated with an "Open Test Drive IB" motif. If all such prototype motifs can be identified and all IBs can be assigned to one of them, subsequent price action can be investigated to check the anecdotal claims, such as quoted above.

My Octave code implementation of the linked Swiss Army Knife routine is shown in the code box below.

data = dlmread( '/path/to/mv_data' ) ;
skip_loc = dlmread( '/path/to/skip_loc' ) ;
skip_loc_copy = find( skip_loc ) ; skip_loc_copy2 = skip_loc_copy ; skip_loc_copy3 = skip_loc_copy ;
sub_len = 9 ;
data_len = size( data , 1 ) ;
data_to_use = [ (data(:,2).+data(:,3))./2 , data(:,2).-data(:,3) , data(:,5) ] ;

must_dim = [] ;
exc_dim = [] ;
[ pro_mul , pro_idx , data_freq , data_mu , data_sig ] = multivariate_stamp( data_to_use, sub_len, must_dim, exc_dim, skip_loc ) ;
original_single_MP = pro_mul( : , 1 ) ; ## just mid price
original_single_MP2 = original_single_MP .+ pro_mul( : , 2 ) ; ## mid price and hi-lo range
original_single_MP3 = original_single_MP2 .+ pro_mul( : , 3 ) ; ## mid price, hi-lo range and volume

## Swiss Army Knife Clustering
RelMP = original_single_MP ; RelMP2 = original_single_MP2 ; RelMP3 = original_single_MP3 ;
DissMP = inf( length( RelMP ) , 1 ) ; DissMP2 = DissMP ; DissMP3 = DissMP ; 
minValStore = [] ; minIdxStore = [] ; minValStore2 = [] ; minIdxStore2 = [] ; minValStore3 = [] ; minIdxStore3 = [] ;
## set up a recording matrix 
all_dist_pro = zeros( size( RelMP , 1 ) , size( data_to_use , 2 ) ) ;

for ii = 1 : 500
## reset recording matrix for this ii loop  
all_dist_pro( : , : ) = 0 ;

## just mid price
[ minVal , minIdx ] = min( RelMP ) ;
minValStore = [ minValStore ; minVal ] ; minIdxStore = [ minIdxStore ; minIdx ] ;
DissmissRange = data_to_use( minIdx : minIdx + sub_len - 1 , : ) ;
[ dist_pro , ~ ] = multivariate_mass (data_freq(:,1), DissmissRange(:,1), data_len, sub_len, data_mu(:,1), data_sig(:,1), data_mu(minIdx,1), data_sig(minIdx,1) ) ;
all_dist_pro( : , 1 ) = real( dist_pro ) ;
JMP = all_dist_pro( : , 1 ) ;
DissMP = min( DissMP , JMP ) ; ## dismiss all motifs discovered so far
RelMP = original_single_MP ./ DissMP ;
skip_loc_copy = unique( [ skip_loc_copy ; ( minIdx : 1 : minIdx + sub_len - 1 )' ] ) ;
RelMP( skip_loc_copy ) = 1 ;

## mid price and hi-lo range
[ minVal , minIdx ] = min( RelMP2 ) ;
minValStore2 = [ minValStore2 ; minVal ] ; minIdxStore2 = [ minIdxStore2 ; minIdx ] ;
DissmissRange = data_to_use( minIdx : minIdx + sub_len - 1 , : ) ;
[ dist_pro , ~ ] = multivariate_mass (data_freq(:,1), DissmissRange(:,1), data_len, sub_len, data_mu(:,1), data_sig(:,1), data_mu(minIdx,1), data_sig(minIdx,1) ) ;
all_dist_pro( : , 2 ) = real( dist_pro ) ;
[ dist_pro , ~ ] = multivariate_mass (data_freq(:,2), DissmissRange(:,2), data_len, sub_len, data_mu(:,2), data_sig(:,2), data_mu(minIdx,2), data_sig(minIdx,2) ) ;
all_dist_pro( : , 2 ) = all_dist_pro( : , 2 ) .+ real( dist_pro ) ;
JMP2 = all_dist_pro( : , 2 ) ;
DissMP2 = min( DissMP2 , JMP2 ) ; ## dismiss all motifs discovered so far
RelMP2 = original_single_MP2 ./ DissMP2 ;
skip_loc_copy2 = unique( [ skip_loc_copy2 ; ( minIdx : 1 : minIdx + sub_len - 1 )' ] ) ;
RelMP2( skip_loc_copy2 ) = 1 ;

## mid price, hi-lo range and volume
[ minVal , minIdx ] = min( RelMP3 ) ;
minValStore3 = [ minValStore3 ; minVal ] ; minIdxStore3 = [ minIdxStore3 ; minIdx ] ;
DissmissRange = data_to_use( minIdx : minIdx + sub_len - 1 , : ) ;
[ dist_pro , ~ ] = multivariate_mass (data_freq(:,1), DissmissRange(:,1), data_len, sub_len, data_mu(:,1), data_sig(:,1), data_mu(minIdx,1), data_sig(minIdx,1) ) ;
all_dist_pro( : , 3 ) = real( dist_pro ) ;
[ dist_pro , ~ ] = multivariate_mass (data_freq(:,2), DissmissRange(:,2), data_len, sub_len, data_mu(:,2), data_sig(:,2), data_mu(minIdx,2), data_sig(minIdx,2) ) ;
all_dist_pro( : , 3 ) = all_dist_pro( : , 3 ) .+ real( dist_pro ) ;
[ dist_pro , ~ ] = multivariate_mass (data_freq(:,3), DissmissRange(:,3), data_len, sub_len, data_mu(:,3), data_sig(:,3), data_mu(minIdx,3), data_sig(minIdx,3) ) ;
all_dist_pro( : , 3 ) = all_dist_pro( : , 3 ) .+ real( dist_pro ) ;
JMP3 = all_dist_pro( : , 3 ) ;
DissMP3 = min( DissMP3 , JMP3 ) ; ## dismiss all motifs discovered so far
RelMP3 = original_single_MP3 ./ DissMP3 ;
skip_loc_copy3 = unique( [ skip_loc_copy3 ; ( minIdx : 1 : minIdx + sub_len - 1 )' ] ) ;
RelMP3( skip_loc_copy3 ) = 1 ;

endfor ## end ii loop

There are a few things to note about this code:

  • the use of a skip_loc vector 
  • a sub_len value of 9
  • 3 different calculations for DissMP and RelMP vectors

i) The skip_loc vector is a vector of time series indices (Idx) for which the MP and possible cluster motifs should not be calculated to avoid identifying motifs from data sequences that do not occur in the underlying data due to the way I concatenated it during pre-processing, i.e. 7am to 9am, 7am to 9am, ... etc.

ii) sub_len value of 9 means 9 x 10 minute OHLC bars, to match the 30 minute A, B and C of the above IB screenshot.

iii)  3 different calculations because different combinations of the underlying data are used. 

This last part probably needs more explanation. A multivariate RelMP is created by adding together individual dist_pros (distance profiles), and the cluster motif identification is achieved by finding minimums in the RelMP; however, a minimum in a multivariate RelMP is generally a different minimum to the minimums of the individual, univariate RelMPs. What my code does is use a univariate RelMP of the mid price, and 2 multivariate RelMPs of mid price plus high-low range and mid price, high-low range and volume. This gives 3 sets of minValues and minValueIdxs, one for each set of data. The idea is to run the ii loop for, e.g. 500 iterations, and to then identify possible "robust" IB cluster motifs by using the Octave intersect function to get the minIdx that are common to all 3 sets of Idx data. 
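In Octave this last step is just a nested intersect over the three stores of indices from the code above:

## minIdx values common to all three sets after the ii loop has run
robust_idx = intersect( intersect( minIdxStore , minIdxStore2 ) , minIdxStore3 ) ;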

By way of example, setting the ii loop iteration to just 100 results in only one intersect Idx value on some EUR_USD forex data, the plot of which is shown below:

Comparing this with the IB screenshot above, I would say this represents a typical "Open Auction" process with prices rotating upwards/downwards with no real conviction either way, with a possible long breakout on the last bar or alternatively, a last upwards test before a price plunge.

My intent is to use the above methodology to get a set of candidate IB motifs upon which a clustering algorithm can be based. This clustering algorithm will be the subject of my next post.

Friday, 11 November 2022

A New PositionBook Chart Type

It has been almost 6 months since I last posted, due to working on a house renovation. However, I have still been thinking about/working on stuff, particularly on analysis of open position ratios. I had tried using this data as features for machine learning, but my thinking has evolved somewhat and I have reduced my ambition/expectation for this type of data.

Before I get into this I'd like to mention Trader Dale (I have no affiliation with him) as I have recently been following his volume profile set-ups, a screenshot of one being shown below.

This shows recent Wednesday action in the EUR_GBP pair on a 30 minute chart. The flexible volume profile set-up Trader Dale describes is called a Volume Accumulation Set-up which occurs immediately prior to a big break (in this case up). The whole premise of this particular set-up is that the volume accumulation area will be future support, off of which price will bounce, as shown by the "hand drawn" lines. Below is shown my version of the above chart
with a bit of extra price action included. The horizontal yellow lines show the support area.

Now here is the same data, but in what I'm calling a PositionBook chart, which uses Oanda's Position Level data downloaded via their API.

The blue (red) horizontal lines show the levels at which traders are net long (short) in terms of positions actually entered/held. The brighter the colours the greater the difference between the longs/shorts. It is obvious that the volume accumulation set-up area is showing a net accumulation of long positions and this is an indication of the direction of the anticipated breakout long before it happens. The Trader Dale set-up presumes an accumulation of longs because of the resultant breakout direction and doesn't seem to provide an opportunity to participate in the breakout itself!

The next chart shows the action of the following day and a bit more, where price does indeed come back down to the "support" area but does not immediately bounce off the support level. The following order level chart perhaps shows why there was no bounce: the relative absence of open orders at that level.

The equivalent PositionBook chart, including a bit more price action,
shows that after price fails to bounce off the support level it does recover back into it, and then even more long positions are accumulated (the darker blue shade) at the support level during the London open, again allowing one to position oneself for the ensuing rise during the London morning session. This is followed by another long accumulation during the New York opening session for a further leg up into the London close (the last vertical red line).

The purpose of this post is not to criticise the Trader Dale set-up but rather to highlight the potential value-add of these new PositionBook charts. They seem to hold promise for indicating price direction and I intend to continue investigating/improving them in the coming weeks.

More in due course.

Friday, 26 March 2021

Market/Volume Profile and Matrix Profile

A quick preview of what I am currently working on: using Matrix Profile to search for time series motifs, using the R tsmp package. The exact motifs I'm looking for are the various "initial balance" set ups of Market Profile charts. 

To do so, I'm concentrating the investigation around both the London and New York opening times, with a custom annotation vector (av). Below is a simple R function to set up this custom av, which is produced separately in Octave and then loaded into R.

mp_adjusted_by_custom_av <- function( mp_object , custom_av ){
## https://stackoverflow.com/questions/66726578/custom-annotation-vector-with-tsmp-r-package
mp_object$av <- custom_av
class( mp_object ) <- tsmp:::update_class( class( mp_object ) , "AnnotationVector" )
mp_adjusted_by_custom_av <- tsmp::av_apply( mp_object )
return( mp_adjusted_by_custom_av )
}
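For completeness, here is a sketch of how such a custom av might be produced on the Octave side before being written out and loaded into R; the hour column, the chosen hours and the weights are illustrative assumptions rather than my exact code:

## a 0-1 annotation vector highlighting bars around the London and New York opens
av = 0.01 .* ones( rows( data ) , 1 ) ;              ## small non-zero weight everywhere
av( data( : , 9 ) == 7 | data( : , 9 ) == 13 ) = 1 ; ## full weight at the London and New York open hours
av = av( 1 : end - sub_len + 1 ) ;                   ## trim to the length of the matrix profile
dlmwrite( '/path/to/custom_av' , av ) ;              ## written out for reading into R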
This animated GIF shows plots of short, exemplar adjusted matrix profile objects, highlighting the London only, New York only and combined results of the relevant annotation vectors.
This is currently a work in progress and so I shall report results in due course.

Friday, 5 February 2021

A Forex Pair Snapshot Chart

After yesterday's Heatmap Plot of Forex Temporal Clustering post I thought I would consolidate all the chart types I have recently created into one easy, snapshot overview type of chart. Below is a typical example of such a chart, this being today's 10 minute EUR_USD forex pair chart up to a few hours after the London session close (the red vertical line).


The top left chart is a Market/Volume Profile Chart with added rolling Value Area upper and lower bounds (the cyan, red and white lines) and also rolling Volume Weighted Average Price with upper and lower standard deviation lines (magenta).

The bottom left chart is the turning point heatmap chart as described in yesterday's post.

The two rightmost charts are also Market/Volume Profile charts, but of my Currency Strength Candlestick Charts based on my Currency Strength Indicator. The upper one is the base currency, i.e. EUR, and the lower is the quote currency. 

The following charts are the same day's charts for:

GBP_USD,

USD_CHF
and finally USD_JPY
The regularity of the turning points is easily seen in the lower left-hand charts although, of course, this is to be expected as they all share the USD as a common currency. However, there are also subtle differences to be seen in the "shadows" of the lighter areas.

For the near future my self-assigned task will be to observe the forex pairs, in real time, through the prism of the above style of chart and do some mental paper trading, and perhaps some really small size, discretionary live trading, in addition to my normal routine of research and development.


Wednesday, 28 February 2024

Indicator(s) Derived from PositionBook Data

Since my last post I have been trying to create new indicators from PositionBook data but unfortunately I have had no luck in doing so. I have tried differences, ratios, cumulative sums, logs and control charts to no avail and I have decided to discontinue this line of investigation because it doesn't seem to hold much promise. The only other direct uses I can think of for this data are:

I am not yet sure which of the above I will look at next, but whichever it is will be the subject of a future post.

Wednesday, 20 May 2020

An Improved Volume Profile Chart with Levels

Without much ado, here is the code
## Copyright (C) 2020 dekalog
## 
## This program is free software: you can redistribute it and/or modify it
## under the terms of the GNU General Public License as published by
## the Free Software Foundation, either version 3 of the License, or
## (at your option) any later version.
## 
## This program is distributed in the hope that it will be useful, but
## WITHOUT ANY WARRANTY; without even the implied warranty of
## MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE.  See the
## GNU General Public License for more details.
## 
## You should have received a copy of the GNU General Public License
## along with this program.  If not, see <https://www.gnu.org/licenses/>.

## -*- texinfo -*- 
## @deftypefn {} {} market_profile_plot (@var{curr_cross}, @var{n_days})
##
## Plot a Market Profile Chart of CURR_CROSS over the last N_DAYS days.
##
## @seealso{}
## @end deftypefn

## Author: dekalog 
## Created: 2020-05-11

function market_profile_plot( curr_cross , n_days )

pkg load statistics ; 
cd /path/to/data/folder ;
price_name = tolower( curr_cross ) ;

if ( strcmp( price_name , 'aud_jpy' ) || strcmp( price_name , 'eur_jpy' ) || strcmp( price_name , 'gbp_jpy' ) || ...
     strcmp( price_name , 'usd_jpy' ) )
 tick_size = 0.001 ;
 round_digit = 3 ;
elseif ( strcmp( price_name , 'xau_usd' ) )
 tick_size = 0.1 ;
 round_digit = 1 ;
elseif ( strcmp( price_name , 'xag_usd' ) )
 tick_size = 0.01 ;
 round_digit = 2 ; 
else
 tick_size = 0.0001 ;
 round_digit = 4 ;
endif

## get price data of *_ohlc_10m
unix_command = [ "wc" , " " , "-l" , " " , [ price_name , '_ohlc_10m' ] ] ;
## the 'wc' command with the '-l' flag counts the number of lines in [ price_name , '_ohlc_10m' ]
[ ~ , system_out ] = system( unix_command ) ;
cstr = strsplit( system_out , " " ) ; 
lines_in_file = str2double( cstr( 1 , 1 ) ) ;

## read *_ohlc_10m file
price_data = dlmread( [ price_name , '_ohlc_10m' ] , ',' , [ lines_in_file - ( n_days * 144 + 18 ) , 0 , lines_in_file , 21 ] ) ;
## get the earliest London open on a Sunday, if any
sun_open_ix = find( ( price_data( : , 11 ) == 1 ) .* ( price_data( : , 9 ) == 22 ) .* ( price_data( : , 10 ) == 0 ) ) ;
## get weekday closes
end_ix = find( ( price_data( : , 15 ) == 16 ) .* ( price_data( : , 16 ) == 50  ) ) ;
delete_ix = unique( [ sun_open_ix ; end_ix ] ) ;
## delete unwanted data
price_data( 1 : delete_ix( 1 ) , : ) = [] ; end_ix = end_ix .- delete_ix( 1 ) ; open_ix = end_ix .+ 1 ; 
end_ix( end_ix == 0 ) = [] ; end_ix( end_ix > size( price_data , 1 ) ) = [] ;
open_ix( open_ix == 0 ) = [] ; open_ix( open_ix > size( price_data , 1 ) ) = [] ;

## give names to data
open = price_data(:,18) ; high = price_data(:,19) ; low = price_data(:,20) ; close = price_data(:,21) ; vol = price_data(:,22) ;
high_round = floor( high ./ tick_size .+ 0.5 ) .* tick_size ;
low_round = floor( low ./ tick_size .+ 0.5 ) .* tick_size ;
max_tick_range = max( high_round .- low_round ) / tick_size ;
upper_val = high ; lower_val = low ;

## create y and x axes for chart
y_max = max( high_round ) + max_tick_range * tick_size ;
y_min = min( low_round ) - max_tick_range * tick_size ;
y_ax = ( y_min : tick_size : y_max )' ;
end_x_ax_freespace = 5 ;

## create container
all_vp = zeros( n_days , numel( y_ax ) ) ; all_mp = all_vp ;

if ( n_days == 1 )

[ all_vp(1,:) , vp_val ] = pcolor_background( y_ax , high , low , vol , tick_size ) ;
vp_z = repmat( all_vp( 1 , : ) , numel( high ) + end_x_ax_freespace , 1 ) ;
lower_val( : ) = vp_val( 1 ) ; upper_val( : ) = vp_val( 2 ) ; 

elseif ( n_days >= 2 )

vp_z = zeros( numel( high ) + end_x_ax_freespace , size( all_vp , 2 ) ) ;

 for ii = 1 : numel( end_ix ) 
 [ all_vp(ii,:) , vp_val ] = pcolor_background( y_ax , high(open_ix(ii):end_ix(ii)) , low(open_ix(ii):end_ix(ii)) , ...
                                                        vol(open_ix(ii):end_ix(ii)) , tick_size ) ;
 vp_z(open_ix(ii):end_ix(ii),:) = repmat( all_vp(ii,:)./max(all_vp(ii,:)) , numel( high(open_ix(ii):end_ix(ii)) ) , 1 ) ;
 lower_val( open_ix(ii) : end_ix(ii) ) = vp_val( 1 ) ; upper_val( open_ix(ii) : end_ix(ii) ) = vp_val( 2 ) ;
 endfor

[ all_vp(end,:) , vp_val ] = pcolor_background( y_ax , high(open_ix(end):end) , low(open_ix(end):end) , ...
                                                         vol(open_ix(end):end) , tick_size ) ;
vp_z( open_ix( end ) : end , : ) = repmat( all_vp( end , : ) ./ max( all_vp( end , : ) ) , ...
                                            numel( high( open_ix( end ) : end ) ) + end_x_ax_freespace , 1 ) ;
lower_val( open_ix( end ) : end ) = vp_val( 1 ) ; upper_val( open_ix( end ) : end ) = vp_val( 2 ) ;
endif

## create the background ( best choices - viridis and ocean? )
x_ax = ( 1 : 1 : numel( open ) + end_x_ax_freespace )' ;
colormap( 'viridis' ) ; figure( 10 ) ; pcolor( x_ax , y_ax , vp_z' ) ; shading interp ; axis tight ;

## plot the individual volume profiles
hold on ;

scale_factor = ( 1 / max(max(all_vp) ) ) * 72 ;
for ii = 1 : numel( open_ix )
figure( 10 ) ; fill( all_vp( ii , : ) .* scale_factor .+ open_ix( ii ) , y_ax' , [99;99;99]./255 ) ;
endfor

## plot candlesticks
figure( 10 ) ; candle_mp( high , low , close , open ) ;

## plot upper and lower boundaries of value area
hold on ; figure( 10 ) ; plot( lower_val , 'b' , 'linewidth' , 2 , upper_val , 'r' , 'linewidth' , 2 ) ; hold off ;

## Plot vertical lines for London open at 7am
london_ix = find( ( price_data( : , 9 ) == 7 ) .* ( price_data( : , 10 ) == 0 ) ) ;
if ( ~isempty( london_ix ) )
 for ii = 1 : numel( london_ix )
  figure( 10 ) ; vline( london_ix( ii ) , 'g' ) ;
 endfor
endif

endfunction
which calls this
## Copyright (C) 2020 dekalog
## 
## This program is free software: you can redistribute it and/or modify it
## under the terms of the GNU General Public License as published by
## the Free Software Foundation, either version 3 of the License, or
## (at your option) any later version.
## 
## This program is distributed in the hope that it will be useful, but
## WITHOUT ANY WARRANTY; without even the implied warranty of
## MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE.  See the
## GNU General Public License for more details.
## 
## You should have received a copy of the GNU General Public License
## along with this program.  If not, see <https://www.gnu.org/licenses/>.

## -*- texinfo -*- 
## @deftypefn {} {@var{vp_z}, @var{vp_val} =} pcolor_background (@var{y_ax}, @var{high}, @var{low}, @var{vol}, @var{tick_size})
##
## @seealso{}
## @end deftypefn

## Author: dekalog 
## Created: 2020-05-13

function [ vp_z , vp_val ] = pcolor_background ( y_ax , high , low , vol , tick_size )

vp_z = zeros( 1 , numel( y_ax ) ) ; ##tpo_z = vp_z ;
vol( vol <= 1 ) = 2 ; ## no single point vol distributions
vp_val = zeros( 2 , 1 ) ;

 for ii = 1 : numel( high )

 ## the volume profile, vp_z
 ticks = norminv( linspace(0,1,vol(ii)+2) , (high(ii) + low(ii))/2 , (high(ii) - low(ii))*0.25 ) ;
 ticks = floor( ticks( 2 : end - 1 ) ./ tick_size .+ 0.5 ) .* tick_size ;
 unique_ticks = unique( ticks ) ;

  if ( numel( unique_ticks ) > 1 )
  [ N , X ] = hist( ticks , unique( ticks ) ) ;
  [ ~ , N_ix ] = max( N ) ; tick_ix = X( N_ix ) ;
  [ ~ , centre_tick ] = min( abs( y_ax .- tick_ix ) ) ;
  vp_z(1,centre_tick-N_ix+1:centre_tick+(numel(N)-N_ix)) = vp_z(1,centre_tick-N_ix+1:centre_tick+(numel(N)-N_ix)).+ N ;
  elseif ( numel( unique_ticks ) == 1 )
  [ ~ , centre_tick ] = min( abs( y_ax .- unique_ticks ) ) ;
  vp_z( 1 , centre_tick ) = vp_z( 1 , centre_tick ) + vol( ii ) ;
  endif

 endfor

[ ~ , vp_val_centre_ix ] = max( vp_z ) ;
sum_vp_cutoff = 0.7 * sum( vp_z ) ;
count = 1 ;

while ( count ~= 0 )
 
 sum_vp_z = sum( vp_z( max( vp_val_centre_ix - count , 1 ) : min( vp_val_centre_ix + count , numel( vp_z ) ) ) ) ;
 if ( sum_vp_z >= sum_vp_cutoff )
  vp_val( 1 , 1 ) = y_ax( max( vp_val_centre_ix - count , 1 ) ) ;             ## lower
  vp_val( 2 , 1 ) = y_ax( min( vp_val_centre_ix + count , numel( vp_z ) ) ) ; ## upper
  count = 0 ;
 else
  count = count + 1 ;
 endif

 endwhile

endfunction
and this
function hhh=vline(x,in1,in2)
% function h=vline(x, linetype, label)
% 
% Draws a vertical line on the current axes at the location specified by 'x'.  Optional arguments are
% 'linetype' (default is 'r:') and 'label', which applies a text label to the graph near the line.  The
% label appears in the same color as the line.
%
% The line is held on the current axes, and after plotting the line, the function returns the axes to
% its prior hold state.
%
% The HandleVisibility property of the line object is set to "off", so not only does it not appear on
% legends, but it is not findable by using findobj.  Specifying an output argument causes the function to
% return a handle to the line, so it can be manipulated or deleted.  Also, the HandleVisibility can be 
% overridden by setting the root's ShowHiddenHandles property to on.
%
% h = vline(42,'g','The Answer')
%
% returns a handle to a green vertical line on the current axes at x=42, and creates a text object on
% the current axes, close to the line, which reads "The Answer".
%
% vline also supports vector inputs to draw multiple lines at once.  For example,
%
% vline([4 8 12],{'g','r','b'},{'l1','lab2','LABELC'})
%
% draws three lines with the appropriate labels and colors.
% 
% By Brandon Kuczenski for Kensington Labs.
% brandon_kuczenski@kensingtonlabs.com
% 8 November 2001
if length(x)>1  % vector input
    for I=1:length(x)
        switch nargin
        case 1
            linetype='r:';
            label='';
        case 2
            if ~iscell(in1)
                in1={in1};
            end
            if I>length(in1)
                linetype=in1{end};
            else
                linetype=in1{I};
            end
            label='';
        case 3
            if ~iscell(in1)
                in1={in1};
            end
            if ~iscell(in2)
                in2={in2};
            end
            if I>length(in1)
                linetype=in1{end};
            else
                linetype=in1{I};
            end
            if I>length(in2)
                label=in2{end};
            else
                label=in2{I};
            end
        end
        h(I)=vline(x(I),linetype,label);
    end
else
    switch nargin
    case 1
        linetype='r:';
        label='';
    case 2
        linetype=in1;
        label='';
    case 3
        linetype=in1;
        label=in2;
    end
    
    
    
    g=ishold(gca);
    hold on
    y=get(gca,'ylim');
    h=plot([x x],y,linetype);
    if length(label)
        xx=get(gca,'xlim');
        xrange=xx(2)-xx(1);
        xunit=(x-xx(1))/xrange;
        if xunit<0.8
            text(x+0.01*xrange,y(1)+0.1*(y(2)-y(1)),label,'color',get(h,'color'))
        else
            text(x-.05*xrange,y(1)+0.1*(y(2)-y(1)),label,'color',get(h,'color'))
        end
    end

    if g==0
    hold off
    end
    set(h,'tag','vline','handlevisibility','off')
end % else

if nargout
    hhh=h;
end
and produces charts such as this,
which is a 10 minute OHLC chart of the last 3 days, including "today's" ongoing price action. The number of days is a function input, and the horizontal blue and red lines indicate the upper and lower extremes of the value area. The vertical green lines indicate the London opening bar (7am BST) and each set of levels ends at the New York closing bar (5pm EST).

Further examples are the last 10 days and the last month.
Enjoy!


Monday, 18 May 2020

A Volume Profile With Levels Chart

Just a quick post to illustrate the latest of my ongoing chart iterations. It combines a levels chart, as I have recently been posting about, with a refined methodology for creating the horizontal histograms so that they more clearly represent the volumes over distinct periods.
The main change is to replace the use of the Octave barh function with the fill function, which draws each profile as a single polygon and so can be offset to an arbitrary position along the x-axis of the same chart as the candlesticks. A minimal working example of this plotting is given in the code box below.
## get price data of *_ohlc_10m
unix_command = [ "wc" , " " , "-l" , " " , [ price_name , '_ohlc_10m' ] ] ;
## the 'wc -l' command counts the number of lines in [ price_name , '_ohlc_10m' ]
[ ~ , system_out ] = system( unix_command ) ;
cstr = strsplit( system_out , " " ) ; 
lines_in_file = str2double( cstr( 1 , 1 ) ) ;

## read *_ohlc_10m file
price_data = dlmread( [ price_name , '_ohlc_10m' ] , ',' , [ lines_in_file - n_bars , 0 , lines_in_file , 21 ] ) ;
open = price_data(:,18) ; high = price_data(:,19) ; low = price_data(:,20) ; close = price_data(:,21) ; vol = price_data(:,22) ;
high_round = floor( high ./ tick_size .+ 0.5 ) .* tick_size ;
low_round = floor( low ./ tick_size .+ 0.5 ) .* tick_size ;
max_tick_range = max( high_round .- low_round ) / tick_size ;

## create y and x axes for chart
y_max = max( high_round ) + max_tick_range * tick_size ;
y_min = min( low_round ) - max_tick_range * tick_size ;
y_ax = ( y_min : tick_size : y_max )' ;
end_x_ax_freespace = 5 ;

all_vp = zeros( 3 , numel( y_ax ) ) ;

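## build one horizontal volume profile for each block of 50 bars
## (this example assumes n_bars is 150)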
all_vp( 1 , : ) = pcolor_background( y_ax , high(1:50) , low(1:50) , vol(1:50) , tick_size ) ;
all_vp( 2 , : ) = pcolor_background( y_ax , high(51:100) , low(51:100) , vol(51:100) , tick_size ) ;
all_vp( 3 , : ) = pcolor_background( y_ax , high(101:150) , low(101:150) , vol(101:150) , tick_size ) ;

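## sum the three profiles and replicate the result across every x-axis position
## to form the pcolor heatmap background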
vp_z = repmat( sum( all_vp , 1 ) , numel( high ) + end_x_ax_freespace , 1 ) ;

x_ax = ( 1 : 1 : numel( open ) + end_x_ax_freespace )' ;
colormap( 'viridis' ) ; figure( 20 ) ; pcolor( x_ax , y_ax , vp_z' ) ; shading interp ; axis tight ;

## plot the individual volume profiles
hold on ;
scale_factor = 0.18 ; 
fill( all_vp( 1 , : ) .* scale_factor , y_ax' , [99;99;99]./255 ) ; 
fill( all_vp( 2 , : ) .* scale_factor .+ 50 , y_ax' , [99;99;99]./255 ) ;
fill( all_vp( 3 , : ) .* scale_factor .+ 100 , y_ax' , [99;99;99]./255 ) ;

## plot candlesticks
candle_mp( high , low , close , open ) ;
hold off;
I hope readers find this new way of plotting profile charts useful - I certainly am pretty pleased with it.

Friday, 31 July 2020

Currency Strength Candlestick Chart

In my previous posts on currency strength indices I have always visualised the indicator(s) as a line chart, e.g. here. However, after some deep thought, I have now created a way to visualise this as a candlestick chart using Octave's candle function, which, by the way, was written by me. Creating the candlestick body of a currency strength index was quite straightforward: just use the previous currency strength value as the bar's open and the current currency strength value as the close. A simple plot of this, with an overlaid currency strength index line chart, is shown below.
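As a minimal sketch of that body construction (the strength_index variable and the plain candle call are illustrative assumptions, not my actual production code):
## strength_index: column vector of currency strength index values (illustrative name)
open_vals  = [ strength_index( 1 ) ; strength_index( 1 : end - 1 ) ] ; ## previous value as the open
close_vals = strength_index ;                                          ## current value as the close

## with no wick information yet, the highs and lows are just the body extremes
high_vals = max( open_vals , close_vals ) ;
low_vals  = min( open_vals , close_vals ) ;

figure( 1 ) ; candle( high_vals , low_vals , close_vals , open_vals ) ; ## financial package candle
hold on ; plot( strength_index , 'k' , 'linewidth' , 1 ) ; hold off ;   ## overlaid line chart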

Of course the problem with this rendering is that there are no candlestick wicks.

My solution to create the wicks is showcased by the following code snippets
## log of the candle high relative to the body top (max of open/close), a positive value
retval_high_wicks( ii , 6 ) = log( str2double( S.candles{ ii }.mid.h ) / max( str2double( S.candles{ ii }.mid.o ) , str2double( S.candles{ ii }.mid.c ) ) ) ;
## log of the candle low relative to the body bottom (min of open/close), a negative value
retval_low_wicks( ii , 6 ) = log( str2double( S.candles{ ii }.mid.l ) / min( str2double( S.candles{ ii }.mid.o ) , str2double( S.candles{ ii }.mid.c ) ) ) ;
and
## gather the upper wicks of the crosses on one side of the index currency and the negated
## lower wicks of the crosses on the other side, then average the non-zero values per bar
[ ii , ~ , v ] = find( [ retval_high_wicks( : , [32 33 34 35 36 37].+5 ) , -1.*retval_low_wicks( : , [5 20 28].+5 ) ] ) ;
new_index_high_wicks( : , 13 ) = accumarray( ii , v , [] , @mean ) ;
[ ii , ~ , v ] = find( [ retval_low_wicks( : , [32 33 34 35 36 37].+5 ) , -1.*retval_high_wicks( : , [5 20 28].+5 ) ] ) ;
new_index_low_wicks( : , 13 ) = accumarray( ii , v , [] , @mean ) ;
The first snippet shows an addition to the code here, recording the log values of the highs (lows) above (below) the candlestick bodies of all the relevant crosses used in creating the currency strength indices.
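As a quick worked example of what is being recorded (the prices are invented purely for illustration):
o = 1.1800 ; c = 1.1795 ; h = 1.1820 ; l = 1.1780 ;  ## one EUR_USD candle (invented values)
high_wick = log( h / max( o , c ) ) ;                ## = log( 1.1820 / 1.1800 ) approx.  0.00169
low_wick  = log( l / min( o , c ) ) ;                ## = log( 1.1780 / 1.1795 ) approx. -0.00127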

The second snippet shows how the wicks are created, namely by taking the mean log values of high (low) wicks indexed by e.g.
[32 33 34 35 36 37].+5 and [5 20 28].+5
columns of downloaded forex crosses.

The reasoning behind this is as follows. Take, for example, the EUR_USD forex pair: the upper wicks of its bars are recorded as upper wicks for the EUR index candles and as lower wicks for the USD index candles. This reflects the fact that an upper wick in EUR_USD can be viewed either as intrabar EUR strength pushing to new highs, or as intrabar USD weakness pushing the USD index to new lows, which, because USD is the quote currency of the pair, also produces new highs in the cross. A similar, reversed logic applies to the lower wicks of the cross.
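A minimal sketch of that mapping for a single cross, again with invented numbers purely for illustration:
## log wicks of one EUR_USD candle relative to its body extremes (invented values)
eur_usd_high_wick = 0.00169 ;  ## positive: price pushed above the body top
eur_usd_low_wick  = -0.00127 ; ## negative: price pushed below the body bottom

## EUR is the base currency, so the cross's wicks carry over directly to the EUR index candle
eur_index_high_wick = eur_usd_high_wick ;
eur_index_low_wick  = eur_usd_low_wick ;

## USD is the quote currency, so the mapping is inverted: a push to new highs in the cross
## is a push to new lows in the USD index candle, and vice versa
usd_index_high_wick = -eur_usd_low_wick ;
usd_index_low_wick  = -eur_usd_high_wick ;
In the actual code each index candle's wick is the mean of such contributions across all the crosses containing the relevant currency, which is what the accumarray calls in the second snippet compute.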

Below are charts of currency strength index candles created according to this methodology
The upper pane shows GBP currency strength index candles and the lower pane the same for USD. This is basically price action for Thursday, 30th July, 2020. The green vertical lines are the London and New York opens respectively, the red vertical line is the London close and the charts end at more or less the New York close. Bars prior to the London open are obviously the overnight Asian session.

My contemporaneous volume profile chart is the upper right pane below
 
From these charts it is easy to discern that the upward movement of GBP_USD during the main London session was due to GBP strength, whilst after the London close the continued upward movement of GBP_USD was due to USD weakness.

However, the point of this blog post was not to pass commentary on FX price movements, but to illustrate a methodology of creating candlestick charts for currency strength indices.

Enjoy!