
Friday, 11 November 2022

A New PositionBook Chart Type

It has been almost 6 months since I last posted, due to working on a house renovation. However, I have still been thinking about/working on stuff, particularly on analysis of open position ratios. I had tried using this data as features for machine learning, but my thinking has evolved somewhat and I have reduced my ambition/expectation for this type of data.

Before I get into this I'd like to mention Trader Dale (I have no affiliation with him) as I have recently been following his volume profile set-ups, a screenshot of one being shown below.

This shows recent Wednesday action in the EUR_GBP pair on a 30 minute chart. The flexible volume profile set-up Trader Dale describes is called a Volume Accumulation Set-up, which occurs immediately prior to a big break (in this case up). The premise of this particular set-up is that the volume accumulation area will act as future support, off which price will bounce, as shown by the "hand drawn" lines. Below is my version of the above chart, with a bit of extra price action included. The horizontal yellow lines show the support area.

Now here is the same data, but in what I'm calling a PositionBook chart, which uses Oanda's Position Level data downloaded via their API.

The blue (red) horizontal lines show the levels at which traders are net long (short) in terms of positions actually entered/held, with brighter colours indicating a greater imbalance between longs and shorts. It is obvious that the volume accumulation set-up area shows a net accumulation of long positions, and this indicates the direction of the anticipated breakout long before it happens. The Trader Dale set-up, by contrast, infers the accumulation of longs from the resultant breakout direction and so doesn't seem to provide an opportunity to participate in the breakout itself!
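As an aside, the underlying data comes from Oanda's positionBook endpoint. The snippet below is only a minimal sketch of pulling it from within Octave - the token, account type, jsondecode availability (a recent Octave) and the parsing of the returned buckets are all assumptions on my part, and the actual chart code does considerably more processing.

## minimal sketch only: fetch the current EUR_GBP position book from Oanda's v20 REST API
token = 'YOUR_API_TOKEN' ;                        ## assumed: a valid v20 API token
instrument = 'EUR_GBP' ;
url = [ 'https://api-fxtrade.oanda.com/v3/instruments/' , instrument , '/positionBook' ] ;
[ ~ , json_out ] = system( [ 'curl -s -H "Authorization: Bearer ' , token , '" "' , url , '"' ] ) ;
pb = jsondecode( json_out ) ;                     ## assumed: Octave recent enough to have jsondecode
buckets = pb.positionBook.buckets ;               ## one struct per price level
prices = str2double( { buckets.price } )' ;
net_long = str2double( { buckets.longCountPercent } )' .- str2double( { buckets.shortCountPercent } )' ;
## positive net_long at a price level corresponds to the blue lines above, negative to the red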

The next chart shows the action of the following day or so, where price does indeed come back down to the "support" area but doesn't immediately bounce off the support level. The following order level chart perhaps shows why there was no bounce: the relative absence of open orders at that level.

The equivalent PositionBook chart, including a bit more price action, shows that after price fails to bounce off the support level it recovers back into it, and then even more long positions are accumulated (the darker blue shade) at the support level during the London open. Again this would have allowed one to position oneself for the ensuing rise during the London morning session, which was followed by another long accumulation during the New York opening session for a further leg up into the London close (the last vertical red line).

The purpose of this post is not to criticise the Trader Dale set-up, but rather to highlight the potential value-add of these new PositionBook charts. They seem to hold promise for indicating price direction, and I intend to continue investigating/improving them in the coming weeks.

More in due course.

Saturday, 4 September 2021

"Matrix profile: Using Weakly Labeled Time Series to Predict Outcomes" Paper

Back in May of this year I posted about how I had intended to use Matrix Profile (MP) to somehow cluster the "initial balance" of Market Profile charts with a view to getting a heads up on immediately following price action. Since then, my thinking has evolved due to my learning about the paper "Matrix profile: Using Weakly Labeled Time Series to Predict Outcomes" and its companion website. This very much seems to accomplish the same end I had envisaged with my clustering of initial balances, so I am going to try and use this approach instead.

As a preliminary, I have decided to "weakly label" my time series data using the simple code loop shown below.

for ii = 1 : numel( ix )

  ## price path following the initial balance (e.g. the London a.m. session)
  y_values = train_data( ix( ii ) + 1 : ix( ii ) + 19 , 1 ) ;
  london_session_ret = y_values( end ) - y_values( 1 ) ;

  ## maximum favourable excursion for a long entered at the first value
  [ max_y , max_ix ] = max( y_values ) ;
  max_long_ex = max_y - y_values( 1 ) ;

  ## maximum adverse excursion (a negative number) for the same long
  [ min_y , min_ix ] = min( y_values ) ;
  max_short_ex = min_y - y_values( 1 ) ;

  ## long label: session closes up, reward/risk >= 3 and the low precedes the high
  if ( london_session_ret > 0 && ( max_long_ex / ( -1 * max_short_ex ) ) >= 3 && max_ix > min_ix )
    labels( ix( ii ) - 11 : ix( ii ) , 1 ) = 1 ;
  ## short label: session closes down, reward/risk >= 3 and the high precedes the low
  elseif ( london_session_ret < 0 && ( max_short_ex / max_long_ex ) <= -3 && max_ix < min_ix )
    labels( ix( ii ) - 11 : ix( ii ) , 1 ) = -1 ;
  endif

endfor
What this essentially does (for the long side) is ensure that: price is higher at the end of y_values than at the beginning; there is a reward/risk opportunity of at least 3:1 for at least one trade during the period covered by the time range of y_values (either the London a.m. session or the combined New York a.m./London p.m. session), following a 7 a.m. to 8.50 a.m. (local time) formation of an opening Market Profile/initial balance; and the maximum adverse excursion occurs before the maximum favourable excursion. A typical chart on the long side looks like this.
This would have the "weak" label for a long trade, and the label would be applied to the Market Profile data that immediately precedes this price action. On the other side, a short labelled chart typically looks like this.
As can be seen, trading "against the label" offers few opportunities for profitable entries/exits. My hope is that a "dictionary" of long/short biased Market Profile patterns can be discovered using the ideas/code in the links above. For completeness, the following chart is typical of price action which does not meet the looped code bias for either long or short.

It is easy to envisage trading this type of price action by fading moves that go outside the "value area" of a Market Profile chart.

More in due course.


Friday, 27 August 2021

Another Iterative Improvement of my Volume/Market Profile Charts

Below is a screenshot of this new chart version, of today's (Friday's) price action at a 10 minute bar scale:

Just by looking at the chart it might not be obvious to readers what has changed, so the changes are detailed below.

The first change is in how the volume profile (the horizontal histogram on the left) is calculated. The "old" version of the chart calculates the profile by assuming the "model" that tick volume for each 10 minute bar is normally distributed across the high/low range of the bar, and then the profile histogram is the accumulation of these individual, 10 minute, normally distributed "mini profiles." A more complete description of this is given in my Market Profile Chart in Octave blog post, with code.

The new approach is data centric rather than model based. Every 10 minutes, instead of downloading the 10 minute OHLC and tick volume, the last 10 minutes' worth of 5 second OHLC and tick volume is downloaded. The whole tick volume of each 5 second period is assigned to a price level equivalent to the Typical Price (rounded to the nearest pip) of that 5 second period, and the volume profile is then the accumulation of these volume ticks per price level. I think this is a much more accurate reflection of the price levels at which tick volume actually occurred compared to the old, model based charts. This second screenshot is of the old chart over the exact same price data as the first, improved version of the chart.

It can be seen that the two volume profile histograms of the respective charts differ from each other in terms of their overall shape and the number and price levels of peaks (Points of Control) and troughs (Low Volume Nodes).
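To make the new accumulation step concrete, the following is a minimal sketch only; the variable names high_5s, low_5s, close_5s and vol_5s are my own stand-ins for the downloaded 5 second data, not the actual chart code.

## minimal sketch: accumulate each 5 second bar's whole tick volume at its rounded Typical Price
typ = ( high_5s .+ low_5s .+ close_5s ) ./ 3 ;                ## typical price per 5 second bar
typ = floor( typ ./ tick_size .+ 0.5 ) .* tick_size ;         ## round to the nearest pip
y_ax = ( min( typ ) : tick_size : max( typ ) )' ;             ## price levels of the profile
profile = zeros( size( y_ax ) ) ;
for ii = 1 : numel( typ )
  [ ~ , level_ix ] = min( abs( y_ax .- typ( ii ) ) ) ;        ## nearest price level
  profile( level_ix ) = profile( level_ix ) + vol_5s( ii ) ;  ## add the whole bar's tick volume
endfor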

The second change in the new chart is in how the background heatmap is plotted. The heatmap is a different presentation of the volume profile whereby higher volume price levels are shown by the brighter yellow colours. The old chart only displays the heatmap associated with the latest calculated volume profile histogram, which is projected back in time. This is, of course, a form of lookahead bias when plotting past prices over the latest heatmap. The new chart solves this by plotting a "rolling" version of the heatmap which reflects the volume profile that was in force at the time each 10 minute OHLC candle formed. It is easy to see how the Points of Control and Low Volume Nodes price levels ebb and flow throughout the trading day.
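The rolling logic itself is simple: each 10 minute candle is assigned the profile as it stood when that candle completed. A minimal sketch, assuming a matrix bar_profiles whose ii-th row is the profile of bar ii alone, might look like this.

## minimal sketch: a rolling heatmap in which row ii of vp_z holds the normalised
## profile as it stood when bar ii completed, so there is no lookahead bias
n_bars = size( bar_profiles , 1 ) ;
vp_z = zeros( n_bars , size( bar_profiles , 2 ) ) ;
running_profile = zeros( 1 , size( bar_profiles , 2 ) ) ;
for ii = 1 : n_bars
  running_profile = running_profile .+ bar_profiles( ii , : ) ;
  vp_z( ii , : ) = running_profile ./ max( max( running_profile ) , eps ) ;
endfor
## vp_z' can then be passed to pcolor as the chart background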

The third change, which naturally followed on from the downloading of 5 second data, is in the plotting of the candlesticks. Rather than having a normal, open to close candlestick body, the candlesticks show the "mini volume profiles" of the tick volume within each bar, plotted via Octave's patch function. The white candlestick wicks indicate the usual high/low range, and the open and close levels are shown by grey and black dots respectively. This is more clearly seen in the zoomed in screenshot below.
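Purely as an illustration of the patch-based drawing, and not the actual chart code, a single bar's mini profile could be drawn along these lines, assuming bar_levels and bar_vol hold that bar's price levels and per-level tick volume.

## minimal sketch: draw one bar's mini volume profile with patch at x position xi
widths = 0.8 .* bar_vol ./ max( bar_vol ) ;       ## scale volumes to a fraction of the bar width
hold on ;
for jj = 1 : numel( bar_levels )
  patch( xi .+ [ 0 widths( jj ) widths( jj ) 0 ] , ...
         bar_levels( jj ) .+ [ -0.5 -0.5 0.5 0.5 ] .* tick_size , ...
         [ 0.6 0.6 0.6 ] ) ;                      ## one grey rectangle per price level
endfor
hold off ;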

I wanted to plot these types of bars because I have recently watched some trading webcasts which talked about "P", "b" and "D" shaped bar profiles at "areas of interest." The upshot of these webcasts is that, in general, a "P" bar is bullish, a "b" bar is bearish and a "D" bar is "in balance" when they intersect an "area of interest" such as a Point of Control, a Low Volume Node, support and resistance etc. This is supposed to be indicative of future price direction over the immediate short term. With this new version of the chart, I shall be in a position to investigate these claims for myself.

Monday, 5 July 2021

Market Profile Low Volume Node Chart

As a diversion from my recent work with Matrix Profile, I have completed work on a new chart type in Octave, namely a Market Profile Low Volume Node (LVN) chart, two slightly different versions of which are shown below.

This first one is derived from a TPO chart, whilst the next
is derived from a Volume profile chart.

The horizontal lines are drawn at levels which are considered to be "lows" in the underlying, but not shown, TPO/Volume profiles. The yellow lines are "stronger lows" than the green lines, and the blue lines are extensions of the previous day's "strong lows" in force at the end of that day's trading.

The point of all this, according to online guru theory, is that price is expected to be "rejected" at LVNs by either bouncing, a la support or resistance, or by price powering through the LVN level, usually on increased volume. The charts show the rolling development of the LVNs as the underlying profiles change throughout the day, hence lines can appear and disappear and change colour. As this is a new avenue of investigation for me I feel it is too soon to make a comment on these lines' efficacy, but it does seem uncanny how price very often seems to react to these levels.
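For readers wondering how such profile "lows" might be found algorithmically, the following is a minimal sketch only (the real code is more involved, and the yellow/green "strength" distinction is not shown): flag any interior price level whose profile count is lower than both of its neighbours.

## minimal sketch: candidate Low Volume Node levels from a profile histogram,
## where profile( k ) is the TPO count or volume at price level y_ax( k )
interior = 2 : numel( profile ) - 1 ;
is_lvn = ( profile( interior ) < profile( interior - 1 ) ) & ...
         ( profile( interior ) < profile( interior + 1 ) ) ;
lvn_levels = y_ax( interior( is_lvn ) ) ;         ## price levels at which to draw the lines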

More in due course.

Wednesday, 26 May 2021

Update on Recent Matrix Profile Work

Since my previous post, on Matrix Profile (MP), I have been doing a lot of online reading about MP and going back to various source papers and code that are available at the UCR Matrix Profile page. I have been doing this because, despite my initial enthusiasm, the R tsmp package didn't turn out to be suitable for what I wanted to do, or perhaps more correctly I couldn't hack it to get the sort of results I wanted, hence my need to go to "first principles" and code from the UCR page.

Readers may recall that my motivation was to look for time series motifs that form "initial balance (IB)" set ups of Market Profile charts. The rationale for this is that different IBs are precursors to specific market tendencies which may provide a clue or an edge in subsequent market action. A typical scenario from the literature on Market Profile might be "an Open Test Drive can often indicate one of the day's extremes." If this is actually true, one could go long/short with a high confidence stop at the identified extreme. Below is a screenshot of some typical IB profiles:

where each letter typically represents a 30 minute period of market action. The problem is that Market Profile charts, to me at least, are inherently visual and therefore do not easily lend themselves to an algorithmic treatment, which makes it difficult to back test in a robust fashion. This is why I have been trying to use MP.

The first challenge I faced was how to preprocess price action data such as OHLC and volume such that I could use MP. In the end I resorted to using the mid-price, the high-low range and (tick) volume as proxies for market direction, market volatility and market participation. Because IBs occur over market opens, I felt it was important to use the volatility and participation proxies as these are important markers for the sentiment of subsequent price action. This choice necessitated using a multivariate form of MP, and I used the basic MP STAMP code that is available at Matrix Profile VI: Meaningful Multidimensional Motif Discovery, with some slight tweaks for my use case.

Having the above tools in hand, what should they be used for? I decided that cluster analysis is what is needed, i.e. clustering using the motifs that MP can discover. For this purpose, I used the approach outlined in section 3.9 of the paper "The Swiss Army Knife of Time Series Data Mining." The reasoning behind this choice is that if, for example, an "Open Test Drive IB" is a real thing, it should occur frequently enough that time series sub-sequences of it can be clustered or associated with an "Open Test Drive IB" motif. If all such prototype motifs can be identified and all IBs can be assigned to one of them, subsequent price action can be investigated to check anecdotal claims such as the one quoted above.

My Octave code implementation of the linked Swiss Army Knife routine is shown in the code box below.

data = dlmread( '/path/to/mv_data' ) ;
skip_loc = dlmread( '/path/to/skip_loc' ) ;
skip_loc_copy = find( skip_loc ) ; skip_loc_copy2 = skip_loc_copy ; skip_loc_copy3 = skip_loc_copy ;
sub_len = 9 ;
data_len = size( data , 1 ) ;
data_to_use = [ (data(:,2).+data(:,3))./2 , data(:,2).-data(:,3) , data(:,5) ] ;

must_dim = [] ;
exc_dim = [] ;
[ pro_mul , pro_idx , data_freq , data_mu , data_sig ] = multivariate_stamp( data_to_use, sub_len, must_dim, exc_dim, skip_loc ) ;
original_single_MP = pro_mul( : , 1 ) ; ## just mid price
original_single_MP2 = original_single_MP .+ pro_mul( : , 2 ) ; ## mid price and hi-lo range
original_single_MP3 = original_single_MP2 .+ pro_mul( : , 3 ) ; ## mid price, hi-lo range and volume

## Swiss Army Knife Clustering
RelMP = original_single_MP ; RelMP2 = original_single_MP2 ; RelMP3 = original_single_MP3 ;
DissMP = inf( length( RelMP ) , 1 ) ; DissMP2 = DissMP ; DissMP3 = DissMP ; 
minValStore = [] ; minIdxStore = [] ; minValStore2 = [] ; minIdxStore2 = [] ; minValStore3 = [] ; minIdxStore3 = [] ;
## set up a recording matrix 
all_dist_pro = zeros( size( RelMP , 1 ) , size( data_to_use , 2 ) ) ;

for ii = 1 : 500
## reset recording matrix for this ii loop  
all_dist_pro( : , : ) = 0 ;

## just mid price
[ minVal , minIdx ] = min( RelMP ) ;
minValStore = [ minValStore ; minVal ] ; minIdxStore = [ minIdxStore ; minIdx ] ;
DissmissRange = data_to_use( minIdx : minIdx + sub_len - 1 , : ) ;
[ dist_pro , ~ ] = multivariate_mass (data_freq(:,1), DissmissRange(:,1), data_len, sub_len, data_mu(:,1), data_sig(:,1), data_mu(minIdx,1), data_sig(minIdx,1) ) ;
all_dist_pro( : , 1 ) = real( dist_pro ) ;
JMP = all_dist_pro( : , 1 ) ;
DissMP = min( DissMP , JMP ) ; ## dismiss all motifs discovered so far
RelMP = original_single_MP ./ DissMP ;
skip_loc_copy = unique( [ skip_loc_copy ; ( minIdx : 1 : minIdx + sub_len - 1 )' ] ) ;
RelMP( skip_loc_copy ) = 1 ;

## mid price and hi-lo range
[ minVal , minIdx ] = min( RelMP2 ) ;
minValStore2 = [ minValStore2 ; minVal ] ; minIdxStore2 = [ minIdxStore2 ; minIdx ] ;
DissmissRange = data_to_use( minIdx : minIdx + sub_len - 1 , : ) ;
[ dist_pro , ~ ] = multivariate_mass (data_freq(:,1), DissmissRange(:,1), data_len, sub_len, data_mu(:,1), data_sig(:,1), data_mu(minIdx,1), data_sig(minIdx,1) ) ;
all_dist_pro( : , 2 ) = real( dist_pro ) ;
[ dist_pro , ~ ] = multivariate_mass (data_freq(:,2), DissmissRange(:,2), data_len, sub_len, data_mu(:,2), data_sig(:,2), data_mu(minIdx,2), data_sig(minIdx,2) ) ;
all_dist_pro( : , 2 ) = all_dist_pro( : , 2 ) .+ real( dist_pro ) ;
JMP2 = all_dist_pro( : , 2 ) ;
DissMP2 = min( DissMP2 , JMP2 ) ; ## dismiss all motifs discovered so far
RelMP2 = original_single_MP2 ./ DissMP2 ;
skip_loc_copy2 = unique( [ skip_loc_copy2 ; ( minIdx : 1 : minIdx + sub_len - 1 )' ] ) ;
RelMP2( skip_loc_copy2 ) = 1 ;

## mid price, hi-lo range and volume
[ minVal , minIdx ] = min( RelMP3 ) ;
minValStore3 = [ minValStore3 ; minVal ] ; minIdxStore3 = [ minIdxStore3 ; minIdx ] ;
DissmissRange = data_to_use( minIdx : minIdx + sub_len - 1 , : ) ;
[ dist_pro , ~ ] = multivariate_mass (data_freq(:,1), DissmissRange(:,1), data_len, sub_len, data_mu(:,1), data_sig(:,1), data_mu(minIdx,1), data_sig(minIdx,1) ) ;
all_dist_pro( : , 3 ) = real( dist_pro ) ;
[ dist_pro , ~ ] = multivariate_mass (data_freq(:,2), DissmissRange(:,2), data_len, sub_len, data_mu(:,2), data_sig(:,2), data_mu(minIdx,2), data_sig(minIdx,2) ) ;
all_dist_pro( : , 3 ) = all_dist_pro( : , 3 ) .+ real( dist_pro ) ;
[ dist_pro , ~ ] = multivariate_mass (data_freq(:,3), DissmissRange(:,3), data_len, sub_len, data_mu(:,3), data_sig(:,3), data_mu(minIdx,3), data_sig(minIdx,3) ) ;
all_dist_pro( : , 3 ) = all_dist_pro( : , 3 ) .+ real( dist_pro ) ;
JMP3 = all_dist_pro( : , 3 ) ;
DissMP3 = min( DissMP3 , JMP3 ) ; ## dismiss all motifs discovered so far
RelMP3 = original_single_MP3 ./ DissMP3 ;
skip_loc_copy3 = unique( [ skip_loc_copy3 ; ( minIdx : 1 : minIdx + sub_len - 1 )' ] ) ;
RelMP3( skip_loc_copy3 ) = 1 ;

endfor ## end ii loop

There are a few things to note about this code:

  • the use of a skip_loc vector 
  • a sub_len value of 9
  • 3 different calculations for DissMP and RelMP vectors

i) The skip_loc vector is a vector of time series indices (Idx) for which the MP and possible cluster motifs should not be calculated. This avoids identifying motifs from data sequences that do not actually occur in the underlying data, an artefact of the way I concatenated the data during pre-processing, i.e. 7am to 9am, 7am to 9am, ... etc.

ii) A sub_len value of 9 means 9 x 10 minute OHLC bars, i.e. 90 minutes, to match the 30 minute A, B and C periods of the above IB screenshot.

iii)  3 different calculations because different combinations of the underlying data are used. 

This last part probably needs more explanation. A multivariate RelMP is created by adding together individual dist_pros (distance profiles), and cluster motif identification is achieved by finding minima in the RelMP; however, a minimum of a multivariate RelMP is generally different from the minima of the individual, univariate RelMPs. What my code does is use a univariate RelMP of the mid price plus two multivariate RelMPs: one of mid price and high-low range, and one of mid price, high-low range and volume. This gives 3 sets of minimum values and their indices (the minValStore and minIdxStore vectors above), one for each combination of data. The idea is to run the ii loop for, e.g., 500 iterations and then to identify possible "robust" IB cluster motifs by using the Octave intersect function to get the minIdx values that are common to all 3 sets of Idx data, as sketched below.
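Using the variable names from the code box above, that intersection step amounts to no more than:

## candidate "robust" IB motif start indices common to all three RelMP runs
common_idx = intersect( intersect( minIdxStore , minIdxStore2 ) , minIdxStore3 ) ;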

By way of example, setting the ii loop iteration to just 100 results in only one intersect Idx value on some EUR_USD forex data, the plot of which is shown below:

Comparing this with the IB screenshot above, I would say this represents a typical "Open Auction" process with prices rotating upwards/downwards with no real conviction either way, with a possible long breakout on the last bar or alternatively, a last upwards test before a price plunge.

My intent is to use the above methodology to get a set of candidate IB motifs upon which a clustering algorithm can be based. This clustering algorithm will be the subject of my next post.

Friday, 5 February 2021

A Forex Pair Snapshot Chart

After yesterday's Heatmap Plot of Forex Temporal Clustering post I thought I would consolidate all the chart types I have recently created into one easy, snapshot overview type of chart. Below is a typical example of such a chart, this being today's 10 minute EUR_USD forex pair chart up to a few hours after the London session close (the red vertical line).


The top left chart is a Market/Volume Profile Chart with added rolling Value Area upper and lower bounds (the cyan, red and white lines) and also rolling Volume Weighted Average Price with upper and lower standard deviation lines (magenta).

The bottom left chart is the turning point heatmap chart as described in yesterday's post.

The two rightmost charts are also Market/Volume Profile charts, but of my Currency Strength Candlestick Charts based on my Currency Strength Indicator. The upper one is the base currency, i.e. EUR, and the lower is the quote currency. 
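For reference, the rolling VWAP and its standard deviation bands in the top left pane are computed along these general lines; this is a minimal sketch only, accumulating from the session open, and the rolling Value Area calculation is not shown.

## minimal sketch: session VWAP with +/- one standard deviation bands
typ = ( high .+ low .+ close ) ./ 3 ;                          ## typical price per bar
cum_vol = cumsum( vol ) ;
vwap = cumsum( typ .* vol ) ./ cum_vol ;                       ## volume weighted average price
vwap_var = cumsum( vol .* typ .^ 2 ) ./ cum_vol .- vwap .^ 2 ; ## volume weighted variance about VWAP
vwap_sd = sqrt( max( vwap_var , 0 ) ) ;                        ## guard against tiny negative values
upper_band = vwap .+ vwap_sd ;
lower_band = vwap .- vwap_sd ;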

The following charts are the same day's charts for GBP_USD, USD_CHF and, finally, USD_JPY.
The regularity of the turning points is easily seen in the lower lefthand charts although, of course, this is to be expected as they all share the USD as a common currency. However, there are also subtle differences to be seen in the "shadows" of the lighter areas.

For the near future my self-assigned task will be to observe the forex pairs, in real time, through the prism of the above style of chart and do some mental paper trading, and perhaps some really small size, discretionary live trading, in addition to my normal routine of research and development.


Friday, 31 July 2020

Currency Strength Candlestick Chart

In my previous posts on currency strength indices I have always visualised the indicator(s) as a line chart, e.g. here. However, after some deep thought, I have now created a way to visualise this as a candlestick chart using Octave's candle function which, by the way, was written by me. Creating the candlestick body of a currency strength index was quite straightforward: just use the previous currency strength value as the bar's open and the current currency strength value as the close. A simple plot of this, with an overlaid currency strength index line chart, is

Of course the problem with this rendering is that there are no candlestick wicks.
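In code terms the wickless bodies amount to no more than the sketch below, where strength_index is an assumed vector of currency strength values rather than the actual chart code.

## minimal sketch: wickless candles from a currency strength index vector
open_vals = [ strength_index( 1 ) ; strength_index( 1 : end - 1 ) ] ;  ## previous value as the open
close_vals = strength_index ;                                          ## current value as the close
## with no wick information yet, the highs/lows are just the body extremes
candle( max( open_vals , close_vals ) , min( open_vals , close_vals ) , close_vals , open_vals ) ;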

My solution to create the wicks is showcased by the following code snippets
retval_high_wicks( ii , 6 ) = log( str2double( S.candles{ ii }.mid.h ) / max( str2double( S.candles{ ii }.mid.o ) , str2double( S.candles{ ii }.mid.c ) ) ) ;
retval_low_wicks( ii , 6 ) = log( str2double( S.candles{ ii }.mid.l ) / min( str2double( S.candles{ ii }.mid.o ) , str2double( S.candles{ ii }.mid.c ) ) ) ;
and
[ ii , ~ , v ] = find( [ retval_high_wicks( : , [32 33 34 35 36 37].+5 ) , -1.*retval_low_wicks( : , [5 20 28].+5 ) ] ) ;
new_index_high_wicks( : , 13 ) = accumarray( ii , v , [] , @mean ) ;
[ ii , ~ , v ] = find( [ retval_low_wicks( : , [32 33 34 35 36 37].+5 ) , -1.*retval_high_wicks( : , [5 20 28].+5 ) ] ) ;
new_index_low_wicks( : , 13 ) = accumarray( ii , v , [] , @mean ) ;
The first snippet shows an addition to the code here, recording the log values of the highs (lows) above (below) the candlestick bodies of all the relevant currencies used in creating the currency strength indices.

The second snippet shows how the wicks are created, namely by taking the mean log values of high (low) wicks indexed by e.g.
[32 33 34 35 36 37].+5 and [5 20 28].+5
columns of downloaded forex crosses.

The reasoning behind this is as follows. Take, for example, the EUR_USD forex pair: the upper wicks of its bars are recorded as upper wicks for the EUR index candles and as lower wicks for the USD index candles. This reflects the fact that upper wicks in EUR_USD can be viewed either as intrabar EUR strength pushing to new highs or, alternatively, as USD weakness pushing the USD index candle to new lows which, because the USD is the quote currency of the pair, also produces new highs in the cross. A similar, reversed logic applies to the low wicks of the cross.

Below are charts of currency strength index candles created according to this methodology
The upper pane shows GBP currency strength index candles and the lower pane the same for USD. This is basically price action for Thursday, 30th July, 2020. The green vertical lines are the London and New York opens respectively, the red vertical line is the London close and the charts end at more or less the New York close. Bars prior to the London open are obviously the overnight Asian session.

My contemporaneous volume profile chart is the upper right pane below
 
From these charts it is easy to discern that the upward movement of GBP_USD during the main London session was due to GBP strength, whilst after the London close the continued upward movement of GBP_USD was due to USD weakness.

However, the point of this blog post was not to pass commentary on FX price movements, but to illustrate a methodology of creating candlestick charts for currency strength indices.

Enjoy!

Wednesday, 20 May 2020

An Improved Volume Profile Chart with Levels

Without much ado, here is the code
## Copyright (C) 2020 dekalog
## 
## This program is free software: you can redistribute it and/or modify it
## under the terms of the GNU General Public License as published by
## the Free Software Foundation, either version 3 of the License, or
## (at your option) any later version.
## 
## This program is distributed in the hope that it will be useful, but
## WITHOUT ANY WARRANTY; without even the implied warranty of
## MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE.  See the
## GNU General Public License for more details.
## 
## You should have received a copy of the GNU General Public License
## along with this program.  If not, see
## <https://www.gnu.org/licenses/>.

## -*- texinfo -*- 
## @deftypefn {} {@var{retval} =} market_profile_plot (@var{cross}, @var{n_bars})
##
## Plot a Market Profile Chart of CROSS of the last N_BARS.
##
## @seealso{}
## @end deftypefn

## Author: dekalog 
## Created: 2020-05-11

function market_profile_plot( curr_cross , n_days )

pkg load statistics ; 
cd /path/to/data/folder ;
price_name = tolower( curr_cross ) ;

if ( strcmp( price_name , 'aud_jpy' ) || strcmp( price_name , 'eur_jpy' ) || strcmp( price_name , 'gbp_jpy' ) || ...
     strcmp( price_name , 'usd_jpy' ) )
 tick_size = 0.001 ;
 round_digit = 3 ;
elseif ( strcmp( price_name , 'xau_usd' ) )
 tick_size = 0.1 ;
 round_digit = 1 ;
elseif ( strcmp( price_name , 'xag_usd' ) )
 tick_size = 0.01 ;
 round_digit = 2 ; 
else
 tick_size = 0.0001 ;
 round_digit = 4 ;
endif

## get price data of *_ohlc_10m
unix_command = [ "wc" , " " , "-l" , " " , [ price_name , '_ohlc_10m' ] ] ;
## the 'wc' command with the '-l' flag counts the number of lines in the [ price_name , '_ohlc_10m' ] file
[ ~ , system_out ] = system( unix_command ) ;
cstr = strsplit( system_out , " " ) ; 
lines_in_file = str2double( cstr( 1 , 1 ) ) ;

## read *_ohlc_10m file
price_data = dlmread( [ price_name , '_ohlc_10m' ] , ',' , [ lines_in_file - ( n_days * 144 + 18 ) , 0 , lines_in_file , 21 ] ) ;
## get the earliest London open on a Sunday, if any
sun_open_ix = find( ( price_data( : , 11 ) == 1 ) .* ( price_data( : , 9 ) == 22 ) .* ( price_data( : , 10 ) == 0 ) ) ;
## get weekday closes
end_ix = find( ( price_data( : , 15 ) == 16 ) .* ( price_data( : , 16 ) == 50  ) ) ;
delete_ix = unique( [ sun_open_ix ; end_ix ] ) ;
## delete unwanted data
price_data( 1 : delete_ix( 1 ) , : ) = [] ; end_ix = end_ix .- delete_ix( 1 ) ; open_ix = end_ix .+ 1 ; 
end_ix( end_ix == 0 ) = [] ; end_ix( end_ix > size( price_data , 1 ) ) = [] ;
open_ix( open_ix == 0 ) = [] ; open_ix( open_ix > size( price_data , 1 ) ) = [] ;

## give names to data
open = price_data(:,18) ; high = price_data(:,19) ; low = price_data(:,20) ; close = price_data(:,21) ; vol = price_data(:,22) ;
high_round = floor( high ./ tick_size .+ 0.5 ) .* tick_size ;
low_round = floor( low ./ tick_size .+ 0.5 ) .* tick_size ;
max_tick_range = max( high_round .- low_round ) / tick_size ;
upper_val = high ; lower_val = low ;

## create y and x axes for chart
y_max = max( high_round ) + max_tick_range * tick_size ;
y_min = min( low_round ) - max_tick_range * tick_size ;
y_ax = ( y_min : tick_size : y_max )' ;
end_x_ax_freespace = 5 ;

## create container
all_vp = zeros( n_days , numel( y_ax ) ) ; all_mp = all_vp ;

if ( n_days == 1 )

[ all_vp(1,:) , vp_val ] = pcolor_background( y_ax , high , low , vol , tick_size ) ;
vp_z = repmat( all_vp( 1 , : ) , numel( high ) + end_x_ax_freespace , 1 ) ;
lower_val( : ) = vp_val( 1 ) ; upper_val( : ) = vp_val( 2 ) ; 

elseif ( n_days >= 2 )

vp_z = zeros( numel( high ) + end_x_ax_freespace , size( all_vp , 2 ) ) ;

 for ii = 1 : numel( end_ix ) 
 [ all_vp(ii,:) , vp_val ] = pcolor_background( y_ax , high(open_ix(ii):end_ix(ii)) , low(open_ix(ii):end_ix(ii)) , ...
                                                        vol(open_ix(ii):end_ix(ii)) , tick_size ) ;
 vp_z(open_ix(ii):end_ix(ii),:) = repmat( all_vp(ii,:)./max(all_vp(ii,:)) , numel( high(open_ix(ii):end_ix(ii)) ) , 1 ) ;
 lower_val( open_ix(ii) : end_ix(ii) ) = vp_val( 1 ) ; upper_val( open_ix(ii) : end_ix(ii) ) = vp_val( 2 ) ;
 endfor

[ all_vp(end,:) , vp_val ] = pcolor_background( y_ax , high(open_ix(end):end) , low(open_ix(end):end) , ...
                                                         vol(open_ix(end):end) , tick_size ) ;
vp_z( open_ix( end ) : end , : ) = repmat( all_vp( end , : ) ./ max( all_vp( end , : ) ) , ...
                                            numel( high( open_ix( end ) : end ) ) + end_x_ax_freespace , 1 ) ;
lower_val( open_ix( end ) : end ) = vp_val( 1 ) ; upper_val( open_ix( end ) : end ) = vp_val( 2 ) ;
endif

## create the background ( best choices - viridis and ocean? )
x_ax = ( 1 : 1 : numel( open ) + end_x_ax_freespace )' ;
colormap( 'viridis' ) ; figure( 10 ) ; pcolor( x_ax , y_ax , vp_z' ) ; shading interp ; axis tight ;

## plot the individual volume profiles
hold on ;

scale_factor = ( 1 / max(max(all_vp) ) ) * 72 ;
for ii = 1 : numel( open_ix )
figure( 10 ) ; fill( all_vp( ii , : ) .* scale_factor .+ open_ix( ii ) , y_ax' , [99;99;99]./255 ) ;
endfor

## plot candlesticks
figure( 10 ) ; candle_mp( high , low , close , open ) ;

## plot upper and lower boundaries of value area
hold on ; figure( 10 ) ; plot( lower_val , 'b' , 'linewidth' , 2 , upper_val , 'r' , 'linewidth' , 2 ) ; hold off ;

## Plot vertical lines for London open at 7am
london_ix = find( ( price_data( : , 9 ) == 7 ) .* ( price_data( : , 10 ) == 0 ) ) ;
if ( ~isempty( london_ix ) )
 for ii = 1 : numel( london_ix )
  figure( 10 ) ; vline( london_ix( ii ) , 'g' ) ;
 endfor
endif

endfunction
which calls this
## Copyright (C) 2020 dekalog
## 
## This program is free software: you can redistribute it and/or modify it
## under the terms of the GNU General Public License as published by
## the Free Software Foundation, either version 3 of the License, or
## (at your option) any later version.
## 
## This program is distributed in the hope that it will be useful, but
## WITHOUT ANY WARRANTY; without even the implied warranty of
## MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE.  See the
## GNU General Public License for more details.
## 
## You should have received a copy of the GNU General Public License
## along with this program.  If not, see
## <https://www.gnu.org/licenses/>.

## -*- texinfo -*- 
## @deftypefn {} {@var{vp_z}, @var{vp_val} =} pcolor_background (@var{y_ax}, @var{high}, @var{low}, @var{vol}, @var{tick_size})
##
## @seealso{}
## @end deftypefn

## Author: dekalog 
## Created: 2020-05-13

function [ vp_z , vp_val ] = pcolor_background ( y_ax , high , low , vol , tick_size )

vp_z = zeros( 1 , numel( y_ax ) ) ; ##tpo_z = vp_z ;
vol( vol <= 1 ) = 2 ; ## no single point vol distributions
vp_val = zeros( 2 , 1 ) ;

 for ii = 1 : numel( high )

 ## the volume profile, vp_z
 ticks = norminv( linspace(0,1,vol(ii)+2) , (high(ii) + low(ii))/2 , (high(ii) - low(ii))*0.25 ) ;
 ticks = floor( ticks( 2 : end - 1 ) ./ tick_size .+ 0.5 ) .* tick_size ;
 unique_ticks = unique( ticks ) ;

  if ( numel( unique_ticks ) > 1 )
  [ N , X ] = hist( ticks , unique( ticks ) ) ;
  [ ~ , N_ix ] = max( N ) ; tick_ix = X( N_ix ) ;
  [ ~ , centre_tick ] = min( abs( y_ax .- tick_ix ) ) ;
  vp_z(1,centre_tick-N_ix+1:centre_tick+(numel(N)-N_ix)) = vp_z(1,centre_tick-N_ix+1:centre_tick+(numel(N)-N_ix)).+ N ;
  elseif ( numel( unique_ticks ) == 1 )
  [ ~ , centre_tick ] = min( abs( y_ax .- unique_ticks ) ) ;
  vp_z( 1 , centre_tick ) = vp_z( 1 , centre_tick ) + vol( ii ) ;
  endif

 endfor

[ ~ , vp_val_centre_ix ] = max( vp_z ) ;
sum_vp_cutoff = 0.7 * sum( vp_z ) ;
count = 1 ;

while ( count ~= 0 )
 
 sum_vp_z = sum( vp_z( max( vp_val_centre_ix - count , 1 ) : min( vp_val_centre_ix + count , numel( vp_z ) ) ) ) ;
 if ( sum_vp_z >= sum_vp_cutoff )
  vp_val( 1 , 1 ) = y_ax( max( vp_val_centre_ix - count , 1 ) ) ;             ## lower
  vp_val( 2 , 1 ) = y_ax( min( vp_val_centre_ix + count , numel( vp_z ) ) ) ; ## upper
  count = 0 ;
 else
  count = count + 1 ;
 endif

 endwhile

endfunction
and this
function hhh=vline(x,in1,in2)
% function h=vline(x, linetype, label)
% 
% Draws a vertical line on the current axes at the location specified by 'x'.  Optional arguments are
% 'linetype' (default is 'r:') and 'label', which applies a text label to the graph near the line.  The
% label appears in the same color as the line.
%
% The line is held on the current axes, and after plotting the line, the function returns the axes to
% its prior hold state.
%
% The HandleVisibility property of the line object is set to "off", so not only does it not appear on
% legends, but it is not findable by using findobj.  Specifying an output argument causes the function to
% return a handle to the line, so it can be manipulated or deleted.  Also, the HandleVisibility can be 
% overridden by setting the root's ShowHiddenHandles property to on.
%
% h = vline(42,'g','The Answer')
%
% returns a handle to a green vertical line on the current axes at x=42, and creates a text object on
% the current axes, close to the line, which reads "The Answer".
%
% vline also supports vector inputs to draw multiple lines at once.  For example,
%
% vline([4 8 12],{'g','r','b'},{'l1','lab2','LABELC'})
%
% draws three lines with the appropriate labels and colors.
% 
% By Brandon Kuczenski for Kensington Labs.
% brandon_kuczenski@kensingtonlabs.com
% 8 November 2001
if length(x)>1  % vector input
    for I=1:length(x)
        switch nargin
        case 1
            linetype='r:';
            label='';
        case 2
            if ~iscell(in1)
                in1={in1};
            end
            if I>length(in1)
                linetype=in1{end};
            else
                linetype=in1{I};
            end
            label='';
        case 3
            if ~iscell(in1)
                in1={in1};
            end
            if ~iscell(in2)
                in2={in2};
            end
            if I>length(in1)
                linetype=in1{end};
            else
                linetype=in1{I};
            end
            if I>length(in2)
                label=in2{end};
            else
                label=in2{I};
            end
        end
        h(I)=vline(x(I),linetype,label);
    end
else
    switch nargin
    case 1
        linetype='r:';
        label='';
    case 2
        linetype=in1;
        label='';
    case 3
        linetype=in1;
        label=in2;
    end
    
    
    
    g=ishold(gca);
    hold on
    y=get(gca,'ylim');
    h=plot([x x],y,linetype);
    if length(label)
        xx=get(gca,'xlim');
        xrange=xx(2)-xx(1);
        xunit=(x-xx(1))/xrange;
        if xunit<0.8
            text(x+0.01*xrange,y(1)+0.1*(y(2)-y(1)),label,'color',get(h,'color'))
        else
            text(x-0.05*xrange,y(1)+0.1*(y(2)-y(1)),label,'color',get(h,'color'))
        end
    end
    if g==0
        hold off
    end
    set(h,'tag','vline','handlevisibility','off')
end

if nargout
    hhh=h;
end
and produces charts such as this,
which is a 10 minute OHLC chart of the last 3 days, including "today's" ongoing price action. The number of days is a function input, and the horizontal blue and red lines indicate the lower and upper extremes of the value area. The vertical green lines indicate the London opening bar (7am BST) and each set of levels ends at the New York closing bar (5pm EST).

Further examples are last 10 days
and last month
Enjoy!


Monday, 18 May 2020

A Volume Profile With Levels Chart

Just a quick post to illustrate the latest of my ongoing chart iterations. It combines a levels chart, as I have recently been posting about, with a refined methodology for creating the horizontal histograms, so as to more clearly represent the volumes over distinct periods.
The main change is to replace the use of the Octave barh function with the fill function. A minimal working example of this plotting is given in the code box below.
## get price data of *_ohlc_10m
unix_command = [ "wc" , " " , "-l" , " " , [ price_name , '_ohlc_10m' ] ] ;
## the 'wc' command with the '-l' flag counts the number of lines in the [ price_name , '_ohlc_10m' ] file
[ ~ , system_out ] = system( unix_command ) ;
cstr = strsplit( system_out , " " ) ; 
lines_in_file = str2double( cstr( 1 , 1 ) ) ;

## read *_ohlc_10m file
price_data = dlmread( [ price_name , '_ohlc_10m' ] , ',' , [ lines_in_file - n_bars , 0 , lines_in_file , 21 ] ) ;
open = price_data(:,18) ; high = price_data(:,19) ; low = price_data(:,20) ; close = price_data(:,21) ; vol = price_data(:,22) ;
high_round = floor( high ./ tick_size .+ 0.5 ) .* tick_size ;
low_round = floor( low ./ tick_size .+ 0.5 ) .* tick_size ;
max_tick_range = max( high_round .- low_round ) / tick_size ;

## create y and x axes for chart
y_max = max( high_round ) + max_tick_range * tick_size ;
y_min = min( low_round ) - max_tick_range * tick_size ;
y_ax = ( y_min : tick_size : y_max )' ;
end_x_ax_freespace = 5 ;

all_vp = zeros( 3 , numel( y_ax ) ) ;

all_vp( 1 , : ) = pcolor_background( y_ax , high(1:50) , low(1:50) , vol(1:50) , tick_size ) ;
all_vp( 2 , : ) = pcolor_background( y_ax , high(51:100) , low(51:100) , vol(51:100) , tick_size ) ;
all_vp( 3 , : ) = pcolor_background( y_ax , high(101:150) , low(101:150) , vol(101:150) , tick_size ) ;

vp_z = repmat( sum( all_vp , 1 ) , numel( high ) + end_x_ax_freespace , 1 ) ;

x_ax = ( 1 : 1 : numel( open ) + end_x_ax_freespace )' ;
colormap( 'viridis' ) ; figure( 20 ) ; pcolor( x_ax , y_ax , vp_z' ) ; shading interp ; axis tight ;

## plot the individual volume profiles
hold on ;
scale_factor = 0.18 ; 
fill( all_vp( 1 , : ) .* scale_factor , y_ax' , [99;99;99]./255 ) ; 
fill( all_vp( 2 , : ) .* scale_factor .+ 50 , y_ax' , [99;99;99]./255 ) ;
fill( all_vp( 3 , : ) .* scale_factor .+ 100 , y_ax' , [99;99;99]./255 ) ;

## plot candlesticks
candle_mp( high , low , close , open ) ;
hold off;
I hope readers find this new way of plotting profile charts useful - I certainly am pretty pleased with it.

Saturday, 16 May 2020

A Comparison of Charts

Earlier in May I posted about Market Profile with some charts and video. Further work on this has made me realise that my earlier post should more accurately be described as Volume Profile, so apologies to readers for that.

Another, similar type of chart is one I have seen described as a TPO chart (TPO stands for 'That Price Occurred' or ticked). It is a simple matter to extend the code in the above linked post to create a TPO chart, and below is the Octave function I have written to produce the backgrounds for both types of plot
## Copyright (C) 2020 dekalog
## 
## This program is free software: you can redistribute it and/or modify it
## under the terms of the GNU General Public License as published by
## the Free Software Foundation, either version 3 of the License, or
## (at your option) any later version.
## 
## This program is distributed in the hope that it will be useful, but
## WITHOUT ANY WARRANTY; without even the implied warranty of
## MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE.  See the
## GNU General Public License for more details.
## 
## You should have received a copy of the GNU General Public License
## along with this program.  If not, see
## <https://www.gnu.org/licenses/>.

## -*- texinfo -*- 
## @deftypefn {} {@var{vp_background}, @var{mp_background} =} pcolor_background (@var{y_ax}, @var{high}, @var{low}, @var{vol}, @var{tick_size})
##
## Creates matrices, VP_BACKGROUND and MP_BACKGROUND, to be used by Market Profile
## plotting functions, which need a colour background matrix to be plotted by pcolor.
## 
## @seealso{}
## @end deftypefn

## Author: dekalog 
## Created: 2020-05-13

function [ vp_background , mp_background ] = pcolor_background ( y_ax , high , low , vol , tick_size )

vp_z = zeros( 1 , numel( y_ax ) ) ; mp_z = vp_z ;
vol( vol <= 1 ) = 2 ; ## no single point vol distributions

for ii = 1 : numel( high )

## the volume profile, vp_background
ticks = norminv( linspace(0,1,vol(ii)+2) , (high(ii) + low(ii))/2 , (high(ii) - low(ii))*0.25 ) ;
ticks = floor( ticks( 2 : end - 1 ) ./ tick_size .+ 0.5 ) .* tick_size ;
unique_ticks = unique( ticks ) ;

if ( numel( unique_ticks ) > 1 )
[ N , X ] = hist( ticks , unique( ticks ) ) ;
[ ~ , N_ix ] = max( N ) ; tick_ix = X( N_ix ) ;
[ ~ , centre_tick ] = min( abs( y_ax .- tick_ix ) ) ;
vp_z(1,centre_tick-N_ix+1:centre_tick+(numel(N)-N_ix)) = vp_z(1,centre_tick-N_ix+1:centre_tick+(numel(N)-N_ix)).+ N ;
elseif ( numel( unique_ticks ) == 1 )
[ ~ , centre_tick ] = min( abs( y_ax .- unique_ticks ) ) ;
vp_z( 1 , centre_tick ) = vp_z( 1 , centre_tick ) + vol( ii ) ;
endif

## the market profile, mp_background
[ ~ , ix_high ] = min( abs( y_ax .- high( ii ) ) ) ;
[ ~ , ix_low ] = min( abs( y_ax .- low( ii ) ) ) ;
mp_z( 1 , ix_low : ix_high ) = mp_z( 1 , ix_low : ix_high ) .+ 1 ;

endfor

vp_background = repmat( vp_z , numel( high ) , 1 ) ;
mp_background = repmat( mp_z , numel( high ) , 1 ) ;

endfunction
I have elected to still call the TPO plot a Market Profile plot as, from what I can make out, the tick count of the TPO is intended to be a surrogate for the original, cleared volume of Market Profile.

The above function is intended to provide a matrix input for the pcolor function, which internally scales the matrix to 0-1. Another idea I have had is to multiply the Volume Profile matrix and the Market Profile matrix together to get a normalised Combined Profile matrix. The animated GIF below shows all three.
It can be seen that there are subtle differences between them but that, on the whole, the results are similar.
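In code terms, the Combined Profile background is just an element-wise product of the two normalised backgrounds, something along these lines.

## minimal sketch: a combined background from the volume and market profile matrices
vp_norm = vp_background ./ max( vp_background( : ) ) ;        ## scale each to a 0-1 range
mp_norm = mp_background ./ max( mp_background( : ) ) ;
combined_background = vp_norm .* mp_norm ;                    ## element-wise product for pcolor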

More in due course.