Improving Order Execution in FX – Rethinking TCA

There has been a marked uptick in the use of TCA (Transaction Cost Analysis) services within the FX market over the last few years, doubtless significantly inspired by the FX Global Code and MiFID II. Perhaps for that reason, the use of TCA in FX has been suggested in some quarters to be a box-ticking exercise (although this charge is levelled at TCA in other asset classes too). This is unfortunate, and maybe it’s time for a name change. The very name ‘TCA’ suggests an after-the-fact analysis which may never see the light of day again. Implicit here is the need for ‘pre-trade TCA’: helping traders intelligently decide the most appropriate execution algo for the required objective (how to execute) and the most appropriate venue (where to execute). Note that we use ‘execution algo’ here in the broadest sense, covering the method of order execution in general, whether using market, limit and other order types or true execution algos such as TWAP. We use ‘venue’ to include ECNs, banks and non-bank liquidity. Whatever the requirements are for regulatory, or indeed client, reporting, surely the most important aspect of TCA (or ‘execution analysis’) is the use of post-trade analytics to inform and improve current and future (pre-trade) order execution.

Optimising execution quality means performing ongoing execution analysis in real time so that traders can make informed decisions for subsequent orders. The key point here is that market characteristics (e.g. volume, spreads and volatility) change continually. Measurement of execution quality should therefore be continuous and part of the order-execution workflow, so that decisions on algo and venue selection are based on current market conditions and the extent to which they deviate from historical market conditions for similar orders.

The essential component for such ongoing analysis and decision-making is faithfully recording, in real time, the full depth of the order book for each liquidity venue to which the firm is connected. With multiple liquidity providers (LPs), this is the proverbial “drinking from the firehose”: the trading system must be able to ingest quote updates and trades at rates measured in hundreds of thousands of messages per second. A key aspect here is that the quote recording system is plugged into the trading firm’s production trading infrastructure. As such, the latencies implicit in the firm’s particular set-up are baked into the historical time-series thus recorded, and it is more straightforward to intertwine orders, executions and the state of the order book at the time of each order and execution.
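In code, such a recorder reduces to timestamping each full-depth update on receipt and appending it to a per-venue time-series. A minimal in-memory Python sketch (venue and symbol names are illustrative; a production system would persist to a tick database rather than a dict):

```python
import time
from collections import defaultdict
from dataclasses import dataclass, field

@dataclass
class BookSnapshot:
    venue: str
    symbol: str
    bids: list   # [(price, size), ...], best first
    asks: list   # [(price, size), ...], best first
    recv_ns: int = field(default_factory=time.monotonic_ns)

class DepthRecorder:
    """Append-only, per-(venue, symbol) time-series of full-depth quotes.
    Timestamping on receipt bakes the firm's own latencies into the series."""
    def __init__(self):
        self.series = defaultdict(list)

    def on_quote(self, venue, symbol, bids, asks):
        self.series[(venue, symbol)].append(BookSnapshot(venue, symbol, bids, asks))

rec = DepthRecorder()
rec.on_quote("LP1", "EURUSD", [(1.0850, 1_000_000)], [(1.0852, 1_000_000)])
rec.on_quote("LP2", "EURUSD", [(1.0851, 500_000)], [(1.0853, 2_000_000)])
```

Because each snapshot carries its own receive timestamp, orders and executions can later be joined against the book state prevailing at the moment they were placed.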

Once this recording system is part of the trading infrastructure, a time-series of trading-firm-specific quotes and trades is automatically created and maintained. Again, by being part of the production trading infrastructure, the maintenance of this time-series receives the care and attention befitting such a valuable resource. This systematically maintained time-series database (or better, ‘knowledge base’) is the cornerstone of a systematic approach to improving FX execution quality through appropriate algo and venue selection.

Of course, the lack of a central tape or official close prices, as exist in the equities and futures markets, raises the question of the relevant benchmark for measuring FX execution quality. But when a firm is using TCA as part of a solution to improve execution quality (rather than box-ticking), the benchmark can and should be specific to the objectives of that firm’s FX trading (e.g. hedging versus intra-day trading).

Because of the fragmented nature of liquidity in the FX market, improving FX order execution is not just about how to execute (i.e. which execution algo to use). FX execution analysis has to operate across venues, so where to execute is also critical. This is preferably undertaken in real time by a smart order routing (SOR) algo. SOR algos typically determine venue selection by looking at the best bid or offer simultaneously provided by each of the connected LPs. Depending on the time of day, currency pair, order size, required aggressiveness and so on, the SOR algo may send child orders to multiple LPs and may use multiple levels of liquidity. An intelligent SOR algo also accounts for historical fill ratios and rejections: even if a venue offers the current best price, it may be riskier to route flow there if that venue historically has a high rejection rate. Access to the time-series of market data, orders and actual executions is therefore essential to SOR operation, in order to calculate fill and rejection profiles for each liquidity venue. The calibration of the SOR algo should be continually evaluated by back-testing candidate SOR algos (including different parameterisations of the ‘same’ algo) against the firm’s own knowledge base of market data, orders and executions.
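The fill-ratio adjustment described above can be sketched as a scoring rule: route not to the venue with the best raw price, but to the one with the lowest quoted cost after adding the expected penalty of a rejection. The venue names, costs and penalty parameter below are illustrative, not from any production SOR:

```python
def expected_cost_bps(quoted_cost_bps, fill_ratio, requote_penalty_bps):
    """Quoted cost plus the expected extra cost of a rejection forcing
    a re-route at a worse price (all in basis points)."""
    return quoted_cost_bps + (1.0 - fill_ratio) * requote_penalty_bps

def pick_venue(quotes, fill_ratios, requote_penalty_bps=0.8):
    # quotes: venue -> cost vs mid in bps; fill_ratios: venue -> historical fill rate
    return min(quotes, key=lambda v: expected_cost_bps(
        quotes[v], fill_ratios.get(v, 0.0), requote_penalty_bps))

quotes = {"LP_A": 0.10, "LP_B": 0.25}   # LP_A shows the better price...
fills = {"LP_A": 0.55, "LP_B": 0.97}    # ...but historically fills only 55% of orders
best = pick_venue(quotes, fills)        # -> "LP_B": 0.274 bps expected vs 0.46 for LP_A
```

The fill and rejection profiles feeding `fill_ratios` would be computed, per venue, from the recorded time-series of orders and executions described earlier.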

In summary, we need to combine traditional FX TCA with liquidity analysis across multiple venues and pre-trade analytics on current and historical data, giving traders relevant and current analytics to optimise execution quality for the next order.

This blog post by Stuart Farr originally appeared on the Best Execution website.

Using Deltix for Trading Cryptocurrencies and Bitcoin Futures

Bitcoin: New Asset Class or Latest Bubble?

The surge in both the price of and interest in trading cryptocurrencies, including bitcoin, has been one of the main stories in the financial markets during 2017. Historians may point to the launch of bitcoin futures by the CME and CBOE exchanges as the turning point in the acceptance of cryptocurrencies as “institutional”, enabling trading firms, in addition to trading “spot crypto”, to gain exposure (long and short) to cryptocurrencies (specifically bitcoin) within the safety of regulated futures exchanges.

Far be it from us to predict whether cryptocurrencies are a new asset class or a bubble, but the native architecture of the Deltix Product Suite has made it straightforward for us to provide research and trading capabilities for cryptocurrencies.

This means that Deltix clients can now bring best-in-class institutional quantitative analysis and trading (manual and automated) capabilities to cryptocurrencies and their new derivatives (futures).

Crypto Connectivity

Deltix has built data and trading adapters for several bitcoin venues, such as GDAX, Bitfinex and Gemini. This is an ongoing process, with new venues being added weekly. Because these venues are so new, many do not have the technological sophistication that institutional traders enjoy at traditional stock and futures exchanges and forex venues.

Deltix aggregates raw order book data from these cryptocurrency venues and has “normalized” connectivity to these venues. As such, users interact directly with the Deltix software (whether for manual or automated trading) rather than navigate the still-evolving technology of crypto trading venues.

In addition to streaming real-time data from each connected venue, crypto tick data is also stored in TimeBase thereby creating an archive of historical data for subsequent analysis. By providing this capability for multiple venues simultaneously, differences in quotes between venues can be analyzed to better understand market structure and enable institutional grade alpha generation and execution management.
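With simultaneous books from several venues recorded in one place, computing the consolidated best bid/offer (and flagging crossed markets between venues) is a one-liner per side. A sketch with illustrative venue names and prices:

```python
def consolidated_inside(books):
    """books: venue -> (best_bid, best_ask).
    Returns the cross-venue inside market and whether venues are crossed."""
    bid_venue, (best_bid, _) = max(books.items(), key=lambda kv: kv[1][0])
    ask_venue, (_, best_ask) = min(books.items(), key=lambda kv: kv[1][1])
    return {"best_bid": (bid_venue, best_bid),
            "best_ask": (ask_venue, best_ask),
            "crossed": best_bid > best_ask}

books = {"GDAX": (16500.0, 16502.0),
         "Bitfinex": (16503.0, 16505.0),
         "Gemini": (16498.0, 16501.0)}
top = consolidated_inside(books)  # Bitfinex's bid crosses Gemini's offer
```

Persistent cross-venue differences of this kind are exactly what the historical archive makes it possible to measure and act upon.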

Trading Cryptocurrencies and Bitcoin Futures

For trading, the full capabilities of Deltix StrategyServer and ExecutionServer are available, including trading risk checks, position keeping, profit-and-loss tracking and resilient connectivity.

Consistent with Deltix capabilities in other asset classes, users can trade manually via our TradingConsole or deploy automated trading strategies developed and back-tested in QuantOffice. The Deltix automated trading platform delivers latencies measured in microseconds. Users wishing to bring their own analytics to bear can take advantage of Deltix’s extensive APIs for C#, C++, Java and Python.

For more information on how you can use the Deltix platform for trading cryptocurrencies and bitcoin futures, please contact us.

 

Update: Generating Alpha with Earnings Date Revisions

Deltix presents an update to research using data from Wall Street Horizon

Research Update

This study is an update of the research using data from Wall Street Horizon that we first published on our blog in November 2015. At that time, we had data from January 2006 to September 2015. Now we have data to March 2017. We continue to be impressed by the stability of the returns.

We showed our approach for this study in our webinar on June 8th as an example of alpha research on our new DeltixQuantHub platform.

Introduction

Company earnings are the bedrock of financial analysis and investment. Sell-side, buy-side and independent research analysts perform quantitative and qualitative analysis of companies, their peers and their markets in order to provide guidance for short-term earnings and earnings growth for in-house use or for clients. Innovations in earnings analysis over the last few years have included crowd-sourced earnings estimates (e.g. Estimize) and sentiment derived from the news and social media (e.g. RavenPack, Social Market Analytics). The overriding objective of company analysis has been and still is to forecast as accurately as possible a company’s future earnings and so guide asset allocation and trading decisions.

In this study, we looked at whether earnings announcement date revisions can be used for predicting future prices in a manner that could be profitably traded upon.

We reviewed the research papers of Joshua Livnat (http://www.wallstreethorizon.com/livnat) and Eric So (http://www.wallstreethorizon.com/So), both of whom look at whether changes in the earnings announcement dates can be used to generate returns.

Research Methodology

We conducted our study using our own research software, TimeBase and QuantOffice. In the webinar, we showed how this and other research can be back-tested in DeltixQuantHub, with full control over the parameters used in the study. Below are the steps we followed for this research:

  1. We populated TimeBase with WSH daily snapshots of company future earnings announcement dates for S&P500 stocks for the period January 3, 2006 to March 31, 2017.
  2. Corresponding with this time period, we also populated TimeBase with market data for those stocks. In production deployment of Deltix software, a time series of tick data is automatically recorded from real-time streaming market data. Whilst we used one-minute bar data for our research in this study, recording streaming tick data allows Deltix users to use any periodicity of data for their research.
  3. We now had a base data set with 11 years of data on which to apply and test our ideas. Quant researchers use Deltix to express their model ideas as “strategies” in QuantOffice. Now, with DeltixQuantHub, researchers will be able to publish their research to others and allow real interaction with the strategy in back-test.
  4. In the studies by Joshua Livnat and Eric So referenced above, both found that companies that advance their earnings dates generally outperform companies that delay their earnings dates. So we started with this premise and then developed the theme with advanced statistical techniques implemented in QuantOffice.
  5. The resulting model was back-tested, modified and back-tested iteratively. Again, in QuantHub, users will be able to run their own back-tests and test different parameter values.
  6. Where the earlier studies from Livnat and So modeled a holding period spanning from shortly after the change in date to the actual announcement, we took positions the day before an earnings announcement and sold them the day after, resulting in a holding period of less than 24 hours.
  7. In order to isolate the calendar date effect from any general market effect (although our holding period was less than a day), we also implemented a dollar-neutral version of the strategy. This was a simple extension to the trading strategy the effect of which can be isolated by a simple parameter.

Results

Our results supported the findings of the previous researchers. Specifically, we found:

  • The most likely positive returns occurred when the earnings announcement date was advanced (i.e. brought forward) in the second half of the quarter.
  • Conversely, the most probable negative returns occurred when the earnings announcement date was delayed in the first half of the quarter.

For both hedged and un-hedged versions of the strategy, for the period January 2006 to March 2017, the back-tested strategies showed Sharpe Ratios of 1.96 (unhedged) and 1.99 (hedged) with average profit per share of 10 cents and 8 cents respectively. By comparison, in the prior study covering the period from January 2006 to September 2015, the back-tested strategies showed Sharpe Ratios of 2.08 (unhedged) and 2.12 (hedged). Average profit per share remained unchanged.
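For readers reproducing such figures, a Sharpe ratio is conventionally computed from a back-test's per-period return series; a minimal sketch, annualized from daily returns under the common assumptions of 252 trading days and a zero risk-free rate (the return series below is made up):

```python
from statistics import mean, stdev

def annualized_sharpe(daily_returns, periods_per_year=252):
    # Mean over standard deviation of per-period returns, scaled by sqrt(periods)
    return mean(daily_returns) / stdev(daily_returns) * periods_per_year ** 0.5

rets = [0.0010, -0.0004, 0.0007, 0.0002, -0.0001, 0.0005]
sr = annualized_sharpe(rets)   # positive: the made-up series trends up
```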

As such, we continue to conclude that there are profitable opportunities from trading with signals derived from WSH earnings date announcement data.

The full results are included in our research paper.

You can also review the prior study.

DeltixQuantHub

In our June 8th webinar, we demonstrated how our new DeltixQuantHub platform allows users to analyze research strategies, including changing parameter values and running an array of back-tests. In this way, non-technical users can interact with the trading strategy directly.

We are very excited by DeltixQuantHub as a means of connecting strategy designers, researchers and potential traders/portfolio managers who may or may not be in the same organization. You can register to view the webinar recording here.

To get more information about the products used to conduct this research, follow the links below:

TimeBase
QuantOffice
QuantServer

Deltix’s 2017 Mission and Roadmap

Our mission this year is driven by two observations:

Our Alpha Generation Software Has Reached Maturity

Observation 1: Our core product suite for the buy-side (QuantOffice and QuantServer) has reached a state of functional maturity.

The core functionality of QuantOffice and QuantServer has been in use for over 10 years and battle-tested by more than 200 clients. During the last few years, the number of requests for new features and enhancements from our clients has gradually declined. We are happy and proud to acknowledge that, based on our communications with clients and prospects, the latest release of QuantOffice and QuantServer provides a mature balance of power and flexibility such that practically any functional requirement of intelligent trading can be addressed with the current capabilities.

Mission Accomplished

Of course, we are continuing to maintain and improve the product. In addition, there are always new trading venues and data providers that we continue to add. But we’re proud to announce that we’ve accomplished our mission to offer a solution for the ongoing process of:

  • Developing and testing new alpha generation strategies
  • Production deployment of these strategies
  • Constantly refining and optimizing existing trading strategies.

Increasing Demand for Optimizing Order Executions

Observation 2: We are witnessing increasing demand, from both buy-side and sell-side clients, for tools to design, back-test, optimize and deploy in production advanced order execution algorithms. We also see demand for advanced execution analytics and reporting.

Improving Execution in Forex and Futures

Execution algos have long been deployed in equities trading. Take-up of algo execution in equities, particularly smart order routing, really picked up after the Reg NMS (US) and MiFID (Europe) regulations of some ten years ago. These regulations created market fragmentation and obligations for best execution (soon to be strengthened in Europe by MiFID II). Credit Suisse led the way in introducing the buy-side to various types of algos.

In the absence of these regulatory drivers, algo execution in the futures and forex markets lagged. Starting maybe two years ago, driven by the explosion in execution venues (forex) and generally poor trading returns (futures), algo execution has increasingly been adopted in forex and futures trading.

Today, whilst there are important differences between asset classes, buy-side traders (and hence the sell-side service providers) across equities, futures and forex are looking for assistance in improving execution quality, for executions completed both with algos and without.

Continuous Improvement in Execution Quality

Improvement is typically based on measuring execution quality (traditionally called Transaction Cost Analysis, or TCA). However, more important than ex-post measurement is how to continuously improve execution. In essence, the task at hand is to choose the best method of execution for the particular purpose (and, in the case of equities and forex, the best location). This may or may not require an execution algo, but it does require a continuous, deterministic approach to execution. Based on our discussions with our buy-side and sell-side clients, determining the best execution method is of fundamental importance because of the financial rewards of getting it right.

We recently discussed forex order execution in an article in e-Forex magazine. This article goes into detail about how data recording and advanced analytics can be used to collect and constantly evaluate time series data with all the layers of historical orders, executions and order books from multiple liquidity providers. This approach provides the necessary granularity and a level of precision in execution decisions not available in traditional trading systems. Click here to download the article.

Building Advanced Tools to Improve Order Execution

As a result of these observations, our focus in 2017 is to provide buy-side and sell-side clients with the most advanced tools to improve order execution. We view this as fundamentally an opportunity to deploy computer science with a clear, measurable objective of decreasing execution costs. To address the demands of large-scale order execution optimization, we are rolling out new technology for massive, cloud-based, distributed back-testing which will enable practically unlimited throughputs measured in billions of messages per second. To address the requirements for ultra-low latency and high availability, we are developing the next generation of our Execution Server technology.

Advanced Analytics Demands Highly Skilled Computer Scientists

Successful systematic alpha generation and order execution is not easy. It requires precision. Implementing precision in the context of massive, fast-moving data sets requires highly advanced computer science, which in turn requires highly skilled computer scientists. Deltix has them: we hired an additional 15 in 2016, bringing our headcount to 65, over 50% of which is in R&D.

A Recent Market Development Enabling Advanced Analytics

The CME’s recent launch of “Market by Order” (MBO), so-called Level 3, data is a very exciting development which will allow for the ultimate in precision of order execution. Leveraging such data will require software of the highest order, and we are looking forward to deploying our solution to utilize MBO data with early adopters.

We will continue to demonstrate improvements in execution, as well as alpha generation ideas, in our own research which we publish on this blog.

To an exciting and successful 2017.

Optimizing Order Execution Using Advanced Execution Analysis

Examining Execution Transaction Costs in LME Metals

Transaction Cost Analysis (TCA) has traditionally been used to examine costs of order executions between different brokers, demonstrate best execution and provide other compliance-based functions. More recent applications of TCA involve forensic analysis of executions. This deeper analysis helps firms improve execution quality in the context of a specific trading strategy. To reflect the deeper analysis and internal focus, we call such studies Advanced Execution Analysis (AXA).

Research Methodology

Over the summer, we embarked on a study with our client, the London-based broker, Marex Spectron. The purpose of the study was to examine different methods of executing orders of six metals on the London Metal Exchange (LME) using an Advanced Execution Analysis approach.

Using Arrival Price (mid-price between bid & offer) as the benchmark, the study looked at whether increasing passivity can reduce transaction costs versus a market order. In addition to market orders, we tested variations in passivity via four different types of limit orders:

  • Limit-Primary: Places a limit order at the primary price level (bid for buy orders, offer for sell orders). If this order is not filled after 10 seconds, it is replaced with a market order. We use a FIFO methodology for the order queue.
  • Limit-MidPrice: Places a limit order at the mid-price of the bid and offer, rounding down for buys and up for sells. We assume that our order is the first at that price level. Again, if the order is not filled after 10 seconds, the limit order is replaced with a market order.
  • Pegged-Primary: Places a limit order at the primary price level and replaces it on bid/offer updates. If the order is not filled over various time scales (from 1 second to 1 hour), it is replaced by a market order.
  • Pegged-MidPrice: Places a limit order at the mid-price and replaces it on bid/offer updates. Again, if the order is not filled over various time scales (from 1 second to 1 hour), it is replaced by a market order.

Marex Spectron provided LME tick data for the six metals for the period March 1 to May 13, 2016. This data, together with corresponding market depth (order book) data and volume profile data, was loaded into Deltix TimeBase. The different execution methods described above were implemented in Deltix QuantOffice and back-tested against the market depth data. The following statistics were computed:

  • Transaction Cost (i.e. the difference between the order execution price and the Arrival Price)
  • Standard Deviation of Transaction Costs
  • Average Time to Fill
  • Limit Order Fill %

Results

Defining risk as the standard deviation of transaction costs, we found that:

  • Across all evaluated strategies, the Market execution method has the highest expected transaction cost and the lowest risk associated with it.
  • When using peg intervals of short duration (up to 30-60 seconds depending on the market), the Pegged-MidPrice execution method provides both lower expected transaction costs and risk compared to the Pegged-Primary execution method.
  • Only Pegged-Primary (and not Pegged-MidPrice) demonstrates steady improvement of the transaction cost when using peg intervals of longer duration (above 60 seconds). However, this improvement comes at the cost of additional risk.

An example of results displayed graphically is shown below:

Aluminum Scatter Plot (Source: Marex, Deltix, LME)

The full study is available here.

Practical Considerations

As with any research, the results of one study are useful in providing general direction. However, the results would be significantly more useful if the research were repeated on an ongoing basis. This allows researchers to use market and trade data from different time periods, which will hopefully validate (or possibly refute) their results. At a minimum, ongoing research helps refine their strategies.

For example, we are continuing this research by introducing price prediction heuristics into the pegged order execution methods. The goal is to reduce the risk of the Pegged-Primary method so we can capture the benefit of reduced transaction cost with less downside risk (i.e. standard deviation of transaction cost).

However, any price prediction technique is subject to the danger of curve-fitting. The on-going research approach described above extends the number and scope of out-of-sample testing periods to provide more reliable results.

Ideally, executions are analyzed in real-time. This way, researchers can analyze deviation of transaction costs versus benchmarks in the context of a rolling historical window (say from three months ago to real-time). With such information, traders can immediately identify divergences and execution anomalies compared to the performance of recent executions. This allows them to make necessary adjustments to ongoing executions or pending orders.
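Such a real-time divergence check amounts to comparing each new execution's cost against the rolling distribution of recent comparable executions. A minimal sketch; the window size, warm-up length, threshold and cost figures are all illustrative:

```python
from collections import deque
from statistics import mean, stdev

class CostMonitor:
    """Flags executions whose cost vs benchmark is an outlier relative to
    a rolling window of recent executions (simple z-score test)."""
    def __init__(self, window=500, z_threshold=3.0, warmup=30):
        self.costs = deque(maxlen=window)
        self.z_threshold = z_threshold
        self.warmup = warmup

    def observe(self, cost_bps):
        anomaly = False
        if len(self.costs) >= self.warmup:
            mu, sigma = mean(self.costs), stdev(self.costs)
            if sigma > 0:
                anomaly = abs(cost_bps - mu) / sigma > self.z_threshold
        self.costs.append(cost_bps)
        return anomaly

mon = CostMonitor()
routine = [mon.observe(c) for c in [0.5, 0.6, 0.4] * 20]  # 60 normal executions
alert = mon.observe(5.0)                                  # a 5 bps outlier
```

In practice, the comparison set would be conditioned on order characteristics (size, instrument, time of day) rather than pooled as here.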

As we have discussed before, the holy grail is having fully adaptive execution algos which change their behaviour in real-time in response to real-time market data and actual performance. But the first step is to move from traditional TCA to ongoing detailed research and analysis of executions, so we can incorporate those findings into current trading decisions.

You can download this research study here.

More Insight on How To Do Execution Analysis

This research study is focused on advanced execution analysis rather than alpha generation. We’ll be publishing results of more execution research in 2017, so stay tuned.

In addition, over the summer, we published a couple of articles about approaches for doing advanced execution analysis. Here is a brief blog post discussing the advantages of recording your own market data for execution analysis. Here’s an article from Stuart Farr on DIY Execution Analysis published by CTA Intelligence.

If our research on signal generation is more relevant, you might find these research studies useful:

If you’d like to learn more about the platform used to conduct this research, visit our website or contact us.