
Algorithms in volatility.

Richard Hills, Global Head of Electronic Services, Societe Generale, explains how algorithms should be built to handle volatile markets.

Volatility is heading back to the levels of 2008/09, when the VIX traded above an average of 48 for eight months. Since June 2009 the VIX has averaged 22, peaking above 30 in six brief episodes. Over the last two months it has averaged 33 and peaked at 48. The Eurostoxx 50 finished down in six of seven weeks in July and August, with drops of 11% and 6% and an average intra-week change of 5%.

It is important to be aware of the factors that contribute to poor performance during high volatility and how you can account for them when you build an algorithm.

Factors to consider

The first factor is that spreads diverge from historical norms. UK spreads doubled from 10 to 20 bps in July versus their six-month average; German spreads doubled and French spreads tripled. Every aggressive order therefore pays more than double the cost of a passive one. Assuming 30% of orders are executed aggressively, you will pay €30,000 more on a basket of €100 million notional.
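To make the arithmetic concrete, here is a minimal sketch of the cost calculation implied above, assuming the aggressive portion of the basket pays the full 10 bps of spread widening; the function name and parameters are illustrative, not from the article.

```python
def extra_spread_cost(notional, aggressive_ratio, old_spread_bps, new_spread_bps):
    """Extra cost (in currency units) of paying a widened spread on the
    aggressive portion of an order. Illustrative assumption: the aggressive
    side bears the full spread widening."""
    aggressive_notional = notional * aggressive_ratio
    widening = (new_spread_bps - old_spread_bps) / 10_000  # bps -> fraction
    return aggressive_notional * widening

# UK spreads doubling from 10 to 20 bps, 30% aggressive, EUR 100m basket:
cost = extra_spread_cost(100_000_000, 0.30, 10, 20)
print(f"EUR {cost:,.0f}")  # EUR 30,000
```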

This highlights the need to be careful when the algorithm crosses the spread and when it quotes.

This leads me to the second factor, which is that volume profiles do not fit their historical norms. This is important for algorithms that use historical volume curves or dynamic order book volumes. If there is an unanticipated surge in volume, the algorithm may assume it is falling behind schedule and try to catch up by aggressing the opposite side of the order book, with the cost that implies.
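A volume-tracking algorithm can avoid that trap by capping how hard it chases a surge. The sketch below is an illustrative scheme, not SG CIB's actual logic; the 20% catch-up cap and all names are assumptions.

```python
def catch_up_slice(order_qty, done_qty, hist_pct_complete, realised_pct_complete,
                   max_catch_up=0.20):
    """Quantity to execute aggressively to stay on schedule.

    hist_pct_complete:     fraction of the day's volume the historical curve
                           expects to have printed by now
    realised_pct_complete: fraction actually traded so far today
    max_catch_up:          cap (as a fraction of the order) on how far a
                           volume surge can pull the schedule forward
    """
    # Follow realised volume, but discount a surge beyond the historical
    # curve instead of chasing it one-for-one across the spread.
    target_pct = min(realised_pct_complete, hist_pct_complete + max_catch_up)
    deficit = max(0.0, order_qty * target_pct - done_qty)
    return min(deficit, order_qty * max_catch_up)

# A surge (60% of the day traded vs. 35% expected) is not chased in full:
print(catch_up_slice(100_000, 30_000, 0.35, 0.60))  # 20000.0, not 30000.0
```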

The third factor to consider is the potential "clean up" cost of a passive order if there is an adverse price movement whilst the algorithm is quoting. The algorithm has to catch up on volume at less favourable market prices, or remain passive, hope that the market will revert, and risk residuals.

The fourth issue is that the frequency of changes to the best bid-offer price increases, which also shortens the queue at the touch and reduces the probability of a quote being lifted. The result is a thinning of the average touch quantity versus its historical norms, and the algorithm must be capable of adjusting its order sizes and re-posting tempo accordingly.
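One simple way to adjust order sizes to a thinning touch is to scale child orders by the ratio of current to historical touch quantity, capped at a fraction of the visible touch. This is an illustrative sketch under assumed parameters (the 50% participation-of-touch cap is not from the article):

```python
def passive_order_size(desired_qty, current_touch_qty, avg_touch_qty,
                       max_fraction_of_touch=0.5):
    """Shrink passive child orders when the touch is thin versus its
    historical norm, so quotes keep a realistic chance of being filled."""
    thinning = min(1.0, current_touch_qty / avg_touch_qty)  # e.g. 0.25 if touch is a quarter of normal
    cap = current_touch_qty * max_fraction_of_touch         # never dominate the visible touch
    return min(desired_qty * thinning, cap)

# Touch has thinned to 200 shares vs. an 800-share norm:
print(passive_order_size(1_000, 200, 800))  # 100.0
```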

Compounding this factor is increased fragmentation across venues. Smart routers have to work harder to re-post order price and quantity across multiple order books. There is a wide disparity between the performance of the market's various smart routers, and arbitrage opportunities therefore open up between venues.

These five factors, among many others, are always present; the challenge is that they are accentuated in conditions of high volatility.

Solving the problem

So, how does SG CIB build algorithms to cater for these conditions? We employ statistical arbitrage techniques in our client service algorithms that use fair value calculations to determine whether a share price is developing favourable or adverse short-term momentum.

The calculations are derived from indicators that look at relative movements of share prices versus their sector, index, future, and other factors, such as the balance of the market between buy and sell orders. This "relative valuation" technique is inherited from our statistical arbitrage businesses, which have been successfully building algorithms to take advantage of volatile markets for over a decade, and has been embedded in our client service algorithms.

Using fair value, the algorithm takes a macro view of price and adjusts participation levels according to price direction and client risk appetite, thus shielding the order from adverse effects. If a stock is trading at fair value relative to the indicators, the algorithm participates at a standard level. However, if the stock is favourably mis-priced, the algorithm increases participation. It is intelligent about paying or capturing the spread, and about whether to ignore short-term widening of the spread or decreasing tick frequency. Our 'relative value' algorithm, available to clients in our standard suite of algorithms, emphasises these techniques.
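The fair-value participation idea might be sketched as follows. The indicator blend, weights, and thresholds are illustrative assumptions, not SG CIB's model: fair value is proxied here by drifting the opening price with a weighted mix of sector and index returns, and participation steps up when the mis-pricing favours the order's side.

```python
def fair_value(price_at_open, sector_return, index_return,
               w_sector=0.6, w_index=0.4):
    """Fair value implied by peers: the open price drifted by a weighted
    blend of sector and index returns (weights are assumed)."""
    implied_return = w_sector * sector_return + w_index * index_return
    return price_at_open * (1 + implied_return)

def participation_rate(side, price, fv, base_rate=0.10, boost=0.05, band=0.002):
    """Increase participation when the stock is favourably mis-priced:
    cheap versus fair value for a buy, rich for a sell."""
    mispricing = (fv - price) / fv  # > 0 means the stock looks cheap
    if side == "buy" and mispricing > band:
        return base_rate + boost    # buy faster while it is cheap
    if side == "sell" and mispricing < -band:
        return base_rate + boost    # sell faster while it is rich
    return base_rate                # at fair value: standard participation

# Peers are down ~1.4% but the stock is down 3%: favourably cheap for a buyer,
# so participation steps up from the base rate.
fv = fair_value(100.0, -0.01, -0.02)
rate = participation_rate("buy", 97.0, fv)
```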
COPYRIGHT 2011 Asset International, Inc.
No portion of this article can be reproduced without the express written permission from the copyright holder.

Publication: The Trade
Date: Sep 1, 2011
