High Frequency Frisbee Throwing

In his challenging speech to the US Federal Reserve's recent Jackson Hole conference, Andrew Haldane, Executive Director for Financial Stability at the Bank of England, publicly questioned the trend towards increasingly complex regulation. Haldane argued instead that complex environments and tasks – such as catching a Frisbee, or a financial crisis – may be better managed through simpler rules.

Haldane’s hypothesis boils down to the idea that the complexity of the modern financial regulatory system exposes regulators to cognitive stress and data overload, reducing the quality of their decision-making, whilst also imposing unnecessary regulatory reporting burdens on institutions. As a result, complex regulation benefits neither the regulators nor the regulated – nor the public, who are ultimately affected by regulatory failure. Instead, he argues that simpler regulatory frameworks, better applied by fewer but more experienced regulators, may be the solution to catching crises before they explode.

Although Haldane applied this view in his speech to a discussion of the complex and data-heavy Basel regulatory framework, it may also have a role to play in illuminating possible solutions for other complex regulatory problems. The explosion of algorithmic trading on international and domestic markets in recent years is one such complex problem currently being grappled with by regulators worldwide. Although algorithms have been a feature of trading for many years, their significance has increased in recent years due to the emergence of high frequency trading (HFT), which uses algorithms to implement trading strategies that may involve holding positions on a market for a matter of milliseconds – without any human intervention.

Current estimates suggest that HFT accounts for approximately 70% of trades in the United States, 35–40% of trades in Europe, and 20–25% of trades in Australia, although it comprises 50% of trades made on the new Chi-X market. Yet it is not only the dominance of HFT in some jurisdictions that has prompted increased regulatory concern, but also events such as the 6 May 2010 ‘Flash Crash’ and the 1 August 2012 Knight Capital ‘technology breakdown’. In both cases, algorithmic trading programs triggered flow-on trading that destabilised the prices of individual securities and entire markets. In the case of the ‘Flash Crash’, one sell order triggered more than 20,000 trades in more than 300 securities at prices more than 60% away from their previous levels – causing the Dow Jones to record its biggest intraday decline to that date.

However, efforts to regulate this area have been hampered by difficulties in defining exactly what is meant by HFT, given that the term is used to describe a range of trading strategies. As IOSCO’s Report on Regulatory Issues Raised by the Impact of Technological Changes on Market Integrity and Efficiency argues, each of these strategies has “a different market impact and hence [raises] different regulatory issues”. Furthermore, the ‘technological arms race’ being engaged in by HFT firms means that regulators’ ability to understand, let alone regulate, the impact of HFT or algorithmic trading has been significantly hindered by the constant evolution of practices in this field.

The scarcity of empirical research into this area has only magnified regulators’ struggle to implement appropriate policies, particularly given that heated debate continues to rage over HFT’s impact on a market. Whereas its supporters argue that HFT increases market liquidity, its detractors contend that it adversely affects market efficiency by moving market prices away from the fundamental value of the securities traded. Similarly, market participants and commentators are divided on whether the ‘technological arms race’ engaged in by HFT firms compromises market fairness and integrity by driving away less sophisticated retail investors. As such, regulators face not only the challenge of designing regulatory policy to appropriately govern a constantly evolving, largely undefinable, technology-driven practice, but of doing so in an environment of heated debate with little objective research to support their decisions. Given the complexity and uncertainty created by these factors, the appeal of Haldane’s hypothesis to regulators grappling with algorithmic trading and HFT would seem undeniable.

Although Haldane’s hypothesis has only been articulated relatively recently, the way it might be applied in this area is hinted at by the recent policies suggested by the Australian Securities and Investments Commission (ASIC). The difficulties ASIC has experienced in appropriately regulating these trading technologies are suggested by the fact that its most recent consultation paper – Consultation Paper 184: Australian market structure: Draft market integrity rules and guidance on automated trading (CP 184) – is the fourth paper on this topic in the last two years.

Nevertheless, the policies put forward in CP 184 would seem to be a step in the right direction, at least under Haldane’s theory of regulating complexity through simplicity. Given the uncertainty as to HFT’s effects on market fairness, efficiency and integrity, as opposed to the readily observable consequences of ‘rogue algorithms’ combined with HFT, it is unsurprising that the policies implemented by ASIC are largely directed at minimising the chance of a ‘flash crash’ occurring on an Australian market. To achieve this, ASIC has designed what can essentially be described as two ‘tiers’ of regulation. The first ‘tier’, which is yet to be finalised, is intended to be imposed on market participants through the amendment of Parts 5.5–5.7 of the ASIC Market Integrity Rules (ASX Market) and ASIC Market Integrity Rules (Chi-X Australia Market). These rules attempt to reduce the likelihood of erroneous automated trades by making participants responsible for the quality of the algorithms driving their automated trades, and by imposing stiff penalties on participants who fail to ensure that their trades do not adversely affect the market. The second ‘tier’ of regulation, imposed on market operators, has already been implemented through amendments to the ASIC Market Integrity Rules (Competition in Exchange Markets). This layer of regulation essentially functions as a back-up to the obligations imposed on market participants, in that it requires market operators to have in place ‘anomalous order thresholds’ and ‘extreme cancellation ranges’. In practice, these requirements oblige Australian market operators to have ‘kill switches’ in place to deal with extreme volatility caused by ‘rogue’ algorithms that make it through market participants’ internal regulation.
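The interplay between a per-order check and a backstop kill switch can be sketched in a few lines of code. The sketch below is purely illustrative: the class names, the 10% deviation threshold, and the three-strike halt rule are hypothetical assumptions chosen for clarity, not figures drawn from ASIC’s actual market integrity rules.

```python
from dataclasses import dataclass


@dataclass
class Order:
    symbol: str
    price: float


class AnomalousOrderFilter:
    """Illustrative sketch: block orders priced too far from a reference
    price, and trip a 'kill switch' after repeated anomalies.
    Thresholds are invented for this example, not taken from ASIC rules."""

    def __init__(self, threshold_pct: float = 10.0, max_strikes: int = 3):
        self.threshold_pct = threshold_pct  # max % deviation from reference
        self.max_strikes = max_strikes      # anomalies tolerated before halting
        self.strikes = 0
        self.halted = False

    def check(self, order: Order, reference_price: float) -> bool:
        """Return True if the order may proceed, False if it is blocked."""
        if self.halted:
            return False  # kill switch tripped: nothing gets through
        deviation = abs(order.price - reference_price) / reference_price * 100
        if deviation > self.threshold_pct:
            self.strikes += 1
            if self.strikes >= self.max_strikes:
                self.halted = True  # backstop: halt all further trading
            return False
        return True
```

The design mirrors the two-tier idea in miniature: the threshold check plays the role of participant-level quality control on each order, while the halt flag acts as the operator-level back-up that engages only when the first tier has repeatedly failed.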

It may be the case that the actual regulatory framework introduced by ASIC – and in particular the compliance obligations imposed on market participants and operators – is not so ‘simple’ in practice. No doubt those charged with the task of demonstrating compliance will not think so. But the fundamental idea – governing this area through a combination of quality-control mechanisms and backup kill switches to mitigate the effects of any failure in those mechanisms – is, at its heart, a comparatively simple one. In this way, ASIC’s efforts to regulate automated trading suggest that Haldane’s hypothesis – that complexity should be governed by simplicity – has an important role to play beyond his application of it to the Basel reforms.
