Are PCs to blame for the financial mess we're in?

Virtual money crash
The first electronic exchange (NASDAQ) is already over 40 years old

On 6 May 2010, the Dow Jones Industrial Average - the second oldest US market index and one of the most commonly used indices to reflect the state of the market as a whole - saw its biggest and fastest decline ever.

By 2:42pm that day, the Dow had dropped 300 points since the start of trading that morning. By 2:47pm, a mere five minutes later, it had fallen by a further 600 points. Suddenly, hundreds of stocks saw their prices slashed to almost zero. The crash instantly wiped out almost $1 trillion in stock value.

Even major blue chip companies weren't safe from its effects: Accenture shares fell from around $40 to just $0.01 - a drop of more than 99 per cent. Even more amazingly, just 20 minutes later the market had bounced back up to almost its previous level.

This major wobble in global economic markets sent shockwaves throughout the financial world, mainly because it wasn't at all clear what had caused it. A year on, and despite lengthy investigations by the US Securities and Exchange Commission (SEC) and the Commodity Futures Trading Commission (CFTC), it still isn't clear, but one likely contributing factor is high frequency algorithmic trading.

That's the buying and selling of shares not by market traders, but by financial software that automatically analyses the financial market and buys or sells shares based on that information at lightning speed. The idea of a broker on the phone shouting, 'Buy, buy, sell, sell!' is far removed from how a lot of people now trade on the stock market.

In some markets, more than half of the trading is done by computer programs, which use algorithms to decide whether to buy or sell stock. Generally, these are simple systems that are able to generate multiple small profits by systematically trading on electronic markets in the blink of an eye. The algorithms are usually written by proprietary trading firms and sold on to other traders.

Within the algorithm will be certain values that determine exactly when and where to trade: the price of a security, the size of the fund available, timing considerations such as how quickly an order can be executed and the best time to place an order to make sure it will be executed, how likely an order is to be filled, and the overall cost of each transaction. Put simply, if the algorithm's conditions are met by a particular market scenario, the shares are traded automatically in a matter of milliseconds (and this speed is dropping all the time).
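
To make that concrete, here's a minimal, hypothetical sketch (in Python) of the kind of rule such an algorithm might encode. The names, thresholds and figures are invented for illustration, not taken from any real trading system.

    # A minimal, hypothetical sketch of the kind of rules described above.
    # All names and thresholds are illustrative, not taken from a real system.
    from dataclasses import dataclass

    @dataclass
    class Quote:
        symbol: str
        bid: float               # best price buyers are currently offering
        ask: float                # best price sellers are currently asking
        fill_probability: float   # estimated chance an order at this price gets filled

    def decide_order(quote, target_price, cash_available, cost_per_trade,
                     min_fill_probability=0.9):
        """Return an order size in shares, or 0 if the algorithm's conditions aren't met."""
        # Only buy if the stock is cheaper than our valuation and the order is likely to fill.
        if quote.ask >= target_price or quote.fill_probability < min_fill_probability:
            return 0
        # The expected edge across the whole order must cover the cost of the transaction.
        expected_edge = target_price - quote.ask
        affordable_shares = int(cash_available // quote.ask)
        if affordable_shares == 0 or expected_edge * affordable_shares <= cost_per_trade:
            return 0
        return affordable_shares

    # Example: buy if the ask is below our $20.00 valuation and the order is likely to fill.
    quote = Quote("XYZ", bid=19.90, ask=19.95, fill_probability=0.95)
    print(decide_order(quote, target_price=20.00, cash_available=10_000, cost_per_trade=5.0))

In a real system a check like this would run automatically on every price update coming from the exchange, rather than once.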

Remember that this process is entirely automatic, with no human involvement apart from what was laid down in the algorithm by the software's programmer. Some people are even buying these programs without knowing what the algorithms contain, merely trusting that they will do what they say they will do.

This is known as black box trading, which refers to the fact that you can't look into the black box to see how it works. Other firms create their own personalised algorithms, and these are evolving into much more complex programs over time as more and more information and safeguards are incorporated into their development.

The average lifetime of these trading algorithms tends to be extremely short - sometimes only a couple of days. So useful are these algorithms that they are becoming a very valuable commodity in themselves - in fact, there have already been two convictions in the US courts of people who stole proprietary code worth millions of dollars from these types of programs.

The impact that these trading algorithms could have on economies was made worryingly clear by Assistant United States Attorney Joseph Facciponti when describing the case against former Goldman Sachs employee Sergey Aleynikov, who was convicted of such a theft earlier this year.

"The bank has raised the possibility that there is a danger that somebody who knew how to use this program could use it to manipulate markets in unfair ways," Facciponti said.

The dawn of automation

Clearly then, these algorithms have the power to affect the financial markets, but should we be worried about them? Don't they just provide another level of automation to make the traders' lives easier? Surely they aren't actually causing any problems to the financial markets by themselves?

Andy Haldane, the Bank of England's Executive Director in charge of Financial Stability, isn't so sure. As he put it in a speech delivered in Beijing in June 2011:

"Driven by a potent cocktail of technology and regulation, trading in financial markets has evolved dramatically during the course of this century. Platforms for trading equities have proliferated and fragmented. And the speed limit for trading has gone through the roof. This rapidly changing topology of trading raises some big questions for risk management. There are good reasons, theoretically and empirically, to believe that while this evolution in trading may have brought benefits such as a reduction in transaction costs, it may also have increased abnormalities in the distribution of risk and return in the financial system. Such abnormalities hallmarked the Flash Crash."

So what effect does he feel that high frequency algorithmic trading has had?

"High frequency trading (HFT) has had three key effects on markets," explains Haldane. "First, it has meant ever-larger volumes of trading have been compressed into ever-smaller chunks of time. Second, it has meant strategic behaviour among traders is occurring at ever-higher frequencies. Third, it's not just that the speed of strategic interaction has changed but also its nature. Yesterday, interaction was human-to-human. Today, it's machine-to-machine, algorithm-to-algorithm. For algorithms with the lifespan of a ladybird, this makes for rapid evolutionary adaptation."

When Haldane talks about a massive change in the way financial markets are run, he's not joking. There really has been a change caused by the increase in the speed of what's possible - and it's come around fast. Certainly too fast for all the implications to be fully realised, as Haldane explains:

"The average speed of order execution on the US New York Stock Exchange has fallen from around 20 seconds a decade ago to around one second today." And that's just the average. As the use of high frequency trading increases, that average is going to decrease considerably.

As Haldane explains, electronic trading itself is fast approaching light speed - the speed limit of the universe.

"A decade ago, execution times on some electronic trading platforms dipped decisively below the one-second barrier. As recently as a few years ago, trade execution times reached 'blink speed' - as fast as the blink of an eye. At the time, that seemed eye-watering at around 300-400 milliseconds, or less than a third of a second, but more recently the speed limit has shifted from milliseconds to microseconds - millionths of a second. Several trading platforms now offer trade execution measured in microseconds.

"As of today, the lower limit for trade execution appears to be around 10 microseconds. This means it would, in principle, be possible to execute around 40,000 back-to-back trades in the blink of an eye. If supermarkets ran high frequency trading programmes, the average household could complete its shopping for a lifetime in under a second. Imagine that."

The rise of HFT

Of course, the fact that HFT allows you to do a lot of trades quickly isn't necessarily a problem, but we do need to consider how much trading is being done automatically, without human intervention. The fear here is that soon the basis of the financial markets may change.

If investors start to believe that stocks aren't valued on a company's expected future earnings, but actually get their value from computers trading against other computers for speed and advantage, then the financial markets will seem like too much of a gamble to invest in. So how much trading nowadays is being done solely by computers?

Unsurprisingly, given the edge it offers over other traders in terms of speed and even knowledge, the share of trading done this way is steadily increasing. For example, as Haldane explains, "In Europe, since 2005, HFT has risen from a tiny share to represent over 35 per cent of the equity market. In Asia and in emerging markets, it's growing fast from a lower base. What's true across countries is also true across markets.

"HFT is assuming an ever-increasing role in debt and foreign exchange markets. In some futures markets, it already accounts for almost half of turnover. In the space of a few years, HFT has risen from relative obscurity to absolute hegemony, at least in some markets."

Changing the game plan

That's not all algorithmic trading has done - it's also changed the way the financial game is played. In the past - a mere decade ago, before computer automation - traders and big financial institutions would make money by dealing in reasonably simple instruments: bonds, equities and foreign exchange.

As automation kicked in, the profit margins on these trades were subsequently driven down. This had the knock-on effect of encouraging these firms to reduce their own costs and find competitive advantage by automating as many of their simple trades as possible.

Whereas in the past there would be hundreds of traders buying or selling small amounts of shares in an effort not to tip off the competition as to what they were doing, now this can be handled by computers, gradually buying or selling small numbers of shares over controlled time periods.

The massive increase in speed provided by these automatic processes also allowed the big financial houses to see other places where algorithms could be employed to generate wealth, places where the split second timing could allow inefficiencies in financial reporting to be exploited.

One example is arbitrage opportunities. Here, assets that are essentially identical are bought and sold at the same time to profit from a price difference between the two. So if the same stock is listed at $20 on the New York Stock Exchange and $19.95 on the Philadelphia Stock Exchange, you can guarantee yourself a profit by buying the stock on the Philadelphia Stock Exchange and immediately selling it on the New York Stock Exchange (provided the difference offsets any dealing costs).
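
As a rough sketch of that calculation, using the illustrative prices above (the share count and dealing fee below are assumptions made up for the example):

    # Hypothetical arbitrage check using the illustrative prices above.
    def arbitrage_profit(buy_price, sell_price, shares, fees):
        """Profit from buying on the cheaper venue and selling on the dearer one."""
        return (sell_price - buy_price) * shares - fees

    # Buy at $19.95 on one exchange, sell at $20.00 on another, for 1,000 shares.
    profit = arbitrage_profit(buy_price=19.95, sell_price=20.00, shares=1_000, fees=10.0)
    print(profit)   # 40.0 - only worth doing because the spread outweighs the dealing costs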

With algorithmic trading, even if those differences only appear for a split second, the financial houses can act on them. It's this form of algorithmic trading that has some regulators worried - they think that in searching for volatile markets to exploit, these trades could be adding to the overall volatility of the market themselves. By actively seeking out these imperfections, they exacerbate their effect on the markets.

This is the news

Another useful trick that algorithmic trading can employ is to automatically scan incoming financial news feeds for buzz words or phrases - 'interest rate hike', for example. When the computers detect these, they can instantly trigger the same dealing strategies that were employed when a similar situation occurred in the past.
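
A minimal sketch of the idea - scan each incoming headline for trigger phrases and map any match to a pre-set dealing strategy. The phrases and strategy names are invented for illustration, not drawn from any real system:

    # Minimal, illustrative sketch of keyword-driven news scanning.
    # The trigger phrases and responses are invented for the example.
    TRIGGERS = {
        "interest rate hike": "sell_rate_sensitive_stocks",
        "profit warning": "sell_affected_stock",
        "takeover bid": "buy_target_stock",
    }

    def react_to_headline(headline):
        """Return the pre-set dealing strategy matched by the headline, if any."""
        text = headline.lower()
        for phrase, strategy in TRIGGERS.items():
            if phrase in text:
                return strategy
        return None

    print(react_to_headline("Central bank signals interest rate hike next month"))
    # -> sell_rate_sensitive_stocks, triggered in microseconds rather than minutes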

This instant reaction allows firms to get the drop on their competitors and can prove very profitable. This is becoming very popular with the larger firms, and news feed providers have even changed the structure of their feeds so they can be more easily read by computers.

The problem here, though, is that if you have a lot of computer programs reacting in a very similar way as soon as news breaks, it can cause an outsized reaction in the markets - especially if this then triggers further rounds of automatic trading based on the previous automatic trading. This is a major shift in the way the financial world works.

Suddenly, it isn't humans absorbing the news and acting on it based on experience. Instead, it's just a computer going through a preset process automatically. Clearly, if this form of algorithmic trading becomes endemic, the future of some stocks and shares will rely not on how the companies are performing, but on how that performance is reported in the news.

Also, imagine the problems that could occur if the news writer got it wrong. In the fast-paced world of financial news reporting, it isn't unimaginable that a mistake could be made. If such a mistake can now automatically create a massive financial black hole, it isn't surprising that there are calls for regulation on how algorithmic trading can be used.

Running to stand still

Automatic trading has had another effect on the market - it has increased its load (the number of deals being done daily). The average trade size on equity markets in 1996 was over 1,600 shares; over the past decade it has decreased to a mere 400 shares per trade. That means four times as many trades are needed to deliver the same volume as 10 years ago.

This fourfold increase in trade data means that when something goes wrong it's a lot harder to untangle what caused what when, especially as these trades are very hard to track back to particular companies. This is why it's so hard to lay the blame for the 2010 Flash Crash at any particular door - there just isn't the paper trail needed to work out what happened.

Regulation allowing a clear audit trail to be followed may well be introduced, but despite encouraging noises from regulators this doesn't seem to be happening any time soon.

This extra load has another very important consequence - risk management suffers. It's now taking a back seat because it is almost impossible to consolidate the amount of data being generated by high-frequency trading. Number crunching simply can't keep up with the uncontrolled automation, which means it is increasingly difficult to judge the risk of an action on the markets.

Hidden trading?

It's not just confusion over risk, either - the fact that most people can't see what the algorithms they use for trading actually contain has, some argue, led to a lack of transparency in the markets.

Obviously, the people who create the algorithms need to protect their intellectual property, and so provide closed systems to the program buyers, but this means some smaller traders using HFT aren't even totally sure what the algorithms they are using for trading contain. This also means that government regulators can't see what is happening inside these black boxes either.

Some say this allows traders to hide abusive activity, and that they should have to show regulators what they are doing. Traders point out that these algorithms are increasingly becoming their lifeblood, and that sharing the secrets of their own proprietary code would give others an unfair advantage.

Of course, this means that this type of black box trading undermines one of the qualities regulators are keenest to see in the marketplace - visibility. This is compounded by the fact that the high-frequency traders with the better algorithms have access to better information than other traders, and can make a deal microseconds before anyone else. Where do the massive advantages offered by the latest algorithms fit in if markets are meant to be level playing fields?

One of the biggest questions about algorithmic trading is just how profitable it is. There are definitely big bucks being generated by established firms, but what about the newcomers who are trying to take advantage of this new type of trading?

Some industry insiders are already speculating that the profitable days of algorithmic trading are in the past (although this means profits of mere millions as opposed to billions). It seems that increasing competition in the high frequency trading world means it's becoming harder and harder to find those little inconsistencies that provided so much profit when this way of trading was first introduced.

Trading volumes have since slumped, as has volatility, leaving algorithmic trading with little to take advantage of. Speed is, of course, a very useful factor offered by automatic trading, but now, it seems, you need algorithmic trading just to keep pace with the rest of the field.