Old news for pros, a good read for everybody. This piece is from 2011; I'll link to the current issue after the jump.
From the London Review of Books (HT: History Squared):
What goes on in stock markets appears quite different when viewed on different timescales. Look at a whole day’s trading, and market participants can usually tell you a plausible story about how the arrival of news has changed traders’ perceptions of the prospects for a company or the entire economy and pushed share prices up or down. Look at trading activity on a scale of milliseconds, however, and things seem quite different.
When two American financial economists, Joel Hasbrouck and Gideon Saar, did this a couple of years ago, they found strange periodicities and spasms. The most striking periodicity involves large peaks of activity separated by almost exactly 1000 milliseconds: they occur 10-30 milliseconds after the ‘tick’ of each second. The spasms, in contrast, seem to be governed not directly by clock time but by an event: the execution of a buy or sell order, the cancellation of an order, or the arrival of a new order. Average activity levels in the first millisecond after such an event are around 300 times higher than normal. There are lengthy periods – lengthy, that’s to say, on a scale measured in milliseconds – in which little or nothing happens, punctuated by spasms of thousands of orders for a corporation’s shares and cancellations of orders. These spasms seem to begin abruptly, last a minute or two, then end just as abruptly.
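The clock-tick clustering described here can be made concrete: given a stream of order-book events timestamped in milliseconds, binning each timestamp by its offset within the second exposes any 1000-millisecond periodicity. A minimal sketch, using synthetic data invented purely to illustrate the method (this is not Hasbrouck and Saar's actual dataset or code):

```python
from collections import Counter

# Synthetic event timestamps in milliseconds (illustrative only --
# real data would come from an exchange message feed). We mimic the
# pattern described above: a burst of activity 10-30 ms after each
# second's "tick", plus one background event per second.
events = []
for second in range(100):
    base = second * 1000
    events.extend(base + offset for offset in range(10, 31))  # tick-driven burst
    events.append(base + 500)                                 # background event

# Histogram of each event's offset within its second: a spike in the
# 10-30 ms bins is the signature of the 1000 ms periodicity.
offsets = Counter(t % 1000 for t in events)
in_burst = sum(offsets[o] for o in range(10, 31))
print(in_burst / len(events))  # ~0.95 of events sit just after the tick
```

The same histogram on real message data would show the peaks the authors found; on genuinely event-driven activity the offsets would be roughly uniform.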
Little of this has to do directly with human action. None of us can react to an event in a millisecond: the fastest we can achieve is around 140 milliseconds, and that’s only for the simplest stimulus, a sudden sound. The periodicities and spasms found by Hasbrouck and Saar are the traces of an epochal shift. As recently as 20 years ago, the heart of most financial markets was a trading floor on which human beings did deals with each other face to face. The ‘open outcry’ trading pits at the Chicago Mercantile Exchange, for example, were often a mêlée of hundreds of sweating, shouting, gesticulating bodies. Now, the heart of many markets (at least in standard products such as shares) is an air-conditioned warehouse full of computers supervised by only a handful of maintenance staff.
The deals that used to be struck on trading floors now take place via ‘matching engines’, computer systems that process buy and sell orders and execute a trade if they find a buy order and a sell order that match. The matching engines of the New York Stock Exchange, for example, aren’t in the exchange’s century-old Broad Street headquarters with its Corinthian columns and sculptures, but in a giant new 400,000-square-foot plain-brick data centre in Mahwah, New Jersey, 30 miles from downtown Manhattan. Nobody minds you taking photos of the Broad Street building’s striking neoclassical façade, but try photographing the Mahwah data centre and you’ll find the police quickly taking an interest: it’s classed as part of the critical infrastructure of the United States.
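The core matching logic is simple enough to sketch. A minimal price-time-priority engine (the class and method names below are my own invention, not any exchange's actual system) keeps resting buy and sell orders in two heaps and crosses an incoming order against the best opposite quote for as long as prices overlap:

```python
import heapq

# A toy matching engine with price-time priority. Bids are a max-heap
# (prices negated), asks a min-heap; ties at a price are broken by
# arrival order (seq). An incoming order trades while it crosses the
# best opposite quote, and any remainder rests on the book.
class MatchingEngine:
    def __init__(self):
        self.bids = []    # entries: (-price, seq, qty)
        self.asks = []    # entries: (price, seq, qty)
        self.seq = 0
        self.trades = []  # (price, qty) of executed fills

    def submit(self, side, price, qty):
        self.seq += 1
        book, opp = (self.bids, self.asks) if side == "buy" else (self.asks, self.bids)
        while qty and opp:
            key, rest_seq, rest_qty = opp[0]
            best = abs(key)  # recover price from possibly-negated key
            crossed = price >= best if side == "buy" else price <= best
            if not crossed:
                break
            fill = min(qty, rest_qty)
            self.trades.append((best, fill))
            qty -= fill
            if fill == rest_qty:
                heapq.heappop(opp)
            else:
                # Only qty changes, so the heap invariant is preserved.
                opp[0] = (key, rest_seq, rest_qty - fill)
        if qty:  # remainder rests on the book
            heapq.heappush(book, (-price if side == "buy" else price, self.seq, qty))

engine = MatchingEngine()
engine.submit("sell", 100.0, 50)
engine.submit("sell", 101.0, 50)
engine.submit("buy", 100.5, 80)  # fills 50 @ 100.0, rests 30 @ 100.5
print(engine.trades)             # [(100.0, 50)]
```

Real engines add order cancellation, many order types, and microsecond-level performance engineering, but the match-or-rest loop is the heart of it.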
Human beings can, and still do, send orders from their computers to the matching engines, but this accounts for less than half of all US share trading. The remainder is algorithmic: it results from share-trading computer programs. Some of these programs are used by big institutions such as mutual funds, pension funds and insurance companies, or by brokers acting on their behalf. The drawback of being big is that when you try to buy or sell a large block of shares, the order typically can’t be executed straightaway (if it’s a large order to buy, for example, it will usually exceed the number of sell orders in the matching engine that are close to the current market price), and if traders spot a large order that has been only partly executed they will change their own orders and their price quotes in order to exploit the knowledge. The result is what market participants call ‘slippage’: prices rise as you try to buy, and fall as you try to sell.
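The slippage mechanism can be shown with a toy ask book (the prices and depths below are invented for illustration): a buy order larger than the depth at the best ask "walks the book", filling successive slices at progressively worse prices, so the average fill price rises with order size.

```python
# Illustrative ask book: resting sell orders as (price, shares),
# best price first. Numbers are made up for the example.
ask_book = [(100.00, 200), (100.05, 300), (100.10, 500)]

def market_buy(book, qty):
    """Walk the ask book and return the average price paid per share."""
    cost = filled = 0
    for price, depth in book:
        take = min(qty - filled, depth)
        cost += take * price
        filled += take
        if filled == qty:
            break
    return cost / filled

small = market_buy(ask_book, 100)  # fits inside the best ask: no slippage
large = market_buy(ask_book, 800)  # walks three price levels
print(small, large)                # the large order's average price is worse
```

The difference between the two averages is the slippage a naive large order pays, and it is exactly what execution algorithms try to minimise.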
In an attempt to get around this problem, big institutions often use ‘execution algorithms’, which take large orders, break them up into smaller slices, and choose the size of those slices and the times at which they send them to the market in such a way as to minimise slippage. For example, ‘volume participation’ algorithms calculate the number of a company’s shares bought and sold in a given period – the previous minute, say – and then send in a slice of the institution’s overall order whose size is proportional to that number, the rationale being that there will be less slippage when markets are busy than when they are quiet. The most common execution algorithm, known as a volume-weighted average price or VWAP algorithm (it’s pronounced ‘veewap’), does its slicing in a slightly different way, using statistical data on the volumes of shares that have traded in the equivalent time periods on previous days. The clock-time periodicities found by Hasbrouck and Saar almost certainly result from the way VWAPs and other execution algorithms chop up time into intervals of fixed length.
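The VWAP slicing step can be sketched under a simplifying assumption: the historical intraday volume profile is given as a list, and each interval receives a share of the parent order proportional to its share of historical volume. The numbers are illustrative, and real VWAP algorithms layer on much more (limit prices, real-time adjustment):

```python
# Parent order and an assumed historical volume profile: shares that
# typically trade in each intraday interval (e.g. the U-shaped pattern
# of busy opens and closes). Figures are invented for illustration.
parent_order = 10_000
profile = [900, 400, 300, 400, 1000]

# Each interval's slice is proportional to its share of historical
# volume; integer division leaves a residue, added to the final slice.
total = sum(profile)
slices = [parent_order * v // total for v in profile]
slices[-1] += parent_order - sum(slices)
print(slices)  # [3000, 1333, 1000, 1333, 3334]
```

A volume-participation algorithm differs only in using the *realised* volume of the last interval, observed live, in place of the historical profile.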
The goal of execution algorithms is to avoid losing money while trading. The other major classes of algorithm are designed to make money by trading, and it is their operation that gives rise to the spasms found by Hasbrouck and Saar.

...MORE
Also at the LRB: