In the time that it takes you to read this sentence, approximately 10,000 equity orders and quotes will be sent to the main European equity venues. Approximately 75% of those messages, according to the Financial Times, will be high-frequency trades: short-term positions held for milliseconds or, at most, seconds, every one of them automatically generated and tracked by a computer program.
This trend is not slowing. By some estimates, the volume of these trades is growing at 50% year over year. With numbers like these, it is clear that the stock market of the future is, by and large, a world of automation. Increasingly, the role of humans will be relegated to the inception, oversight and optimisation of the programs that generate and execute these transactions.
For those prone to Orwellian paranoia, fear not: Organisations that survive the transition will enjoy more opportunities to increase revenue with lower risk. But to get there, financial services organisations will need to spend the next several years investing in real-time monitoring and risk control systems that can handle the analysis of such high-volume data.
Upgrading for Speed and High Volume
Today, the pressure is on companies to track, analyse and derive usable intelligence from the vast information flow that high-frequency trading leaves in its wake. To do that, they must act quickly to upgrade their technology infrastructure and analytical systems.
These upgrades primarily fall into three categories:
· Larger and faster hardware that can perform the necessary analysis with low latency, feeding trading signals back into order-execution platforms.
· Larger storage arrays to maintain the tremendous volume of trading history that high-frequency trading both creates and relies upon for the back-testing of new strategies.
· Novel software systems capable of real-time risk control and compliance monitoring.
Upgrading processing power and storage capacity is critical to the equation, but those endeavours are straightforward enough. Risk control and compliance, on the other hand, require a willingness to challenge the status quo and to rethink, from the ground up, the way the company monitors its data. Resistance to change and cultural entrenchment are typical at this level of transformation. Fortunately, the size of the opportunity will keep most firms from indulging such distractions.
This new breed of technology is aptly named complex event processing (CEP): it aggregates vast stores of data and correlates many complex events to guide revenue optimisation and risk reduction. It won’t be long before CEP is found everywhere in the financial services industry and beyond. In five years’ time, the way that capital markets firms manage risk will look nothing like it does today, and high-frequency trading will have been a primary factor in the transformation.
Managing Risk Through Complex Events
The development of new high-frequency trading strategies takes time, and yet their shelf life can be very short – weeks or even days. The strategies must be continually optimised, and the best way of accomplishing that is automation: a platform that enables strategies to be added, updated and switched in or out according to market conditions. By making it easy to develop new and improved trading algorithms, and by providing a robust historical model against which to back-test them, CEP helps quants – the men and women who develop trading algorithms – be more effective and productive.
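The idea of strategies that can be added, swapped and back-tested without touching the engine can be sketched in a few lines. Everything here is illustrative, not a real CEP product API: the tick format, the toy strategies and their fixed 100.0 reference price, and the naive one-position P&L tally are all assumptions made for the sake of the example.

```python
# Hypothetical sketch of hot-swappable trading strategies with back-testing.
# Names (Strategy, momentum, mean_reversion, the tick layout) are illustrative.
from typing import Callable, Dict, List

Tick = Dict[str, float]           # e.g. {"price": 101.2, "volume": 500}
Strategy = Callable[[Tick], int]  # returns -1 (sell), 0 (hold), +1 (buy)

def momentum(tick: Tick) -> int:
    """Toy rule: go long above a fixed threshold, short below it."""
    return 1 if tick["price"] > 100.0 else -1

def mean_reversion(tick: Tick) -> int:
    """Toy rule: fade moves away from a fixed reference price."""
    return -1 if tick["price"] > 100.0 else 1

# Registry of live strategies: adding, updating or retiring one is a
# dictionary operation, with no change to the back-testing engine itself.
STRATEGIES: Dict[str, Strategy] = {
    "momentum": momentum,
    "mean_reversion": mean_reversion,
}

def back_test(name: str, history: List[Tick]) -> float:
    """Replay historical ticks through a strategy and tally a naive P&L."""
    strategy = STRATEGIES[name]
    pnl, position, last_price = 0.0, 0, None
    for tick in history:
        if last_price is not None:
            pnl += position * (tick["price"] - last_price)
        position = strategy(tick)        # strategy decides the next position
        last_price = tick["price"]
    return pnl
```

Because the strategies share one signature, a new algorithm goes live (or into back-testing) the moment it is registered, which is the "switched in or out according to market conditions" property described above.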
It isn’t just quants who benefit. What CEP really provides is a single, comprehensive and up-to-date view of the truth, delivered precisely when it’s needed. By aggregating data from multiple sources – for example, merging the latest market risk and credit risk views with the latest movements of the market – CEP provides much better insight into trading activity as it happens.
Critical characteristics of CEP include the following:
· The application of complex, sophisticated logic to disparate streams of data
· The capability to process extreme data flow volumes in real time
· Support for the rapid deployment and evolution of new applications without involving back-office personnel
· Low latency for every tick that traverses the system
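The characteristics above can be illustrated with a minimal, stream-merging event loop: two time-stamped streams (market trades and risk-limit updates) are merged into one ordered flow, and a cross-stream condition is evaluated on every event. The stream layout, field names and the position-limit rule are assumptions for this sketch; a real CEP engine would supply its own query language and connectors.

```python
# Minimal sketch of a CEP-style loop: merge disparate event streams and
# apply logic per event against a single, current view of state.
# Event layout and field names are illustrative assumptions.
import heapq
from typing import Iterable, List, Tuple

Event = Tuple[float, str, dict]  # (timestamp, kind, payload)

def merge_streams(*streams: Iterable[Event]):
    """Lazily time-order events from several sources into one stream."""
    yield from heapq.merge(*streams, key=lambda e: e[0])

def run(events: Iterable[Event]) -> List[Tuple[float, int, float]]:
    """Emit an alert whenever the net position breaches the latest limit."""
    alerts = []
    limit = float("inf")           # current position limit from the risk feed
    position = 0                   # net position from the trade feed
    for ts, kind, payload in events:
        if kind == "limit":        # risk-view update arrives mid-stream
            limit = payload["max_position"]
        elif kind == "trade":      # market activity
            position += payload["qty"]
        if abs(position) > limit:  # condition spanning both streams
            alerts.append((ts, position, limit))
    return alerts
```

The per-event check is what keeps latency low: each tick is evaluated as it traverses the system, rather than in a later batch job handled by back-office systems.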
Putting on the Brakes
Not surprisingly, the sheer volume of automated trades has not gone unnoticed by regulators. U.S. regulators found that high-frequency trading exacerbated last year’s Flash Crash, though they stopped short of naming it the primary cause. In Europe there is talk of measures to suppress a portion of high-frequency trading.
Such rumblings are unlikely to result in actions that would significantly alter the current trajectory; however, if more stringent regulations are indeed on the way, the role of CEP in the future of the financial services industry will only get bigger. CEP is ideally positioned to allow both trading firms and the exchanges to rapidly comply with those regulations.
Regardless of what regulators may decide about high-frequency trading, there’s no doubt that CEP is the right choice to lower risk in our high-volume world.
Nick Deacon is Senior Director, EMEA Sales, at Sybase, an SAP Company.