
High Volume, Low Anxiety

In the time it takes you to read this sentence, approximately 10,000 equity orders and quotes will be sent to the main European equity venues. According to the Financial Times, roughly 75% of those messages will be high-frequency trades: short-term positions held for milliseconds or, at most, seconds, every one of them automatically generated and tracked by a computer program.

This trend is not slowing. By some estimates, the volume of these trades is growing by 50% year over year. With numbers like these, it is clear that the stock market of the future is, by and large, a world of automation. Increasingly, the role of humans will be confined to the inception, oversight and optimisation of the programs that generate and execute these transactions.

For those prone to Orwellian paranoia, fear not: organisations that survive the transition will enjoy more opportunities to increase revenue at lower risk. But to get there, financial services organisations will need to spend the next several years investing in real-time monitoring and risk control systems that can handle the analysis of such high-volume data.

Upgrading for Speed and High Volume

Today, the pressure is on companies to track, analyse and derive usable intelligence from the vast information flow that high-frequency trading leaves in its wake. To do that, they must act quickly to upgrade their technology infrastructure and analytical systems.

These upgrades primarily fall into three categories:

· Larger and faster hardware that can perform the necessary analysis with low latency, allowing trading signals to be fed back into the order execution platforms.

· Larger storage arrays to maintain the tremendous volume of trading history that high-frequency trading creates and also relies upon for the back testing of new strategies.

· Novel software systems capable of real-time risk control and compliance monitoring.

Upgrading processing power and storage capacity is critical to the equation, but those endeavours are straightforward enough. Risk control and compliance, on the other hand, require a willingness to challenge the status quo and re-evaluate the way the company handles data monitoring from the ground up. Resistance to change and the associated cultural entrenchment are typical at this level of transformation. Fortunately, the size of the opportunity will keep most firms from succumbing to such distractions.

This new breed of technology is aptly named complex event processing (CEP): it aggregates vast stores of data and correlates many discrete events into the best possible guidance for revenue optimisation and risk reduction. It won’t be long before CEP is found everywhere in the financial services industry and beyond. In five years’ time, the way that capital markets firms manage risk will look nothing like it does today, and high-frequency trading will have been a primary factor in the transformation.
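
To make the idea concrete, here is a minimal sketch of a CEP-style rule in Python. It is illustrative only, not Sybase's product API: all names, thresholds and fields are hypothetical. The rule watches a tick stream and emits a derived "complex event" whenever an instrument moves more than 1% within a one-second sliding window.

    from collections import deque

    WINDOW_SECS = 1.0
    MOVE_THRESHOLD = 0.01  # hypothetical rule: 1% move within the window

    windows = {}  # instrument -> deque of (timestamp, price)

    def on_tick(instrument, timestamp, price):
        """Process one tick and fire a derived event when the rule matches."""
        window = windows.setdefault(instrument, deque())
        window.append((timestamp, price))
        # Evict ticks that have fallen out of the sliding window.
        while window and timestamp - window[0][0] > WINDOW_SECS:
            window.popleft()
        low = min(p for _, p in window)
        high = max(p for _, p in window)
        if low > 0 and (high - low) / low > MOVE_THRESHOLD:
            # In production this would feed a dashboard or an execution
            # platform; here the derived event is simply printed.
            print(f"{timestamp:.3f} {instrument}: {low} -> {high} in {WINDOW_SECS}s")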

Managing Risk Through Complex Events

The development of new high-frequency trading strategies takes time, and yet the shelf life of these strategies can be very short – weeks or even days. The strategies must be continually optimised, and the best way to accomplish that is automation: a platform that enables strategies to be added, updated and switched in or out according to market conditions. By making it easy to develop new and improved trading algorithms, and by providing a robust historical model against which to back test them, CEP helps quants – the men and women who develop trading algorithms – be more effective and productive.
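
As a rough illustration of that switching platform, the sketch below (hypothetical names throughout, not any vendor's actual interface) shows strategies being deployed, hot-swapped and retired while the engine keeps consuming ticks.

    class Strategy:
        def __init__(self, name, on_tick):
            self.name = name
            self.on_tick = on_tick  # callable(tick) -> order or None

    class Engine:
        def __init__(self):
            self.active = {}  # strategy name -> Strategy

        def deploy(self, strategy):
            self.active[strategy.name] = strategy  # add, or hot-swap in place

        def retire(self, name):
            self.active.pop(name, None)  # switch a stale strategy out

        def handle(self, tick):
            for strategy in list(self.active.values()):
                order = strategy.on_tick(tick)
                if order is not None:
                    print(f"{strategy.name} -> {order}")

    # Deploy a trivial momentum rule, then retire it once it goes stale.
    engine = Engine()
    engine.deploy(Strategy("momo-v1", lambda t: "BUY" if t["ret"] > 0 else None))
    engine.handle({"ret": 0.002})
    engine.retire("momo-v1")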

It isn’t just quants who benefit. What CEP really provides is a single, comprehensive and up-to-date view of the truth, delivered precisely when it’s needed. By aggregating data from multiple sources – for example, merging the latest market risk and credit risk views with the latest movements of the market – CEP provides much better insight into trading activity as it happens.
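
A hedged sketch of that aggregation, with entirely hypothetical limits, positions and field names: each trade is enriched, as it happens, with the latest credit-risk and market-risk views, so that breaches surface the moment they occur.

    credit_limits = {"ACME": 1_000_000}        # latest credit-risk view
    market_risk = {"XYZ": {"var_1d": 50_000}}  # latest market-risk view
    positions = {("ACME", "XYZ"): 900_000}     # running exposure

    def on_trade(counterparty, instrument, notional):
        """Merge one trade with the current risk views and flag breaches."""
        exposure = positions.get((counterparty, instrument), 0) + notional
        limit = credit_limits.get(counterparty, 0)
        if exposure > limit:
            print(f"ALERT: {counterparty} exposure {exposure} breaches limit {limit}")
        positions[(counterparty, instrument)] = exposure
        return {"exposure": exposure, "limit": limit,
                "var_1d": market_risk.get(instrument, {}).get("var_1d")}

    print(on_trade("ACME", "XYZ", 200_000))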

Critical characteristics of CEP include the following:

· The application of complex, sophisticated logic to disparate streams of data

· The capability to process extreme data flow volumes in real time

· Support for the rapid deployment and evolution of new applications without involving back-office personnel

· Low latency for every tick that traverses the system
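
The last characteristic is also the easiest to check. One simple (and admittedly simplistic) way, sketched below under the assumption that the per-tick logic is a plain Python callable, is to timestamp each tick on entry and exit and track the worst case; a production system would record full latency histograms instead.

    import time

    worst_ns = 0

    def timed_handle(tick, logic):
        """Run the per-tick logic and record its processing latency."""
        global worst_ns
        start = time.perf_counter_ns()
        logic(tick)
        worst_ns = max(worst_ns, time.perf_counter_ns() - start)

    # Even a no-op rule lets you measure the framework's own overhead.
    timed_handle({"price": 101.5}, lambda t: None)
    print(f"worst per-tick latency so far: {worst_ns} ns")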

Putting on the Brakes

Not surprisingly, the sheer volume of automated trades has not gone unnoticed by regulators. Regulators in the U.S. found that high-frequency trading exacerbated last year’s Flash Crash, though they stopped short of identifying it as the primary cause. In Europe, there is talk of measures to suppress a percentage of high-frequency trading.

Such rumblings are unlikely to result in actions that would significantly alter the current trajectory; however, if more stringent regulations are indeed on the way, the role of CEP in the future of the financial services industry will only get bigger. CEP is ideally positioned to allow both trading firms and the exchanges to rapidly comply with those regulations.
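
For instance, if a regulator were to cap order-to-trade ratios (a purely hypothetical rule here, with an invented threshold), the check reduces to one more CEP rule running over the same message stream:

    from collections import defaultdict

    MAX_ORDER_TO_TRADE = 100  # invented cap: 100 orders per executed trade

    orders = defaultdict(int)
    trades = defaultdict(int)

    def on_message(firm, kind):
        """Count orders against executions and flag breaching firms."""
        if kind == "order":
            orders[firm] += 1
        elif kind == "trade":
            trades[firm] += 1
        if orders[firm] / max(trades[firm], 1) > MAX_ORDER_TO_TRADE:
            print(f"COMPLIANCE: {firm} exceeds order-to-trade cap")

    # 101 orders with no executions trips the hypothetical cap.
    for _ in range(101):
        on_message("FIRM-A", "order")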

Regardless of what regulators may decide about high-frequency trading, there’s no doubt that CEP is the right choice to lower risk in our high-volume world.

Nick Deacon is Senior Director, EMEA Sales, at Sybase, an SAP Company.
