About A-Team Marketing Services
The knowledge platform for the financial technology industry

A-Team Insight Blogs

High Volume, Low Anxiety


In the time that it takes you to read this sentence, approximately 10,000 equity orders and quotes will be sent to the main European equity venues. Approximately 75% of those messages, according to the Financial Times, will be high-frequency trades: short-term positions held for a period of milliseconds or, at most, seconds, and every one of them automatically generated and tracked by a computer program.

This trend is not slowing. By some estimates, the overall volume growth of these types of trades has reached 50% year over year. With numbers like these, it is clear that the stock market of the future is, by and large, a world of automation. Increasingly, the role of humans will be relegated to the inception, oversight and optimisation of the programs that generate and execute these transactions.

For those prone to Orwellian paranoia, fear not: Organisations that survive the transition will enjoy more opportunities to increase revenue with lower risk. But to get there, financial services organisations will need to spend the next several years investing in real-time monitoring and risk control systems that can handle the analysis of such high-volume data.

Upgrading for Speed and High Volume

Today, the pressure is on companies to track, analyse and derive usable intelligence from the vast information flow that high-frequency trading leaves in its wake. To do that, they must act quickly to upgrade their technology infrastructure and analytical systems.

These upgrades primarily fall into three categories:

· Larger and faster hardware that can perform the necessary analysis with low latency, allowing trading signals to be generated back into the order execution platforms.

· Larger storage arrays to maintain the tremendous volume of trading history that high-frequency trading creates and also relies upon for the back testing of new strategies.

· Novel software systems capable of real-time risk control and compliance monitoring.

Upgrading processing power and storage capacity is critical to the equation, but those endeavours are straightforward enough. Risk control and compliance, on the other hand, require a willingness to challenge the status quo and re-evaluate, from the ground up, the way the company handles data monitoring. Resistance to change and the cultural entrenchment that accompanies it are typical at this level of transformation. Fortunately, the size of the opportunity will keep most firms from succumbing to such distractions.

This new breed of technology is aptly named complex event processing (CEP): it aggregates vast stores of data and correlates many discrete events in real time to guide revenue optimisation and risk reduction. It won’t be long before CEP is found everywhere in the financial services industry and beyond. In five years’ time, the way that capital markets firms manage risk will look nothing like it does today, and high-frequency trading will have been a primary factor in the transformation.

Managing Risk Through Complex Events

The development of new high-frequency trading strategies takes time, and yet the shelf life of these strategies can be very short – weeks or even days. The strategies must be continually optimised, and the best way to accomplish that is through automation: a platform that enables strategies to be added, updated and switched in or out according to market conditions. By making it easy to develop new and improved trading algorithms, and by providing a robust historical model against which to back-test them, CEP helps quants – the men and women who develop trading algorithms – be more effective and productive.
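To make the idea concrete, the switching mechanism described above can be sketched as a small strategy registry. This is a hypothetical illustration, not any vendor's API: the strategy names, the `MarketSnapshot` fields and the volatility threshold are all invented for the example.

```python
from dataclasses import dataclass
from typing import Callable, Dict

Signal = str  # "BUY", "SELL" or "HOLD"

@dataclass
class MarketSnapshot:
    price: float
    volatility: float  # e.g. a rolling standard deviation of returns

# Two toy strategies with deliberately simple rules.
def momentum(snap: MarketSnapshot) -> Signal:
    return "BUY" if snap.price > 100.0 else "HOLD"

def mean_reversion(snap: MarketSnapshot) -> Signal:
    return "SELL" if snap.price > 105.0 else "HOLD"

class StrategyRegistry:
    """Holds strategies and lets them be switched in or out at runtime."""

    def __init__(self) -> None:
        self._strategies: Dict[str, Callable[[MarketSnapshot], Signal]] = {}
        self._enabled: Dict[str, bool] = {}

    def add(self, name: str, fn: Callable[[MarketSnapshot], Signal]) -> None:
        self._strategies[name] = fn
        self._enabled[name] = True

    def switch(self, name: str, on: bool) -> None:
        self._enabled[name] = on

    def signals(self, snap: MarketSnapshot) -> Dict[str, Signal]:
        # Only enabled strategies contribute signals.
        return {n: f(snap) for n, f in self._strategies.items() if self._enabled[n]}

registry = StrategyRegistry()
registry.add("momentum", momentum)
registry.add("mean_reversion", mean_reversion)

# Illustrative rule: in volatile markets, swap the momentum strategy out.
snap = MarketSnapshot(price=106.0, volatility=0.8)
if snap.volatility > 0.5:
    registry.switch("momentum", False)
print(registry.signals(snap))  # {'mean_reversion': 'SELL'}
```

A real platform would add versioning and back-testing hooks around the same core idea: strategies as interchangeable units that a supervising process can enable or disable as conditions change.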

It isn’t just quants who benefit. What CEP really provides is a single, comprehensive and up-to-date view of the truth, delivered precisely when it’s needed. By aggregating data from multiple sources – for example, merging the latest market risk and credit risk views with the latest movements of the market – CEP provides much better insight into trading activity as it happens.

Critical characteristics of CEP include the following:

· The application of complex, sophisticated logic to disparate streams of data

· The capability to process extreme data flow volumes in real time

· Support for the rapid deployment and evolution of new applications without involving back-office personnel

· Low latency for every tick that traverses the system
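The characteristics above can be illustrated with a deliberately minimal CEP-style rule: a sliding window maintained over a tick stream, firing when an aggregate crosses a threshold. The window size, limit and tick volumes are assumptions chosen for the example, not parameters of any real CEP product.

```python
from collections import deque

class SlidingVolumeRule:
    """Fires when total traded volume over the last N ticks exceeds a limit."""

    def __init__(self, window: int, limit: int) -> None:
        # deque with maxlen evicts the oldest tick automatically.
        self.window = deque(maxlen=window)
        self.limit = limit

    def on_tick(self, volume: int) -> bool:
        self.window.append(volume)
        return sum(self.window) > self.limit

rule = SlidingVolumeRule(window=3, limit=1000)
alerts = [rule.on_tick(v) for v in [300, 400, 200, 600]]
print(alerts)  # [False, False, False, True]
```

Production CEP engines express rules like this declaratively and evaluate thousands of them per tick, but the pattern is the same: stateful logic applied incrementally to a stream, so each event is processed in constant time rather than by re-scanning history.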

Putting on the Brakes

Not surprisingly, the sheer volume of automated trades has not gone unnoticed by regulators. Regulators in the U.S. found that high-frequency trading exacerbated last year’s Flash Crash, but regulators stopped short of fingering high-frequency trading as the primary cause. In Europe there is talk of an effort to suppress a percentage of high-frequency trading.

Such rumblings are unlikely to result in actions that would significantly alter the current trajectory; however, if more stringent regulations are indeed on the way, the role of CEP in the future of the financial services industry will only get bigger. CEP is ideally positioned to allow both trading firms and the exchanges to rapidly comply with those regulations.

Regardless of what regulators may decide about high-frequency trading, there’s no doubt that CEP is the right choice to lower risk in our high-volume world.

Nick Deacon is Senior Director, EMEA Sales, at Sybase, an SAP Company.

