
A-Team Insight Blogs

Market Surveillance Requires Systems that are Lean, Fast and Flexible


In a highly regulated trading market that is growing in complexity, yet must demonstrate transparency, market surveillance is a significant challenge – but it can be addressed using technologies that span systems and data management, and deliver real-time alerts of potential abuse in the market.

The pressing issues of surveillance and how best to tackle them in practical terms were debated in an A-Team Group webinar entitled ‘The Data Challenge in Market Surveillance’. Andrew Delaney, editor-in-chief at A-Team Group, set the scene for discussion, describing the disciplines that play into market surveillance, including data management, risk and compliance, and big data. He also noted the drivers of surveillance, such as the growing complexity of financial institutions’ trading and
investment activities, mainstream acceptance of high-speed trading techniques, broad desire for more regulatory oversight, and difficulties in tracking vast quantities of trading data to ensure compliance with rules and regulations.

Against this backdrop he invited panel members to comment on the importance of market surveillance and how it can best be implemented. Rob Hodgkinson, director of exchange compliance and surveillance practice at First Derivatives, said: “Market surveillance responds to the need to ensure that markets are fair and transparent. It can strengthen markets, attract liquidity and narrow spreads, so prices are better and there is more transparency. If abuse is known to occur in a market, investors are likely to stay away. The situation is similar for brokers. If they feel a price is not fair, this is a deterrent to trading, which is not what the market wants.”

Describing the shift of surveillance into the front office, Gerry Buggy, executive at First Derivatives, said the only way to distinguish legitimate market colour from insider trading is to use real-time surveillance. He explained: “The front office has the technology to provide the data for this and can replay it, but if incorrect orders are placed on the market a real-time problem is caused.”

Russell Acton, vice president of the EMEA region at Pivotal, a company spun out of EMC and VMware early this year to provide a data analysis platform-as-a-service solution that replicates the IT operations of firms such as Google and Amazon, said the surveillance problem is getting larger and more extreme. He suggested: “Techniques used across other industries could be brought into financial services. It is now possible, economically, at scale and by offloading the problem onto machines, to look for patterns and activity in a way that couldn’t be done before.”

With some of the issues around surveillance established, Delaney turned to the role of regulation in driving it forward. Hodgkinson, who is based in Sydney and is working with the Australian Securities and Investments Commission to build an initial release of a First Derivatives surveillance product, said: “Market participants want a level playing field. Regulators impose rules to achieve this and they tighten them up when people try to push the envelope. So, we need market surveillance to ensure regulation is met.”

Buggy added: “From a regulatory perspective, the global desire is for increased transparency, reliability and timely price discovery. Surveillance supports these goals and it will become increasingly important as the OTC market becomes more regulated. The technology challenges are not small. To achieve a solution that is robust enough to meet regulatory requirements around price discovery, the need is for data architecture that can handle big data, reference data and risk data.”

This convergence of data suggests new applications or platforms to deal with market surveillance. Hodgkinson picked up this point, saying: “The key here is a platform that surveillance can tap into, essentially an exchange data warehouse. This brings together disparate systems and integrates them with matching engines to create a real-time data warehouse. There is one copy of each piece of data and the data can then be used for applications including surveillance, risk and regulatory compliance.”

While exchanges have traditionally used SQL databases, Hodgkinson highlighted the need for in-memory time series databases that allow patterns and trends to be identified using powerful time series technologies. Using these technologies, analytics could be extended to cover asset class surveillance including OTC derivatives, cross-market surveillance, cross-instrument surveillance and anti-money laundering.
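The kind of cross-market pattern detection Hodgkinson describes can be illustrated in miniature. The sketch below, in Python with pandas rather than a specialised in-memory time-series database, uses hypothetical tick data for one instrument on two venues and flags moments when the cross-venue price spread exceeds a tolerance – the sort of query a time-series engine would run continuously at scale.

```python
import pandas as pd

# Hypothetical tick data: timestamp, venue and price for one instrument
# quoted on two venues, "A" and "B".
ticks = pd.DataFrame({
    "timestamp": pd.to_datetime([
        "2013-11-01 10:00:00", "2013-11-01 10:00:00",
        "2013-11-01 10:00:01", "2013-11-01 10:00:01",
        "2013-11-01 10:00:02", "2013-11-01 10:00:02",
    ]),
    "venue": ["A", "B", "A", "B", "A", "B"],
    "price": [100.00, 100.01, 100.02, 100.01, 100.50, 100.02],
})

# Pivot into one price column per venue, then flag timestamps where the
# cross-venue spread exceeds a tolerance (a possible cross-market anomaly).
wide = ticks.pivot(index="timestamp", columns="venue", values="price")
spread = (wide["A"] - wide["B"]).abs()
anomalies = spread[spread > 0.10]
print(anomalies)  # one breach, at 10:00:02, where the spread is 0.48
```

The tolerance, venues and column names here are illustrative assumptions; in practice the thresholds would be calibrated per instrument and the same query pattern extended across asset classes.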

Buggy described a framework for future regulatory support using the disciplines of data management and time stamping. He said: “Our customers want to know what is happening in the market all the time, they want to know about market pricing and orders being put on and taken off the market. They want real-time distribution and to make sure no packets are lost and that they are using the correct reference data. This traverses all we have seen so far and it means alerts need to be built into system architecture and received in real time.”
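Buggy’s requirement that “no packets are lost” is typically enforced by checking the sequence numbers that market data feeds stamp on every packet. A minimal sketch, assuming a simple monotonically increasing sequence number per feed, shows the idea:

```python
# Each market data packet carries a monotonically increasing sequence
# number; a gap means a lost packet and a potential hole in the audit trail.
def find_gaps(seqs):
    """Return (first_missing, next_received) pairs where the feed skipped ahead."""
    gaps = []
    prev = None
    for s in seqs:
        if prev is not None and s != prev + 1:
            gaps.append((prev + 1, s))
        prev = s
    return gaps

print(find_gaps([1, 2, 3, 7, 8, 10]))  # gaps after 3 and after 8
```

Real feeds add per-channel sequencing and recovery/replay protocols, but the gap check itself is this simple, which is why it can run inline on the real-time stream.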

Providing a solution to this issue, Hodgkinson explained: “Behind surveillance is mathematical technology that can improve alerting. It is possible to analyse behaviour over past days and work out patterns in that behaviour. By setting high and low water marks, an alert is only triggered if something goes above or below those marks. The maths is complex, but evolving quickly and it can be used to run real-time alerting models in high frequency trading architectures.”
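The high and low water marks Hodgkinson mentions can be sketched with basic statistics: derive a band from past behaviour and alert only on breaches. The example below is a minimal illustration, assuming a mean-plus-k-standard-deviations band over a hypothetical series of daily order counts; production models are considerably more sophisticated.

```python
import statistics

def water_marks(history, k=3.0):
    """Derive high/low water marks from past behaviour: mean +/- k stdevs."""
    mu = statistics.mean(history)
    sigma = statistics.stdev(history)
    return mu - k * sigma, mu + k * sigma

def check(value, low, high):
    """Trigger an alert only when behaviour breaches the calibrated band."""
    if value > high:
        return "HIGH_ALERT"
    if value < low:
        return "LOW_ALERT"
    return None

history = [100, 101, 99, 100, 102, 98, 100, 101]  # hypothetical daily order counts
low, high = water_marks(history, k=3.0)
print(check(250, low, high))  # far above the band: HIGH_ALERT
print(check(100, low, high))  # within normal behaviour: no alert
```

Recalibrating the band as new observations arrive is what makes the marks usable in a real-time alerting model rather than as a static threshold.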

Touching on pre-trade predictive analysis, Buggy said: “Real-time pre-trade checks are becoming a must have. The challenge in a real-time environment is how to incorporate and manage more and more instrument data. Tools are available to do this and while regulators have not yet mandated real-time pre-trade checks, we think this is coming.”
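A real-time pre-trade check of the kind Buggy describes is essentially a fast validation of each order against reference data before it reaches the market. The sketch below uses hypothetical, made-up limits (a reference price band and a quantity cap) purely to illustrate the shape of such a check:

```python
# Illustrative pre-trade check: reject an order before it reaches the
# market if it breaches a price band or a size limit for the instrument.
# The limits below are hypothetical reference data, not a regulatory standard.
LIMITS = {
    "XYZ": {"ref_price": 100.0, "band_pct": 0.05, "max_qty": 10_000},
}

def pre_trade_check(symbol, price, qty):
    lim = LIMITS.get(symbol)
    if lim is None:
        return "REJECT: unknown instrument"
    band = lim["ref_price"] * lim["band_pct"]
    if abs(price - lim["ref_price"]) > band:
        return "REJECT: price outside band"
    if qty > lim["max_qty"]:
        return "REJECT: quantity limit"
    return "ACCEPT"

print(pre_trade_check("XYZ", 101.0, 500))  # ACCEPT
print(pre_trade_check("XYZ", 120.0, 500))  # REJECT: price outside band
```

The hard part the panel alludes to is not the check itself but keeping the instrument reference data behind it current across thousands of instruments without adding latency.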

Turning to the practicalities of building surveillance systems, Delaney asked the panel members what is required and how practitioners can best implement systems. Hodgkinson noted the data challenges of surveillance, including the need to develop a comprehensive market database as well as real-time and historical alerts that can be analysed and used to generate reports. The database needs to be integrated with external client interfaces and the complete solution needs to provide a platform that can support the volume, variety and velocity of big data.

Acton added: “The velocity and variety of data, including structured and unstructured data, are increasingly important. Nothing must be thrown away because as the data grows, it becomes more valuable and can be put to better use.”

On the detail of a surveillance system, Hodgkinson said: “The system has to be lean, fast and flexible. Alert benchmarks must be defined to identify abnormal behaviour and these need to be calibrated in real time. Alerts also need to be amended in real time so that analysts see only 100 or 200 alerts and false positives are reduced. When this is done, it is possible to overlay risk management or anti-money laundering solutions on the system.” Summing up the discussion, Buggy concluded: “Data architecture is as important as system architecture for market surveillance.”
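Reducing tens of thousands of raw triggers to the 100 or 200 alerts Hodgkinson mentions can be done by collapsing duplicates and ranking by severity. A minimal sketch, with hypothetical alert fields (account, rule, severity), might look like:

```python
# Illustrative alert reducer: collapse repeated triggers per (account, rule)
# pair, keep the highest-severity instance of each, and cap the worklist so
# analysts see a few hundred ranked alerts rather than the raw trigger stream.
def reduce_alerts(raw_alerts, cap=200):
    best = {}
    for a in raw_alerts:  # a = dict with "account", "rule", "severity" keys
        key = (a["account"], a["rule"])
        if key not in best or a["severity"] > best[key]["severity"]:
            best[key] = a
    ranked = sorted(best.values(), key=lambda a: a["severity"], reverse=True)
    return ranked[:cap]

raw = [
    {"account": "A1", "rule": "spoofing", "severity": 3},
    {"account": "A1", "rule": "spoofing", "severity": 7},
    {"account": "B2", "rule": "wash_trade", "severity": 5},
]
print(reduce_alerts(raw, cap=200))  # two alerts, highest severity first
```

The field names and ranking rule are assumptions for illustration; real systems also fold in real-time recalibration of the underlying benchmarks, as the quote describes.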
