The knowledge platform for the financial technology industry

A-Team Insight Blogs

Navigating the Evolving Landscape of Trade Surveillance: Key Insights and Best Practices


With regulators sharpening their focus and trading desks pushing ever more complex strategies, a robust trade surveillance capability has become non-negotiable for today’s financial institutions. From spoofing to layering, new market-abuse schemes demand surveillance platforms that blend real-time analytics, machine learning and human insight.

A recent webinar, hosted by A-Team Group, brought together a diverse panel of experts from the regulatory, vendor, and practitioner communities, including representatives from co-sponsors NICE Actimize and OneTick. The discussion illuminated the persistent pain points and emerging best practices in managing market abuse risk, offering valuable insights for senior practitioners navigating this dynamic field.

Surveillance Remains in Focus

Despite discussions surrounding deregulation, particularly emanating from the US, the consensus among panellists was clear: regulatory focus on market abuse and investor protection remains unwavering. As one practitioner observed, firms are “not switching off [their] surveillance and monitoring that [they] already have in place”. Instead, the prevailing approach is towards “smarter compliance”.

A former regulator echoed this, noting that while there might be a push-pull between “regulation by enforcement” and collaborative industry engagement, the “rules need to be followed and there will be consequences for firms that have significant issues”. This necessitates a “robust, living risk assessment framework” and strong internal communication across legal, compliance, business, and technology functions.

The sentiment is that surveillance is “not going anywhere”; indeed, regulators continue to prioritise “going after fraud and investor protection,” which directly translates to addressing market abuse through robust surveillance.

False Positives vs False Negatives

An audience poll during the webinar highlighted that “managing alert volumes of false positives” remains the primary pain point consuming the most resources at firms. While acknowledging the operational burden, a vendor expert cautioned that an excessive focus on false positives can “distract a little bit from what could ultimately be more important” – namely, understanding false negatives (missed true positives), ensuring the quality of true positives, and gaining a holistic view of cross-product and cross-market risks. This underscores a vital distinction: reducing noise is important, but not at the expense of missing genuine misconduct. The discussion also raised the nuanced point that if the poll had instead asked about the “biggest regulatory problem”, data completeness and provenance might have ranked higher than alert volumes, reflecting a deeper, underlying systemic concern.

Adaptive Models and Data Quality

The panel delved into handling alert surges during volatile trading periods. While practical responses involve checking with the business and reviewing alerts retrospectively using “Mark one eyeball” and business knowledge, a more fundamental issue was raised. An explosion of alerts on busy days “suggests something wrong with the way those alerts are designed and configured”. Models should be calibrated to adapt to systematic market changes, such as increases in volume or volatility, by accounting for risk and reward dynamics. For instance, certain abuses like spoofing may naturally decrease in volatile markets due to higher risk for the perpetrator.
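The calibration point above can be illustrated with a toy sketch: rather than a fixed absolute threshold, the alert threshold scales with a rolling estimate of recent volatility, so a systematic rise in market activity does not mechanically inflate alert counts. The function names and the 20-period window are illustrative assumptions, not a description of any panellist’s production system.

```python
from statistics import pstdev

def adaptive_threshold(returns, base_threshold=3.0, window=20):
    """Scale a fixed z-style threshold by recent realised volatility.

    Illustrative sketch: on calm days the effective threshold tightens,
    on volatile days it widens, so alert volume tracks abnormality
    rather than raw market activity.
    """
    recent = returns[-window:]
    vol = pstdev(recent) if len(recent) > 1 else 0.0
    return base_threshold * max(vol, 1e-9)  # floor avoids a zero threshold

def is_alert(price_move, returns):
    """Flag a move only if it is large relative to the current regime."""
    return abs(price_move) > adaptive_threshold(returns)
```

A 5% move would alert against a quiet tape but not against one where 2% swings are routine, which is the “risk and reward dynamics” intuition the panel described.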

This leads directly to the critical role of data quality. Best practices dictate that surveillance systems should “apply certain data quality checks to all input data”, flagging issues similarly to trade practice alerts, albeit with appropriate severity. Crucially, “data quality analysis has to be upstream of trade practice alert generation” to prevent “garbage in, garbage out” and, more importantly, to address missed alerts due to poor data quality.
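The “upstream” principle can be sketched as a pipeline stage: records failing basic completeness checks are routed to a data-quality queue before any trade-practice logic runs, so detection gaps become visible rather than silent. The field names and checks here are hypothetical illustrations, not a vendor specification.

```python
REQUIRED_FIELDS = ("order_id", "timestamp", "price", "quantity")

def check_record(record: dict) -> list[str]:
    """Return a list of data-quality issues for one order record."""
    issues = [f"missing:{f}" for f in REQUIRED_FIELDS if record.get(f) is None]
    if record.get("price") is not None and record["price"] <= 0:
        issues.append("non_positive_price")
    return issues

def partition(records):
    """Split records into clean ones (eligible for alert generation) and
    flagged ones (routed to a data-quality queue). DQ runs first, so a
    surge of flagged records signals a detection gap, not just noise."""
    clean, flagged = [], []
    for r in records:
        issues = check_record(r)
        (flagged if issues else clean).append((r, issues))
    return clean, flagged
```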

A practitioner shared their experience with an in-house solution, where market data processing bottlenecks during high volatility could “prevent certain order and trade-based models… from being generated,” which compliance teams would identify by an “adjusted reduction perhaps in the alert trading volumes”. This highlights that data quality issues can directly lead to failures in detection, reinforcing the need for proactive data governance.

Advanced Analytics Beyond the Numbers

The adoption of AI and machine learning (ML) in trade surveillance is still largely at an early stage, with most firms either remaining rules-based or conducting small-scale pilots. When assessing the performance of advanced models, the conversation surfaced different perspectives. From a vendor viewpoint, performance is linked to increased processing demand (an indication the model is working) and the surveillance team’s “productivity”, measured by their ability to clear the alert queue and by “alert aging”. Ultimately, for vendors, the true measure is “customer satisfaction and whether the customer is spending less time or wasting less time”.

However, from a former regulator’s perspective, drawing on FINRA’s transition to deep learning models, more quantitative metrics are vital. Key amongst these are “precision” (how many flagged alerts were actually suspicious, indicating efficiency) and “recall” (how many suspicious activities were caught by the model, indicating effectiveness). High recall, for instance, means a “lower risk of missing something that was actually misconduct”. This balance between precision and recall, requiring close collaboration with data scientists, is seen as an “ongoing journey”.
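The two metrics the former regulator cites have standard definitions, which a minimal sketch makes concrete (the example counts are illustrative, not FINRA figures):

```python
def precision_recall(true_positives, false_positives, false_negatives):
    """Precision: of the alerts raised, how many were genuinely suspicious.
    Recall: of the genuinely suspicious activity, how much was caught.
    High recall means a lower risk of missing actual misconduct."""
    precision = true_positives / (true_positives + false_positives)
    recall = true_positives / (true_positives + false_negatives)
    return precision, recall

# e.g. 400 alerts raised, 40 genuinely suspicious, 10 real cases missed:
p, r = precision_recall(true_positives=40, false_positives=360, false_negatives=10)
```

Here precision is 0.10 (a noisy queue) while recall is 0.80 (most misconduct caught); tightening thresholds to raise precision typically lowers recall, which is why the panel framed the trade-off as an “ongoing journey” with data scientists.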

Holistic Surveillance

The concept of “holistic surveillance,” correlating voice, e-comms, and trade data, was explored. While full integration is an ambitious goal, internal communication and information sharing between different surveillance groups are paramount.

As one panellist quipped, firms “can’t work like that” in a rigidly siloed fashion where an issue identified in one area is not communicated to another, even if high-level information exchange respects privacy concerns. Conversely, a practitioner argued that if firms run “siloed trade and… comms” surveillance “and it works for your organization, and you can support that through your MI and your KPIs and your regulatory engagement… I see no reason to push towards holistic”. This suggests that effective risk management can also be achieved through a combination of siloed and “tactical monitoring” approaches, which also serves to “upskill your compliance team” in areas like data manipulation.

Regarding cross-product manipulation, panellists agreed it “should be an expectation” from surveillance vendors, especially as such abusive behaviours are “observed daily” across various asset classes. However, this expectation is “tempered” by practical limitations. Attempting to survey the “entire universe of all financial products” for correlations is impractical given the astronomical number of potential relationships (e.g., “10 to the power of 13 correlations”). Therefore, the focus is shifting from aspirational wholesale integration to addressing areas with the “highest risk of cross-product manipulation and strong correlation potential”, where the individual has actively manipulated multiple instruments.
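The “10 to the power of 13” figure is consistent with simple pair counting: the number of unordered instrument pairs grows as n(n−1)/2, so a universe of a few million instruments yields tens of trillions of candidate correlations. A quick check (the five-million universe size is an illustrative assumption):

```python
from math import comb

n_instruments = 5_000_000  # assumed order of magnitude for the listed universe
pairs = comb(n_instruments, 2)  # unordered pairs: n * (n - 1) / 2
print(f"{pairs:.2e} candidate pairwise correlations")
```

The quadratic growth is why the panel favours targeting pairs with known economic linkage (e.g. a derivative and its underlying) over exhaustive cross-universe monitoring.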

The Unifying Thread: Data Quality

Across all discussions, a single, dominant theme emerged and was unanimously reaffirmed in the concluding remarks: “data quality, data governance and the framework around data governance still presides very high”. It remains the “lifeblood of surveillance” and a significant challenge yet to be fully solved for the industry. Key recommendations included treating data quality issues with the same seriousness as compliance issues, ensuring models are transparent and self-calibrating, actively engaging with model risk management teams, and maintaining robust internal and external communication across all stakeholders. These elements are crucial for enabling firms to mature their surveillance capabilities and effectively leverage emerging technologies in the complex capital markets landscape.

