About A-Team Marketing Services
The knowledge platform for the financial technology industry

A-Team Insight Blogs

Data Quality on the Brain


The last thing I expected to hear during a panel on low latency technology deployment at this week’s Business & Technology of Low-Latency Trading (catchily dubbed BTLLT by fellow A-Teamers) conference in London was speakers talking about reference data quality. But that’s what happened. Everywhere I go people are talking about data quality.

They might not be on quite the same page, but the back, middle and front office communities are talking about similar issues when it comes to reference data. Legal entity and instrument identification are obvious points of concern, given the regulatory focus on this space as a result of a host of new requirements, from systemic risk monitoring endeavours to MiFID transaction reporting. During the low latency technology panel, speakers noted the increased visibility of data quality issues and the reputational damage they can inflict on a business. Sybase’s business development manager for EMEA, Stuart Grant, for example, indicated that one of the vendor’s clients almost executed a trade using exchange test data, but caught the issue in time.

“Putting in place data rules and checks around data quality and accuracy is vital to avoid reputational damage and regulatory fines,” he said. TraderServe’s technical director Nick Idelson noted that there are “problems with data quality” at every client his firm goes in to see. “We have data quality checks in place to see how bad the situation is at the outset,” he explained.
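To make the panel’s point concrete, here is a minimal illustrative sketch of the kind of reference data checks being described. This is not any vendor’s actual implementation: the test-symbol list uses a couple of commonly cited NASDAQ test symbols as example values, and the ISIN validation applies the standard ISO 6166 check digit (Luhn) algorithm to instrument identifiers.

```python
# Illustrative reference data quality checks (a sketch, not a vendor product).

# Example exchange test symbols -- in practice a firm would maintain a fuller,
# venue-specific list. ZVZZT/ZXZZT are commonly cited NASDAQ test symbols.
KNOWN_TEST_SYMBOLS = {"ZVZZT", "ZXZZT"}


def is_valid_isin(isin: str) -> bool:
    """Validate an ISIN: 12 alphanumeric chars, letter country prefix,
    and a correct ISO 6166 (Luhn) check digit."""
    if len(isin) != 12 or not isin[:2].isalpha() or not isin.isalnum():
        return False
    # Expand letters to two-digit values (A=10 ... Z=35); digits unchanged.
    expanded = "".join(str(int(c, 36)) for c in isin.upper())
    # Luhn check: double every second digit from the right, subtract 9 if > 9.
    total = 0
    for i, c in enumerate(reversed(expanded)):
        d = int(c)
        if i % 2 == 1:
            d *= 2
            if d > 9:
                d -= 9
        total += d
    return total % 10 == 0


def looks_like_test_data(symbol: str) -> bool:
    """Flag instruments matching known exchange test symbols -- the kind of
    check that would have caught the test-data trade described above."""
    return symbol.upper() in KNOWN_TEST_SYMBOLS
```

Run against a real identifier such as Apple’s ISIN, `is_valid_isin("US0378331005")` returns `True`, while a single transposed or altered digit fails the check, which is exactly why check-digit validation is a cheap first line of defence against corrupted instrument reference data.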

Given that this was a room full of front office-focused people, the data quality discussions were encouraging. It seems the downstream users of data are becoming much more aware of data quality issues and the importance of some level of standardisation. On this note, earlier in the day at BTLLT, Ernst & Young’s director of regulatory and risk management Anthony Kirby listed the seven points of concern that the MiFID Forum has raised with Kay Swinburne, influential MEP and member of the European Parliament’s Committee on Economic and Monetary Affairs, and at least three of them directly relate to reference data standards.

The full list includes requests for more clarity from the regulatory community on: standards (including instrument identification standards); reporting formats and practices (reference data formatting included); client identification (something the industry has been talking a lot about this year); transparency requirements for non-equities; the operation of various venues and broker crossing networks; civil liability issues (related to retail investors in particular); and the question of execution-only limitations.

The MiFID Forum itself has obviously discussed some of these issues at length with particular reference to transaction reporting, but it is good to see that this is not happening in isolation. The first step towards effecting a change is to raise the issue and get people talking about it within their own groups, the next is to foster some sort of industry-wide discussion, and this is where it gets trickier… We’re certainly hoping to kick off some cross-industry discussions about these issues at our upcoming Data Management for Risk, Analytics and Valuations (DMRAV – always with the acronyms) conference in New York on 17 May. Come join us.

