
Data Quality on the Brain

The last thing I expected to hear during a panel on low latency technology deployment at this week’s Business & Technology of Low-Latency Trading (catchily dubbed BTLLT by fellow A-Teamers) conference in London was speakers talking about reference data quality. But that’s what happened. Everywhere I go people are talking about data quality.

They might not be on quite the same page, but the back, middle and front office communities are talking about similar issues when it comes to reference data. Legal entity and instrument identification are obvious points of concern, given the regulatory focus on this space driven by a whole host of new requirements, from systemic risk monitoring endeavours to MiFID transaction reporting. During the low latency technology panel, speakers noted the increased visibility of data quality issues and the reputational damage they can inflict on a business. Stuart Grant, Sybase’s business development manager for EMEA, for example, indicated that one of the vendor’s clients almost executed a trade using exchange test data, but managed to catch the issue in time.

“Putting in place data rules and checks around data quality and accuracy is vital to avoid reputational damage and regulatory fines,” he said. TraderServe’s technical director Nick Idelson noted that his firm finds “problems with data quality” at every client it goes in to see. “We have data quality checks in place to see how bad the situation is at the outset,” he explained.
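For readers who like to see this in concrete terms, the kind of pre-trade check the panellists describe can be as simple as validating an instrument’s reference data record before an order is released. The sketch below is purely illustrative and is not drawn from any vendor’s product; the field names (symbol, ISIN, a test-instrument flag) and the staleness threshold are assumptions made for the sake of the example.

from dataclasses import dataclass
from datetime import datetime, timedelta, timezone
from typing import List, Optional

@dataclass
class InstrumentRecord:
    # Hypothetical reference data record; field names are illustrative only.
    symbol: str
    isin: Optional[str]
    venue: str
    last_updated: datetime
    is_test: bool  # flag a venue or data vendor might set on test instruments

def pre_trade_checks(rec: InstrumentRecord,
                     max_age: timedelta = timedelta(days=1)) -> List[str]:
    # Return a list of data quality issues; an empty list means the record passes.
    issues = []
    if rec.is_test or rec.symbol.upper().startswith("TEST"):
        issues.append("instrument looks like exchange test data")
    if not rec.isin or len(rec.isin) != 12:
        issues.append("missing or malformed ISIN")
    if datetime.now(timezone.utc) - rec.last_updated > max_age:
        issues.append("reference data is stale")
    return issues

# Block the order rather than trade on questionable data.
record = InstrumentRecord("TESTXYZ", None, "XLON",
                          datetime.now(timezone.utc) - timedelta(days=3), is_test=True)
problems = pre_trade_checks(record)
if problems:
    print("Order blocked:", "; ".join(problems))

Real implementations are of course far richer, sitting alongside vendor cross-checks and exception workflows, but the principle is the same: validate the reference data before an order goes anywhere near the market.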

Given that this was a room full of front office-focused people, the data quality discussions were encouraging. It seems the downstream users of data are becoming much more aware of data quality issues and the importance of some level of standardisation. On this note, earlier in the day at BTLLT, Ernst & Young’s director of regulatory and risk management Anthony Kirby listed the seven points of concern that the MiFID Forum has raised with Kay Swinburne, the influential MEP and member of the European Parliament’s Economic and Monetary Affairs Committee, and at least three of them relate directly to reference data standards.

The full list includes requests for more clarity from the regulatory community on: standards (including instrument identification standards); reporting formats and practices (reference data formatting included); client identification (something the industry has been talking a lot about this year); transparency requirements for non-equities; the operation of various venues and broker crossing networks; civil liability issues (related to retail investors in particular); and the question of execution-only limitations.

The MiFID Forum itself has obviously discussed some of these issues at length, with particular reference to transaction reporting, but it is good to see that this is not happening in isolation. The first step towards effecting change is to raise the issue and get people talking about it within their own groups; the next is to foster some sort of industry-wide discussion, and this is where it gets trickier… We’re certainly hoping to kick off some cross-industry discussions about these issues at our upcoming Data Management for Risk, Analytics and Valuations (DMRAV – always with the acronyms) conference in New York on 17 May. Come join us.
