
Data Quality on the Brain

The last thing I expected to hear during a panel on low latency technology deployment at this week’s Business & Technology of Low-Latency Trading (catchily dubbed BTLLT by fellow A-Teamers) conference in London was speakers talking about reference data quality. But that’s what happened. Everywhere I go people are talking about data quality.

They might not be on quite the same page, but the back, middle and front office communities are talking about similar issues when it comes to reference data. Legal entity and instrument identification are obvious points of concern, given the regulatory focus on this space driven by a whole host of new requirements, from systemic risk monitoring endeavours to MiFID transaction reporting. During the low latency technology panel, speakers noted the increased visibility of data quality issues and the reputational damage they can inflict on a business. Sybase’s business development manager for EMEA, Stuart Grant, for example, recounted that one of the vendor’s clients almost executed a trade using exchange test data, but managed to catch the issue in time.

“Putting in place data rules and checks around data quality and accuracy is vital to avoid reputational damage and regulatory fines,” he said. TraderServe’s technical director Nick Idelson noted that there are problems with data quality at every client his firm goes in to see. “We have data quality checks in place to see how bad the situation is at the outset,” he explained.
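To make Grant’s near-miss with exchange test data concrete, here is a minimal sketch, in Python, of the kind of pre-trade reference data check he and Idelson describe. The record fields, test data markers and validation rules are illustrative assumptions, not any vendor’s actual implementation.

```python
from dataclasses import dataclass

# Hypothetical reference data record; fields are illustrative assumptions.
@dataclass
class InstrumentRecord:
    isin: str
    description: str
    venue: str

# Assumed conventions for spotting exchange test instruments.
TEST_DATA_MARKERS = ("TEST", "DUMMY")

def quality_issues(record: InstrumentRecord) -> list[str]:
    """Return a list of data quality problems found in a single record."""
    issues = []
    # ISINs are 12 characters beginning with a 2-letter country code.
    if len(record.isin) != 12 or not record.isin[:2].isalpha():
        issues.append(f"malformed ISIN: {record.isin!r}")
    # Flag records that look like test data before they reach trading.
    if any(marker in record.description.upper() for marker in TEST_DATA_MARKERS):
        issues.append(f"possible test instrument: {record.description!r}")
    if not record.venue:
        issues.append("missing venue identifier")
    return issues

# Example: a record that would be caught before execution.
suspect = InstrumentRecord(
    isin="GB00TEST0001",
    description="TEST INSTRUMENT - DO NOT TRADE",
    venue="XLON",
)
for problem in quality_issues(suspect):
    print("REJECT:", problem)
```

In practice such rules would run across the full security master rather than a single record, but the principle is the one Idelson describes: measure how bad the data is at the outset, and reject suspect records before they reach the trading path.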

Given that this was a room full of front office-focused people, the data quality discussions were encouraging. It seems the downstream users of data are becoming much more aware of data quality issues and the importance of some level of standardisation. On this note, earlier in the day at BTLLT, Ernst & Young’s director of regulatory and risk management Anthony Kirby listed the seven points of concern that the MiFID Forum has raised with Kay Swinburne, influential MEP and member of the European Parliament’s Economic and Monetary Affairs Committee, and at least three of them relate directly to reference data standards.

The full list includes requests for more clarity from the regulatory community on: standards (including instrument identification standards); reporting formats and practices (reference data formatting included); client identification (something the industry has been talking a lot about this year); transparency requirements for non-equities; the operation of various venues and broker crossing networks; civil liability issues (related to retail investors in particular); and the question of execution-only limitations.

The MiFID Forum itself has obviously discussed some of these issues at length, with particular reference to transaction reporting, but it is good to see that this is not happening in isolation. The first step towards effecting change is to raise the issue and get people talking about it within their own groups; the next is to foster some sort of industry-wide discussion, and this is where it gets trickier… We’re certainly hoping to kick off some cross-industry discussions about these issues at our upcoming Data Management for Risk, Analytics and Valuations (DMRAV – always with the acronyms) conference in New York on 17 May. Come join us.
