The leading knowledge platform for the financial technology industry

A-Team Insight Blogs

How to Tackle the Exorbitant Cost of Sourcing and Cleansing Market Data

The financial industry spends $28.5 billion on externally sourced market data and a further $2-3 billion cleaning it up so that it can be used internally. Why is this so difficult and costly, and what can the industry and market participants do about it?

Peter Moss, CEO of the SmartStream Reference Data Utility (RDU), will discuss the problems of market data and some potential solutions during a keynote presentation at A-Team Group’s London Data Management Summit on Thursday 21 March 2019.

Moss suggests that in a market where products are defined by data and traded using data mechanisms, the data should be standardised. This is not the case, however, due to factors such as differences in securities identifiers, a lack of identifiers in some capital markets, data attributes represented differently across data sources, problems with classification, and different identifiers for market participants. The LEI, by way of example, is a global identifier that is mandated only across Europe, so adoption elsewhere has been patchy.

On top of these problems, Moss notes the fluidity of capital markets, in which products are created and dropped on a continuous basis, and the impact on equity data of a constant stream of corporate actions.

He says: “It is very difficult to maintain all the data that is required in a straightforward way across an organisation. The truth is that data standards are not precise enough in an environment where data comes from a huge number of sources. Data vendors pull some of the data together, but it is not possible for a firm to get all required data from one source, which becomes a challenge for the firm.”

As well as considering these challenges during his keynote, Moss will discuss industry initiatives and firms' efforts to improve matters, and the potential of utilities to short-circuit some of the problems and provide firms with cleansed data.
