About a-team Marketing Services
The knowledge platform for the financial technology industry

A-Team Insight Blogs

How to Tackle the Exorbitant Cost of Sourcing and Cleansing Market Data


The financial industry spends $28.5 billion on externally sourced market data and a further $2-3 billion cleaning it up so that it can be used internally. Why is this so difficult and costly, and what can the industry and market participants do about it?

Peter Moss, CEO of the SmartStream Reference Data Utility (RDU), will discuss the problems of market data and some potential solutions during a keynote presentation at A-Team Group’s London Data Management Summit on Thursday 21 March 2019.

Moss suggests that in a market where products are defined by data and traded through data-driven mechanisms, the data should be standardised. In practice it is not, due to factors such as differences in securities identifiers, a lack of identifiers in some capital markets, data attributes represented differently across data sources, and problems with classification. Market participants also carry different identifiers – the LEI, for example, is a global identifier but is mandated only across Europe, so adoption outside Europe has been patchy.
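The identifier and attribute mismatches Moss describes can be made concrete with a small sketch. The records, field names, and classification vocabulary below are entirely hypothetical, invented to illustrate the kind of per-vendor normalisation firms end up writing; they are not taken from any real data feed or from the RDU.

```python
from dataclasses import dataclass
from typing import Optional

# Hypothetical records from two vendor feeds describing the same security.
# Note the different field names, identifier schemes, and classification labels.
vendor_a = {"id_type": "ISIN", "id": "XS0000000000",
            "coupon_pct": "4.25", "asset_class": "Corp Bond"}
vendor_b = {"identifier_scheme": "CUSIP", "identifier": "000000AA0",
            "coupon": 4.25, "classification": "CORPORATE_DEBT"}

@dataclass
class CanonicalSecurity:
    isin: Optional[str]
    cusip: Optional[str]
    coupon: float
    asset_class: str

# Map each vendor's classification vocabulary onto one internal scheme.
CLASS_MAP = {"Corp Bond": "CORPORATE_BOND", "CORPORATE_DEBT": "CORPORATE_BOND"}

def normalise_a(rec: dict) -> CanonicalSecurity:
    return CanonicalSecurity(
        isin=rec["id"] if rec["id_type"] == "ISIN" else None,
        cusip=rec["id"] if rec["id_type"] == "CUSIP" else None,
        coupon=float(rec["coupon_pct"]),  # vendor A ships coupons as strings
        asset_class=CLASS_MAP[rec["asset_class"]],
    )

def normalise_b(rec: dict) -> CanonicalSecurity:
    return CanonicalSecurity(
        isin=rec["identifier"] if rec["identifier_scheme"] == "ISIN" else None,
        cusip=rec["identifier"] if rec["identifier_scheme"] == "CUSIP" else None,
        coupon=float(rec["coupon"]),
        asset_class=CLASS_MAP[rec["classification"]],
    )

a, b = normalise_a(vendor_a), normalise_b(vendor_b)
print(a.asset_class == b.asset_class)  # True: both map to CORPORATE_BOND
```

Even this toy version shows why the cost scales badly: every new vendor feed needs its own mapping code and its own classification crosswalk, and the mappings must be maintained as vendors change their schemas.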

On top of these problems, Moss notes the fluidity of capital markets, in which products are created and dropped on a continuous basis, and the impact on equity data of a constant stream of corporate actions.
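The corporate-actions point can also be illustrated with a minimal, hypothetical sketch: a single stock split forces every historical price for that equity to be restated before time series are comparable, and real feeds deliver a continuous stream of such events.

```python
# Illustrative only: restate a price history for an N-for-1 stock split.
# A real corporate-actions pipeline also handles dividends, mergers,
# symbol changes, and effective dates; this shows just the core arithmetic.
def apply_split(prices: list[float], ratio: float) -> list[float]:
    """Divide pre-split prices by the split ratio so the series is continuous."""
    return [round(p / ratio, 4) for p in prices]

history = [100.0, 102.0, 104.0]   # prices before a 2-for-1 split
adjusted = apply_split(history, 2.0)
print(adjusted)  # [50.0, 51.0, 52.0]
```

The arithmetic is trivial; the cost lies in sourcing, validating, and applying thousands of such events across every affected identifier, every day.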

He says: “It is very difficult to maintain all the data that is required in a straightforward way across an organisation. The truth is that data standards are not precise enough in an environment where data comes from a huge number of sources. Data vendors pull some of the data together, but it is not possible for a firm to get all required data from one source, which becomes a challenge for the firm.”

As well as considering these challenges during his keynote, Moss will discuss industry initiatives and firms' efforts to improve matters, and the potential of utilities to short-circuit some of the problems and provide firms with cleansed data.


Related content

WEBINAR

Recorded Webinar: Streamlining trading and investment processes with data standards and identifiers

Financial institutions are integrating not only greater volumes of data for use across their organisation but also more varieties of data. As well, that data is being applied to more use cases than ever before, especially regulatory compliance and ESG integration. Due to this increased complexity of institutions’ data needs, however, information often arrives into...

BLOG

Rethinking Data Management in Financial Services: Virtualisation Over Static Storage

By Thomas McHugh, Co-Founder and Chief Executive, FINBOURNE Technology. In Financial Services (FS), data management has long been centred around traditional database storage. However, this approach is fundamentally misaligned with the nature of FS data, which is process-driven rather than static. The industry needs a shift in perspective – one that prioritises virtualisation over rigid...

EVENT

Buy AND Build: The Future of Capital Markets Technology

Buy AND Build: The Future of Capital Markets Technology London examines the latest changes and innovations in trading technology and explores how technology is being deployed to create an edge in sell-side and buy-side capital markets institutions.

GUIDE

AI in Capital Markets: Practical Insight for a Transforming Industry – Free Handbook

AI is no longer on the horizon – it’s embedded in the infrastructure of modern capital markets. But separating real impact from inflated promises requires a grounded, practical understanding. The AI in Capital Markets Handbook 2025 provides exactly that. Designed for data-driven professionals across the trade life-cycle, compliance, infrastructure, and strategy, this handbook goes beyond...