

Data Management – Why is it all so Difficult and Costly?


The financial industry spends $28.5 billion on externally sourced market data and a further $2-3 billion cleaning it up so that it can be used internally. Why is this so difficult and costly, and what can the industry and market participants do about it?

Peter Moss, CEO of the SmartStream Reference Data Utility (RDU), will discuss the problems of market data and some potential solutions during a keynote presentation at A-Team Group’s Data Management Summit in New York City next week on Thursday 19 September.

Moss suggests that in a market where products are defined by data and traded using data mechanisms, the data should be standardised – but this is not the case due to factors such as differences in securities identifiers, a lack of identifiers in some capital markets, data attributes represented differently across data sources, problems with classification, and different identifiers for market participants.
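To make the standardisation gap concrete, here is a minimal sketch in Python, using entirely hypothetical vendor records and field names rather than any real feed, of how the same instrument can arrive under different identifier schemes and attribute conventions, and one way a firm might normalise it into a single internal representation.

```python
# Hypothetical example: the same instrument delivered by two vendors with
# different identifiers, field names and formats (illustrative only).
from dataclasses import dataclass
from datetime import date, datetime

vendor_a_record = {
    "isin": "US0378331005",
    "asset_class": "EQ",
    "curr": "USD",
    "maturity": None,
}

vendor_b_record = {
    "cusip": "037833100",            # same instrument, different identifier scheme
    "instrumentType": "Common Stock",
    "currency": "usd",
    "maturityDate": "",
}

@dataclass
class Instrument:
    """A firm's internal, normalised view of an instrument."""
    isin: str | None
    cusip: str | None
    asset_class: str
    currency: str
    maturity: date | None

def normalise_vendor_a(rec: dict) -> Instrument:
    return Instrument(
        isin=rec.get("isin"),
        cusip=None,
        asset_class={"EQ": "Equity", "FI": "Fixed Income"}.get(rec["asset_class"], "Unknown"),
        currency=rec["curr"].upper(),
        maturity=rec.get("maturity"),
    )

def normalise_vendor_b(rec: dict) -> Instrument:
    return Instrument(
        isin=None,
        cusip=rec.get("cusip"),
        asset_class="Equity" if rec["instrumentType"] == "Common Stock" else "Unknown",
        currency=rec["currency"].upper(),
        maturity=datetime.strptime(rec["maturityDate"], "%Y-%m-%d").date()
        if rec.get("maturityDate") else None,
    )

print(normalise_vendor_a(vendor_a_record))
print(normalise_vendor_b(vendor_b_record))
```

Even in this toy case, every field needs its own mapping rule, which is the work that scales so poorly when multiplied across thousands of attributes and dozens of sources.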

On top of these problems, Moss notes the fluidity of capital markets, in which products are created and dropped on a continuous basis, and the impact on equity data of a constant stream of corporate actions.
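As a small illustration of why corporate actions keep equity data in constant flux, the sketch below (made-up numbers, deliberately simplified logic) applies a 2-for-1 stock split to a position and its price history so that pre-split prices remain comparable.

```python
# Illustrative only: applying a 2-for-1 stock split to a position and
# to a short price history so that pre-split prices stay comparable.
def apply_split(shares: float, price_history: list[float], ratio: float) -> tuple[float, list[float]]:
    """Return the adjusted share count and split-adjusted historical prices."""
    adjusted_shares = shares * ratio
    adjusted_prices = [p / ratio for p in price_history]
    return adjusted_shares, adjusted_prices

shares, history = apply_split(shares=100, price_history=[210.0, 214.5, 220.0], ratio=2.0)
print(shares)    # 200.0 shares after the 2-for-1 split
print(history)   # [105.0, 107.25, 110.0] split-adjusted prices
```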

He says: “It is very difficult to maintain all the data that is required in a straightforward way across an organisation. The truth is that data standards are not precise enough in an environment where data comes from a huge number of sources. Data vendors pull some of the data together, but it is not possible for a firm to get all required data from one source, which becomes a challenge for the firm.”

As well as considering these challenges during his keynote, Moss will discuss industry initiatives and firms’ efforts to improve matters, and the potential of utilities to short-circuit some of these problems and provide firms with cleansed data.
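The consolidation idea behind such utilities can be illustrated with a hedged sketch: the precedence rules, source names and fields below are invented for the example, but they show how partial records from several sources might be merged into a single cleansed "golden record" when no one source carries everything.

```python
# Hypothetical sketch of a "golden record": merging partial records from
# several sources using simple per-field source-precedence rules.
from typing import Any

# Which source to trust first for each attribute (illustrative choice only).
PRECEDENCE = {
    "isin":     ["exchange_feed", "vendor_a", "vendor_b"],
    "currency": ["vendor_a", "vendor_b", "exchange_feed"],
    "sector":   ["vendor_b", "vendor_a"],
}

def build_golden_record(records_by_source: dict[str, dict[str, Any]]) -> dict[str, Any]:
    golden: dict[str, Any] = {}
    for field, sources in PRECEDENCE.items():
        for source in sources:
            value = records_by_source.get(source, {}).get(field)
            if value not in (None, ""):
                golden[field] = value
                break
    return golden

sources = {
    "vendor_a":      {"isin": "US0378331005", "currency": "USD", "sector": ""},
    "vendor_b":      {"isin": "",             "currency": "USD", "sector": "Technology"},
    "exchange_feed": {"isin": "US0378331005", "currency": ""},
}
print(build_golden_record(sources))
# {'isin': 'US0378331005', 'currency': 'USD', 'sector': 'Technology'}
```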


Related content

WEBINAR

Recorded Webinar: In data we trust – How to ensure high quality data to power AI

Artificial intelligence is increasingly powering financial institutions’ processes and workflows, encompassing all parts of the enterprise from the front office to the back office. As organisations seek to gain a competitive edge, they are trialling the technology in a variety of ways to streamline and empower multiple use cases. Some are further than others along the path to achieving...

BLOG

Data Quality Posing Obstacles to AI Adoption and Other Processes, say Reports

Firms’ rush to build artificial intelligence applications has hit a wall of poor-quality data and data complexity that is hindering them from taking advantage of the technology. Those barriers are also preventing firms from upgrading other parts of their tech stacks. A slew of surveys and comments by researchers and vendors paint a picture of...

EVENT

RegTech Summit New York

Now in its 9th year, the RegTech Summit in New York will bring together the RegTech ecosystem to explore how the North American capital markets industry can leverage technology to drive innovation, cut costs and support regulatory change.

GUIDE

AI in Capital Markets: Practical Insight for a Transforming Industry – Free Handbook

AI is no longer on the horizon – it’s embedded in the infrastructure of modern capital markets. But separating real impact from inflated promises requires a grounded, practical understanding. The AI in Capital Markets Handbook 2025 provides exactly that. Designed for data-driven professionals across the trade life-cycle, compliance, infrastructure, and strategy, this handbook goes beyond...