About a-team Marketing Services
The knowledge platform for the financial technology industry

A-Team Insight Blogs

Data Management – Why is it all so Difficult and Costly?

The financial industry spends $28.5 billion on externally sourced market data and a further $2-3 billion cleaning it up so that it can be used internally. Why is this so difficult and costly, and what can the industry and market participants do about it?

Peter Moss, CEO of the SmartStream Reference Data Utility (RDU), will discuss the problems of market data and some potential solutions during a keynote presentation at A-Team Group’s Data Management Summit in New York City next week on Thursday 19 September.

Moss suggests that in a market where products are defined by data and traded using data mechanisms, the data should be standardised – but this is not the case due to factors such as differences in securities identifiers, a lack of identifiers in some capital markets, data attributes represented differently across data sources, problems with classification, and different identifiers for market participants.

On top of these problems, Moss notes the fluidity of capital markets, in which products are created and dropped on a continuous basis, and the impact on equity data of a constant stream of corporate actions.

He says: “It is very difficult to maintain all the data that is required in a straightforward way across an organisation. The truth is that data standards are not precise enough in an environment where data comes from a huge number of sources. Data vendors pull some of the data together, but it is not possible for a firm to get all required data from one source, which becomes a challenge for the firm.”

As well as considering these challenges during his keynote, Moss will discuss industry initiatives and firms’ efforts to improve matters, and the potential of utilities to short-circuit some of the problems and provide firms with cleansed data.

Related content

WEBINAR

Recorded Webinar: Best Practices for Building High-Performance Data Infrastructures

The requirement for high-performance data systems to support trading analytics for hedge funds, high-frequency trading firms and electronic liquidity providers is well established. But the explosion in Big Data over the past several years has expanded the scope of inputs being used by these firms. At the same time, cloud technologies have added complexity to...

BLOG

FactSet Collaborates with CID to Strengthen AI Capabilities for Clients

FactSet has teamed up with AI software specialist CID to build a joint data lake for the financial services industry. The data lake merges unstructured public web content with FactSet’s structured content sets, along with optional third-party and client content, into vast entity relationship graphs. The aim of the multi-year collaboration is to provide FactSet...

EVENT

Data Management Summit USA Virtual

Now in its 11th year, the Data Management Summit USA Virtual explores the shift to the new world where data is redefining the operating model and firms are seeking to unlock value via data transformation projects for enterprise gain and competitive edge.

GUIDE

ESG Data Handbook 2022

The ESG landscape is changing faster than anyone could have imagined even five years ago. With tens of trillions of dollars expected to have been committed to sustainable assets by the end of the decade, it’s never been more important for financial institutions of all sizes to stay abreast of changes in the ESG data...