The financial industry spends $28.5 billion on externally sourced market data and a further $2-3 billion cleaning it up so that it can be used internally. Why is this so difficult and costly, and what can the industry and market participants do about it?
Peter Moss, CEO of the SmartStream Reference Data Utility (RDU), will discuss the problems of market data and some potential solutions during a keynote presentation at A-Team Group’s Data Management Summit in New York City next week on Thursday 19 September.
Moss suggests that in a market where products are defined by data and traded using data mechanisms, the data should be standardised – but this is not the case due to factors such as differences in securities identifiers, a lack of identifiers in some capital markets, data attributes represented differently across data sources, problems with classification, and different identifiers for market participants.
On top of these problems, Moss notes the fluidity of capital markets, in which products are created and dropped on a continuous basis, and the impact on equity data of a constant stream of corporate actions.
He says: “It is very difficult to maintain all the data that is required in a straightforward way across an organisation. The truth is that data standards are not precise enough in an environment where data comes from a huge number of sources. Data vendors pull some of the data together, but it is not possible for a firm to get all required data from one source, which becomes a challenge for the firm.”
As well as considering these challenges during his keynote, Moss will discuss industry initiatives and firms’ efforts to improve matters, and the potential of utilities to short-circuit some of the problems and provide firms with cleansed data.