About A-Team Marketing Services
The knowledge platform for the financial technology industry

A-Team Insight Blogs

Philippe Chambadal Details Development of the SmartStream Reference Data Utility

After the long-awaited public announcement of the SmartStream Reference Data Utility – or SPReD, the Securities Product Reference Data utility – earlier this week, A-Team caught up with SmartStream CEO Philippe Chambadal to find out more about the utility and its ongoing development.

The utility is owned jointly by SmartStream, Goldman Sachs, JPMorgan Chase and Morgan Stanley. It is, essentially, SmartStream’s existing multi-tenanted data management service dropped into a new subsidiary company called Reference Data Services, although the utility will be operated under the brand SmartStream Reference Data Utility. It has one prime processing engine in the US and replica back-ups in Europe and at another location in the US.

The utility already has over 20 customers including banks, hedge funds, exchanges and one of the three banks that have a shareholding in the utility. The other shareholders are in the process of onboarding and Chambadal says SmartStream is talking to another five or six banks that are interested in joining the utility early next year. Most existing customers are based in the US or Europe, but firms in Singapore and Australia are expected to join up shortly. While SmartStream has the majority shareholding in the utility, Chambadal says the door remains open for more shareholders and comments: “Banks like being shareholders because they have governance of the utility and its development schedule.”

In terms of data vendors, Chambadal says the utility currently uses about 20 data sources and will announce these in the first week of November as it begins to increase the number towards 200 over the next 18 months. He says: “In general, data vendors are embracing the utility concept and realise it will expand usage of their data. Utility clients still contract with the data vendors and we act as a third-party processing agent.”

The utility initially covers instrument data and charges fees based on the number of data sources and instruments, and the complexity of the instruments, that are processed for each customer. By the end of the first quarter of 2016, the utility will be ready to consume data sources covering counterparty and issuer data, Know Your Customer data and SSI data. Chambadal says: “The data model of the utility is ready to accept all data types and new sources can be added quickly.”

From a customer perspective, he says it takes a few weeks to onboard a firm with a couple of feeds and 100,000 instruments, and between three and six months to onboard a firm with many sources and millions of instruments. US market data at business close can be processed in 20 to 30 minutes.

Arguing the financial case for the utility and citing SmartStream’s experience as a trade reconciliation vendor, Chambadal says: “Processing vendor data once and distributing it to many customers significantly reduces the cost of data management. But by delivering complete, enriched and cross-referenced datasets, the utility also reduces the cost of trade breaks, 30% to 40% of which are caused by poor reference data, and reduces the impact of poor data on downstream systems.”
