Philippe Chambadal Details Development of the SmartStream Reference Data Utility

After the long-awaited public announcement of the SmartStream Reference Data Utility – or SPReD, the Securities Product Reference Data utility – earlier this week, A-Team caught up with SmartStream CEO Philippe Chambadal to find out more about the utility and its ongoing development.

The utility is owned jointly by SmartStream, Goldman Sachs, JPMorgan Chase and Morgan Stanley. It is, essentially, SmartStream’s existing multi-tenanted data management service dropped into a new subsidiary company called Reference Data Services, although the service will operate under the SmartStream Reference Data Utility brand. It runs one primary processing engine in the US, with replica backups in Europe and at a second US location.

The utility already has more than 20 customers, including banks, hedge funds, exchanges and one of its three bank shareholders. The other shareholders are in the process of onboarding, and Chambadal says SmartStream is talking to another five or six banks that are interested in joining the utility early next year. Most existing customers are based in the US or Europe, but firms in Singapore and Australia are expected to join shortly. While SmartStream has the majority shareholding in the utility, Chambadal says the door remains open to more shareholders and comments: “Banks like being shareholders because they have governance of the utility and its development schedule.”

In terms of data vendors, Chambadal says the utility currently uses about 20 data sources and will announce these in the first week of November as it begins to increase the number towards 200 over the next 18 months. He says: “In general, data vendors are embracing the utility concept and realise it will expand usage of their data. Utility clients still contract with the data vendors and we act as a third-party processing agent.”

The utility initially covers instrument data and charges fees based on the number of data sources and instruments processed for each customer, and on the complexity of those instruments. By the end of the first quarter of 2016, the utility will be ready to consume data sources covering counterparty and issuer data, Know Your Customer (KYC) data and standing settlement instruction (SSI) data. Chambadal says: “The data model of the utility is ready to accept all data types and new sources can be added quickly.”
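
To make the fee drivers concrete, here is a purely illustrative Python sketch of a charge computed from the three factors named above – feed count, instrument count and instrument complexity. Every rate and multiplier below is hypothetical; the utility’s actual pricing formula is not public.

```python
# Purely illustrative fee sketch: every rate, charge and complexity
# multiplier here is hypothetical, not SmartStream's actual pricing.
from dataclasses import dataclass

# Hypothetical complexity weights per asset class.
COMPLEXITY = {"equity": 1.0, "bond": 1.5, "derivative": 2.5}

@dataclass
class Customer:
    num_sources: int             # vendor feeds consumed via the utility
    instruments: dict[str, int]  # asset class -> instrument count

def monthly_fee(c: Customer, per_source: float = 1_000.0,
                per_instrument: float = 0.02) -> float:
    """Flat charge per feed plus a complexity-weighted per-instrument charge."""
    source_charge = c.num_sources * per_source
    instrument_charge = sum(
        count * per_instrument * COMPLEXITY[asset_class]
        for asset_class, count in c.instruments.items()
    )
    return source_charge + instrument_charge

# The small-onboarding case mentioned below: two feeds, 100,000 instruments.
print(monthly_fee(Customer(2, {"equity": 100_000})))  # 4000.0
```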

From a customer perspective, he says it takes a few weeks to onboard a firm with a couple of feeds and 100,000 instruments, and between three and six months to onboard a firm with many sources and millions of instruments. US market data at business close can be processed in 20 to 30 minutes.

Arguing the financial case for the utility and citing SmartStream’s experience as a trade reconciliation vendor, Chambadal says: “Processing vendor data once and distributing it to many customers significantly reduces the cost of data management. But by delivering complete, enriched and cross-referenced datasets, the utility also reduces the cost of trade breaks, 30% to 40% of which are caused by poor reference data, and reduces the impact of poor data on downstream systems.”
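
Chambadal’s “process once, distribute to many” argument rests on building a single cross-referenced golden record per instrument and sending that one enriched copy to every customer, rather than each firm cleaning the same vendor feeds itself. The minimal Python sketch below illustrates the idea; the field names, vendor labels and merge rule (first non-empty value wins) are all hypothetical and are not SmartStream’s actual data model.

```python
# Minimal golden-record sketch: records for the same instrument arriving from
# several vendor feeds are merged into one enriched, cross-referenced record.
# Field names, vendor labels and the merge rule are hypothetical.

def merge_records(vendor_records: list[dict]) -> dict:
    """Fold per-vendor records into one golden record, keeping every vendor ID."""
    golden: dict = {"identifiers": {}}
    for rec in vendor_records:
        # Cross-reference: remember each vendor's own identifier for the instrument.
        golden["identifiers"][rec["vendor"]] = rec["vendor_id"]
        for field, value in rec.items():
            if field not in ("vendor", "vendor_id") and value:
                golden.setdefault(field, value)  # first non-empty value wins
    return golden

feeds = [
    {"vendor": "feed_a", "vendor_id": "A123", "isin": "US0378331005", "name": "Apple Inc"},
    {"vendor": "feed_b", "vendor_id": "B987", "isin": "US0378331005", "currency": "USD"},
]

# One complete, enriched record is produced once and distributed to all
# customers, instead of every firm reconciling the raw feeds independently.
print(merge_records(feeds))
```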
