A-Team Insight Blogs

New York Data Management Summit: Proving the Potential of Utility Solutions for Reference Data Management

Reference data utilities can improve data accuracy, completeness and timeliness, reduce financial firms’ data management costs, and support regulatory compliance – but to be successful they need to mutualise rather than outsource processes and offer flexible data consumption.

Presenting a keynote entitled ‘Mutualising Data Governance and Regulatory Compliance Through the Use of a Utility’ at last week’s A-Team Group Data Management Summit in New York, Philippe Chambadal, CEO of SmartStream, discussed the industry need for a solution to reference data management, the potential of utilities, and the practical operation of SmartStream’s reference data utility.

He noted that despite efforts over the past 20 to 30 years to resolve issues around reference data, implementing in-house tools has delivered no significant improvement in data management. He also spoke of the industry’s failure to measure data quality and of the implications of broken data, citing the sky-high spending of one firm that spends towards a billion dollars a year on data, the vast majority of it on broken trades caused by poor data. More typically, Tier 1 banks spend $40 million to $200 million a year on data management, with every dollar spent matched by three or four dollars spent on broken trades.

On the issue of data accuracy, Chambadal said SmartStream had never come across a client with more than 80% data accuracy, with the range usually being between 60% and 80%. This, he said, is a fundamental problem in the industry and is not being resolved with in-house solutions.

Arguing the case for change, he said: “Over the past few years, as margins have been compressed, heads of operations have been rethinking how to structure their back offices. The notion behind utilities is that there is a lot of middle- and back-office processing that shouldn’t be done in a bank. Reference data is high on the list of data that shouldn’t be processed in house. Fifty thousand firms are trying to clean the closing price of IBM every day – that is not efficient. The only way forward is data mutualisation.”

Setting out the challenges of reference data management, Chambadal noted issues of quality in data sourced from vendors and exchanges, missing data attributes in vendor data feeds designed primarily for front-office applications, and vendor delays in adding new issue information to data feeds. He said: “These problems can be solved by a utility that can see the entire universe of instruments and can manage data such as legal entity identifiers and corporate actions data using a single data model.”
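
The data model itself was not described in the keynote, but the idea of a single model spanning instruments, legal entities and corporate actions can be sketched along the following lines; the classes and field names below are illustrative assumptions, not SmartStream’s schema.

```python
from dataclasses import dataclass, field
from typing import List, Optional

# Illustrative "single data model": one canonical record type links an
# instrument to its issuing legal entity (by LEI) and to the corporate
# actions that affect it. All field names are assumptions for this sketch.

@dataclass
class LegalEntity:
    lei: str                    # ISO 17442 Legal Entity Identifier
    name: str
    country: Optional[str] = None

@dataclass
class CorporateAction:
    action_type: str            # e.g. "dividend", "split", "merger"
    effective_date: str         # ISO 8601 date
    details: dict = field(default_factory=dict)

@dataclass
class InstrumentRecord:
    isin: str                   # primary instrument identifier
    description: str
    issuer: LegalEntity
    corporate_actions: List[CorporateAction] = field(default_factory=list)
    identifiers: dict = field(default_factory=dict)  # e.g. {"CUSIP": "...", "SEDOL": "..."}
```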

Referring to SmartStream’s reference data utility, he explained: “We try to get data from as close to its source as possible. We include new issues on the day they are made, enrich data and cross-reference every field from every feed we receive. In a single process, we can generate golden copies of data on the fly and tailor output for specific consuming applications.”
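
The cross-referencing and golden copy logic was not spelled out, but one common way to picture it is field-level survivorship: each attribute is taken from the most trusted feed that supplies it, and the result is then trimmed to what each consuming application needs. The precedence table, feed names and fields below are assumptions made for this sketch.

```python
# Minimal sketch of on-the-fly golden copy generation: each vendor feed is a
# dict of field -> value for the same instrument, and each field is resolved
# from a precedence list naming which source to trust first.

FIELD_PRECEDENCE = {
    "closing_price": ["exchange_feed", "vendor_a", "vendor_b"],
    "issuer_lei":    ["vendor_b", "vendor_a"],
    "maturity_date": ["vendor_a", "vendor_b"],
}

def build_golden_copy(feeds: dict) -> dict:
    """Resolve each field from the highest-precedence feed that supplies it."""
    golden = {}
    for fld, sources in FIELD_PRECEDENCE.items():
        for src in sources:
            value = feeds.get(src, {}).get(fld)
            if value is not None:
                golden[fld] = value
                break
    return golden

def tailor_output(golden: dict, wanted_fields: list) -> dict:
    """Cut the golden copy down to the fields a consuming application asks for."""
    return {f: golden[f] for f in wanted_fields if f in golden}

feeds = {
    "exchange_feed": {"closing_price": 101.25},
    "vendor_a": {"closing_price": 101.20, "maturity_date": "2030-06-15"},
    "vendor_b": {"issuer_lei": "549300EXAMPLELEI0000"},
}
golden = build_golden_copy(feeds)
print(tailor_output(golden, ["closing_price", "issuer_lei"]))
```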

Chambadal went on to discuss how a reference data utility can ease the burden and cost of regulation and support the regulatory push towards standardisation, saying: “Mutualisation provides quality data and allows the lifecycle of data to be tracked so that firms can prove to regulators how data has been modified. In terms of standardisation, we are a standards translator as we can map to any XML schema or to regulations such as FATCA and EMIR.”
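
How that translation works internally was not covered in the session, but mapping a golden record to “any schema” typically reduces to a configurable mapping from internal fields to each target format. The target field names below are simplified, invented stand-ins, not actual FATCA or EMIR report layouts.

```python
# Illustrative standards translation: one golden record, many target schemas.
# Each mapping pairs a target field name with a function that extracts (and,
# if needed, reshapes) the value from the golden record. Field names on both
# sides are assumptions for this sketch.

GOLDEN_RECORD = {
    "isin": "US4592001014",
    "issuer_lei": "549300EXAMPLELEI0000",
    "notional": 1_000_000,
    "trade_date": "2015-04-10",
}

TARGET_SCHEMAS = {
    "emir_like": {
        "InstrumentId": lambda r: r["isin"],
        "CounterpartyLEI": lambda r: r["issuer_lei"],
        "NotionalAmount": lambda r: f"{r['notional']:.2f}",
    },
    "downstream_xml_like": {
        "Security/ISIN": lambda r: r["isin"],
        "Security/TradeDate": lambda r: r["trade_date"],
    },
}

def translate(record: dict, schema_name: str) -> dict:
    """Produce a target-schema view of the golden record via its field mapping."""
    mapping = TARGET_SCHEMAS[schema_name]
    return {target_field: extract(record) for target_field, extract in mapping.items()}

print(translate(GOLDEN_RECORD, "emir_like"))
```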

SmartStream has been operating and developing its reference data utility for about five years. The company initially looked across the market for a technology stack to power the utility, but found nothing that provided the scalability it needed. Chambadal explained: “The ability to scale the platform means we can add more clients, which means data quality gets better for everyone. As a processing agent, we do single vendor cleansing and data from different vendors is segregated all the time – we cross-reference only data codes. Everything is rules driven to provide a framework for both internal and regulatory data management requirements, and firms that ingest data from the utility into middle- and back-office applications can improve straight-through-processing rates and save 30% to 40% in data management costs.”
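
The segregation point, cleansing each vendor’s data in isolation and comparing only identifier codes across vendors, can be pictured roughly as follows; the rules and fields are assumptions for the sketch rather than the utility’s actual rule set.

```python
# Rough sketch of rules-driven, single-vendor cleansing: each vendor feed is
# validated in isolation against a shared rule set, and the only information
# compared across vendors is the identifier codes used to link records.

RULES = [
    ("isin present",    lambda rec: bool(rec.get("isin"))),
    ("price positive",  lambda rec: rec.get("closing_price", 0) > 0),
    ("lei is 20 chars", lambda rec: len(rec.get("issuer_lei", "")) == 20),
]

def cleanse_single_vendor(vendor: str, records: list) -> list:
    """Apply the rule set to one vendor's records without touching any other vendor's data."""
    clean = []
    for rec in records:
        failures = [name for name, check in RULES if not check(rec)]
        if failures:
            print(f"{vendor}: dropping {rec.get('isin', '?')} ({', '.join(failures)})")
        else:
            clean.append(rec)
    return clean

def cross_reference_codes(per_vendor: dict) -> dict:
    """Cross-reference only identifier codes: map each ISIN to the vendors that carry it."""
    index = {}
    for vendor, records in per_vendor.items():
        for rec in records:
            index.setdefault(rec["isin"], set()).add(vendor)
    return index
```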

In conclusion, Chambadal commented: “Reference data should have been moved out of firms a long time ago. This could have been done, but we were in a bull market. Now people care and the back-office has to be retooled. The starting point has to be data, otherwise nothing will work downstream, and the benefits of a utility will be better efficiency, risk control and regulatory compliance.”
