About A-Team Marketing Services
The knowledge platform for the financial technology industry

A-Team Insight Blogs

New York Data Management Summit: Proving the Potential of Utility Solutions for Reference Data Management


Reference data utilities can improve data accuracy, completeness and timeliness, reduce financial firms’ data management costs, and support regulatory compliance – but to be successful they need to mutualise rather than outsource processes and offer flexible data consumption.

Presenting a keynote entitled ‘Mutualising Data Governance and Regulatory Compliance Through the Use of a Utility’ at last week’s A-Team Group Data Management Summit in New York, Philippe Chambadal, CEO of SmartStream, discussed the industry need for a solution to reference data management, the potential of utilities, and the practical operation of SmartStream’s reference data utility.

He noted that despite 20 to 30 years of industry effort to resolve reference data issues, in-house tools have delivered no significant improvement in data management. He also spoke of the industry’s failure to measure data quality and the implications of broken data, quoting the sky-high figures of one firm that spends close to a billion dollars a year on data, the vast majority of it on broken trades caused by poor data. More typically, Tier 1 banks spend $40 million to $200 million a year, with every dollar spent on data management matched by three or four dollars spent on broken trades.

On the issue of data accuracy, Chambadal said SmartStream had never come across a client with more than 80% data accuracy, with the range usually being between 60% and 80%. This, he said, is a fundamental problem in the industry and is not being resolved with in-house solutions.

Arguing the case for change, he said: “Over the past few years, as margins have been compressed, heads of operations have been rethinking how to structure their back offices. The notion behind utilities is that there is a lot of middle- and back-office processing that shouldn’t be done in a bank. Reference data is high on the list of data that shouldn’t be processed in house. Fifty thousand firms are trying to clean the closing price of IBM every day – that is not efficient. The only way forward is data mutualisation.”

Setting out the challenges of reference data management, Chambadal noted issues of quality in data sourced from vendors and exchanges, missing data attributes in vendor data feeds designed primarily for front-office applications, and vendor delays in adding new issue information to data feeds. He said: “These problems can be solved by a utility that can see the entire universe of instruments and can manage data such as legal entity identifiers and corporate actions data using a single data model.”

Referring to SmartStream’s reference data utility, he explained: “We try to get data from as close to its source as possible. We include new issues on the day they are made, enrich data and cross-reference every field from every feed we receive. In a single process, we can generate golden copies of data on the fly and tailor output for specific consuming applications.”
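The cross-referencing and golden-copy generation Chambadal describes can be sketched in outline. This is a minimal illustration, not SmartStream’s actual implementation: the vendor names, fields and conflict rule (majority vote across feeds, falling back to the first feed supplying a value) are all hypothetical.

```python
# Hypothetical sketch: build a golden copy of one instrument's reference
# data by cross-referencing several vendor feeds. Vendor names, fields
# and the majority-vote rule are illustrative assumptions only.
from collections import Counter

def golden_copy(records, fields):
    """Merge per-vendor records for one instrument into a golden record.

    records: {vendor_name: {field: value}} -- one dict per feed.
    fields:  the attributes the consuming application needs.
    For each field, keep the value most feeds agree on; ties resolve to
    the first feed that supplied the field (dict insertion order).
    """
    golden = {}
    for field in fields:
        values = [r[field] for r in records.values() if r.get(field) is not None]
        if not values:
            continue  # attribute missing from every feed
        golden[field] = Counter(values).most_common(1)[0][0]
    return golden

feeds = {
    "vendor_a": {"isin": "US4592001014", "close": 145.2, "currency": "USD"},
    "vendor_b": {"isin": "US4592001014", "close": 145.2, "currency": None},
    "vendor_c": {"isin": "US4592001014", "close": 145.3, "currency": "USD"},
}
print(golden_copy(feeds, ["isin", "close", "currency"]))
# {'isin': 'US4592001014', 'close': 145.2, 'currency': 'USD'}
```

Tailoring output for a specific consuming application would then be a matter of selecting and renaming fields from the golden record per consumer.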

Chambadal went on to discuss how a reference data utility can ease the burden and cost of regulation and support the regulatory push towards standardisation, saying: “Mutualisation provides quality data and allows the lifecycle of data to be tracked so that firms can prove to regulators how data has been modified. In terms of standardisation, we are a standards translator as we can map to any XML schema or to regulations such as FATCA and EMIR.”
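The “standards translator” idea, mapping one internal record to a consumer-specified XML layout, can be sketched as follows. The element names and field mapping below are hypothetical illustrations, not the actual FATCA or EMIR reporting schemas.

```python
# Hypothetical sketch of standards translation: render an internal
# reference data record into a consumer-defined XML layout. Tag names
# and the mapping are illustrative, not a real regulatory schema.
import xml.etree.ElementTree as ET

def to_xml(record, mapping, root_tag="Instrument"):
    """mapping: {internal_field: output_xml_tag} for one target schema."""
    root = ET.Element(root_tag)
    for field, tag in mapping.items():
        if field in record:
            ET.SubElement(root, tag).text = str(record[field])
    return ET.tostring(root, encoding="unicode")

record = {"isin": "US4592001014", "close": 145.2}
target_schema = {"isin": "InstrmId", "close": "ClsPric"}  # hypothetical tags
print(to_xml(record, target_schema))
# <Instrument><InstrmId>US4592001014</InstrmId><ClsPric>145.2</ClsPric></Instrument>
```

Supporting a new regulation or consumer then means adding a new mapping rather than re-sourcing or re-cleansing the data.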

SmartStream has been operating and developing its reference data utility for about five years. The company initially looked across the market for a technology stack to power the utility, but found nothing that provided the scalability it needed. Chambadal explained: “The ability to scale the platform means we can add more clients, which means data quality gets better for everyone. As a processing agent, we do single vendor cleansing and data from different vendors is segregated all the time – we cross-reference only data codes. Everything is rules driven to provide a framework for both internal and regulatory data management requirements, and firms that ingest data from the utility into middle- and back-office applications can improve straight-through-processing rates and save 30% to 40% in data management costs.”

In conclusion, Chambadal commented: “Reference data should have been moved out of firms a long time ago. This could have been done, but we were in a bull market. Now people care and the back-office has to be retooled. The starting point has to be data, otherwise nothing will work downstream, and the benefits of a utility will be better efficiency, risk control and regulatory compliance.”

