About a-team Marketing Services
The knowledge platform for the financial technology industry

A-Team Insight Blogs

New York Data Management Summit: Proving the Potential of Utility Solutions for Reference Data Management


Reference data utilities can improve data accuracy, completeness and timeliness, reduce financial firms’ data management costs, and support regulatory compliance – but to be successful they need to mutualise rather than outsource processes and offer flexible data consumption.

Presenting a keynote entitled ‘Mutualising Data Governance and Regulatory Compliance Through the Use of a Utility’ at last week’s A-Team Group Data Management Summit in New York, Philippe Chambadal, CEO of SmartStream, discussed the industry need for a solution to reference data management, the potential of utilities, and the practical operation of SmartStream’s reference data utility.

He noted that despite 20 to 30 years of effort to resolve issues around reference data, the implementation of in-house tools has delivered no significant improvement in data management. He also spoke of the industry’s failure to measure data quality and the implications of broken data, citing the sky-high spending of one firm that puts close to a billion dollars a year into data, the vast majority of it going on broken trades caused by poor data. More typically, Tier 1 banks spend $40 million to $200 million a year, with every dollar spent on data management matched by three or four dollars spent on broken trades.

On the issue of data accuracy, Chambadal said SmartStream had never come across a client with more than 80% data accuracy, with the range usually being between 60% and 80%. This, he said, is a fundamental problem in the industry and is not being resolved with in-house solutions.

Arguing the case for change, he said: “Over the past few years, as margins have been compressed, heads of operations have been rethinking how to structure their back offices. The notion behind utilities is that there is a lot of middle- and back-office processing that shouldn’t be done in a bank. Reference data is high on the list of data that shouldn’t be processed in house. Fifty thousand firms are trying to clean the closing price of IBM every day – that is not efficient. The only way forward is data mutualisation.”

Setting out the challenges of reference data management, Chambadal noted issues of quality in data sourced from vendors and exchanges, missing data attributes in vendor data feeds designed primarily for front-office applications, and vendor delays in adding new issue information to data feeds. He said: “These problems can be solved by a utility that can see the entire universe of instruments and can manage data such as legal entity identifiers and corporate actions data using a single data model.”

Referring to SmartStream’s reference data utility, he explained: “We try to get data from as close to its source as possible. We include new issues on the day they are made, enrich data and cross-reference every field from every feed we receive. In a single process, we can generate golden copies of data on the fly and tailor output for specific consuming applications.”
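The cross-referencing and golden-copy process Chambadal describes can be sketched in miniature. The snippet below is purely illustrative and does not reflect SmartStream’s actual data model or cleansing rules: it builds a ‘golden’ record from several vendor feeds by taking, for each field, the value on which most feeds agree.

```python
from collections import Counter

def golden_copy(records):
    """Build a golden record by majority vote across vendor feeds.

    `records` is a list of dicts, one per vendor feed, keyed by field
    name. Illustrative only: a real utility would apply vendor-precedence
    and validation rules rather than a simple vote.
    """
    fields = set().union(*(r.keys() for r in records))
    golden = {}
    for field in fields:
        values = [r[field] for r in records if r.get(field) is not None]
        if not values:
            continue  # no vendor supplied this attribute
        golden[field] = Counter(values).most_common(1)[0][0]
    return golden

# Three hypothetical vendor feeds for the same instrument; two agree
# on the closing price, one differs, so the majority value wins.
feeds = [
    {"isin": "US4592001014", "currency": "USD", "close": 142.5},
    {"isin": "US4592001014", "currency": "USD", "close": 142.5},
    {"isin": "US4592001014", "currency": "USD", "close": 142.6},
]
print(golden_copy(feeds))
```

In practice the interesting work sits in the rules that decide which source wins per field, which is where a utility seeing the whole instrument universe has an advantage over any single firm.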

Chambadal went on to discuss how a reference data utility can ease the burden and cost of regulation and support the regulatory push towards standardisation, saying: “Mutualisation provides quality data and allows the lifecycle of data to be tracked so that firms can prove to regulators how data has been modified. In terms of standardisation, we are a standards translator as we can map to any XML schema or to regulations such as FATCA and EMIR.”

SmartStream has been operating and developing its reference data utility for about five years. The company initially looked across the market for a technology stack to power the utility, but found nothing that provided the scalability it needed. Chambadal explained: “The ability to scale the platform means we can add more clients, which means data quality gets better for everyone. As a processing agent, we do single vendor cleansing and data from different vendors is segregated all the time – we cross-reference only data codes. Everything is rules driven to provide a framework for both internal and regulatory data management requirements, and firms that ingest data from the utility into middle- and back-office applications can improve straight-through-processing rates and save 30% to 40% in data management costs.”
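The rules-driven tailoring of output for consuming applications mentioned above can be illustrated with a simple projection step. The rules table and field names here are hypothetical, invented for the example, and not drawn from SmartStream’s platform: each downstream system declares which golden-record fields it needs and what it calls them.

```python
# Hypothetical rules table: each consuming application declares the
# golden-record fields it needs and its own names for them.
CONSUMER_RULES = {
    "risk_engine": {"isin": "instrument_id", "close": "eod_price"},
    "settlement": {"isin": "isin", "currency": "ccy"},
}

def tailor(golden, consumer):
    """Project a golden record onto one consumer's field names."""
    rules = CONSUMER_RULES[consumer]
    return {out: golden[src] for src, out in rules.items() if src in golden}

golden = {"isin": "US4592001014", "currency": "USD", "close": 142.5}
print(tailor(golden, "risk_engine"))
print(tailor(golden, "settlement"))
```

Keeping the mapping in declarative rules rather than code is what lets one cleansed dataset feed many middle- and back-office applications without per-client rework.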

In conclusion, Chambadal commented: “Reference data should have been moved out of firms a long time ago. This could have been done, but we were in a bull market. Now people care and the back-office has to be retooled. The starting point has to be data, otherwise nothing will work downstream, and the benefits of a utility will be better efficiency, risk control and regulatory compliance.”

