About A-Team Marketing Services
The knowledge platform for the financial technology industry

A-Team Insight Blogs

New York Data Management Summit: Proving the Potential of Utility Solutions for Reference Data Management


Reference data utilities can improve data accuracy, completeness and timeliness, reduce financial firms’ data management costs, and support regulatory compliance – but to be successful they need to mutualise rather than outsource processes and offer flexible data consumption.

Presenting a keynote entitled ‘Mutualising Data Governance and Regulatory Compliance Through the Use of a Utility’ at last week’s A-Team Group Data Management Summit in New York, Philippe Chambadal, CEO of SmartStream, discussed the industry need for a solution to reference data management, the potential of utilities, and the practical operation of SmartStream’s reference data utility.

He noted that despite 20 to 30 years of industry efforts to resolve issues around reference data, implementing in-house tools has delivered no significant improvement in data management. He also spoke of the failure to measure data quality and the implications of broken data, citing the sky-high figures of one firm that spends close to a billion dollars a year on data, the vast majority of it on broken trades caused by poor data. More typically, Tier 1 banks spend $40 million to $200 million a year, with every dollar spent on data management matched by three or four dollars spent on broken trades.

On the issue of data accuracy, Chambadal said SmartStream had never come across a client with more than 80% data accuracy, with the range usually being between 60% and 80%. This, he said, is a fundamental problem in the industry and is not being resolved with in-house solutions.

Arguing the case for change, he said: “Over the past few years, as margins have been compressed, heads of operations have been rethinking how to structure their back offices. The notion behind utilities is that there is a lot of middle- and back-office processing that shouldn’t be done in a bank. Reference data is high on the list of data that shouldn’t be processed in house. Fifty thousand firms are trying to clean the closing price of IBM every day – that is not efficient. The only way forward is data mutualisation.”

Setting out the challenges of reference data management, Chambadal noted issues of quality in data sourced from vendors and exchanges, missing data attributes in vendor data feeds designed primarily for front-office applications, and vendor delays in adding new issue information to data feeds. He said: “These problems can be solved by a utility that can see the entire universe of instruments and can manage data such as legal entity identifiers and corporate actions data using a single data model.”

Referring to SmartStream’s reference data utility, he explained: “We try to get data from as close to its source as possible. We include new issues on the day they are made, enrich data and cross-reference every field from every feed we receive. In a single process, we can generate golden copies of data on the fly and tailor output for specific consuming applications.”
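The cross-referencing and on-the-fly golden-copy generation Chambadal describes can be illustrated with a minimal sketch. This is not SmartStream’s actual implementation; the field names, the majority-vote survivorship rule, and the `golden_copy` function are all invented here purely to show the idea of merging several vendor feeds into one record and tailoring the output fields to a consuming application.

```python
# Hypothetical illustration of golden-copy generation: cross-reference the
# same instrument across multiple vendor feeds, keep the value most feeds
# agree on for each field, and emit only the attributes the consuming
# application asks for.
from collections import Counter

def golden_copy(records, fields):
    """Merge vendor records for one instrument into a golden copy.

    records: list of dicts, one per vendor feed, with field names assumed
    already normalised to a common data model.
    fields: the attributes the consuming application needs.
    """
    golden = {}
    for field in fields:
        values = [r[field] for r in records if r.get(field) is not None]
        if values:
            # Majority vote across feeds; ties fall back to the first feed.
            golden[field] = Counter(values).most_common(1)[0][0]
    return golden

feeds = [
    {"isin": "US4592001014", "close": 143.55, "currency": "USD"},
    {"isin": "US4592001014", "close": 143.55, "currency": None},
    {"isin": "US4592001014", "close": 143.50, "currency": "USD"},
]
print(golden_copy(feeds, ["isin", "close", "currency"]))
# {'isin': 'US4592001014', 'close': 143.55, 'currency': 'USD'}
```

In practice a utility’s survivorship rules are far richer than a majority vote (per-field vendor hierarchies, recency, exchange-of-listing preferences), but the shape is the same: many segregated vendor inputs in, one tailored golden record out per consumer.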

Chambadal went on to discuss how a reference data utility can ease the burden and cost of regulation and support the regulatory push towards standardisation, saying: “Mutualisation provides quality data and allows the lifecycle of data to be tracked so that firms can prove to regulators how data has been modified. In terms of standardisation, we are a standards translator as we can map to any XML schema or to regulations such as FATCA and EMIR.”
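The “standards translator” idea — one internal record rendered into whatever output schema a regulation or consumer requires — can be sketched as a mapping-driven transform. The element names, the `to_xml` helper, and the EMIR-like tags below are all illustrative assumptions, not any real reporting schema.

```python
# Illustrative sketch of mapping-driven schema translation: the same
# internal record is rendered to different XML vocabularies by swapping
# the field-to-element mapping, rather than hard-coding each format.
import xml.etree.ElementTree as ET

def to_xml(record, mapping, root_tag):
    """Render an internal record as XML using a field-to-element mapping."""
    root = ET.Element(root_tag)
    for field, tag in mapping.items():
        if field in record:
            ET.SubElement(root, tag).text = str(record[field])
    return ET.tostring(root, encoding="unicode")

record = {"lei": "5493001KJTIIGC8Y1R12", "notional": 1000000}
# Invented, EMIR-flavoured tag names for the sake of the example.
emir_like = {"lei": "CounterpartyId", "notional": "NotionalAmount"}
print(to_xml(record, emir_like, "TradeReport"))
```

Supporting a new regulation then becomes a matter of adding a mapping (and validation rules) rather than building a new pipeline, which is the efficiency argument behind mutualising the translation layer.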

SmartStream has been operating and developing its reference data utility for about five years. The company initially looked across the market for a technology stack to power the utility, but found nothing that provided the scalability it needed. Chambadal explained: “The ability to scale the platform means we can add more clients, which means data quality gets better for everyone. As a processing agent, we do single vendor cleansing and data from different vendors is segregated all the time – we cross-reference only data codes. Everything is rules driven to provide a framework for both internal and regulatory data management requirements, and firms that ingest data from the utility into middle- and back-office applications can improve straight-through-processing rates and save 30% to 40% in data management costs.”

In conclusion, Chambadal commented: “Reference data should have been moved out of firms a long time ago. This could have been done, but we were in a bull market. Now people care and the back-office has to be retooled. The starting point has to be data, otherwise nothing will work downstream, and the benefits of a utility will be better efficiency, risk control and regulatory compliance.”


Related content

WEBINAR

Recorded Webinar: Modernising Data Infrastructures: Challenges and opportunities of managing complex financial data and analytics in hybrid and multi-cloud environments

As they forge ahead with their digital transformation programs, financial institutions are finding that the internal platforms they use to manage complex data sets for trading, investment, risk, and compliance are no longer fit for purpose. The ongoing shift toward cloud hosting is forcing practitioners to manage the transition from deeply entrenched legacy platforms to...

BLOG

Alveo Expands Relationship with FactSet to Offer Data-as-a-Service

Alveo has expanded its relationship with FactSet, which previously added FactSet ESG content to Alveo’s data management platform. The new collaboration combines the companies’ data and data management capabilities to provide customers with solutions that integrate FactSet content into workflows and databases, and is designed to minimise the time needed to onboard new data sets...

EVENT

A-Team Innovation Briefing: Innovation in Cloud

This Innovation Briefing will explore approaches to data infrastructure transformation, the technologies required, and how to ensure processes are optimised to support real-time data management. Hear from leading practitioners and innovative technology solution providers who will share insight into how to set up and leverage your data infrastructure to give users access to consistent data and analytics, and give companies the ability to monetise their data.

GUIDE

Regulatory Data Handbook 2022/2023 – Tenth Edition

Welcome to the tenth edition of A-Team Group’s Regulatory Data Handbook, a publication that has tracked new regulations, amendments, implementation and data management requirements as regulatory change has impacted global capital markets participants over the past 10 years. This edition of the handbook includes new regulations and highlights some of the major regulatory interventions challenging...