The knowledge platform for the financial technology industry

A-Team Insight Blogs

Data Management Summit: Arguing the Case for Reference Data Utilities


Reference data utilities offer financial firms cost-effective and flexible data management, improved data quality and the potential to standardise enterprise data. Arguing the case for data utilities as the only way forward for data management, Adam Cottingham, vice president of data management services at SmartStream, detailed their benefits in a keynote presentation at last week’s A-Team Group Data Management Summit.

Setting out the need for a new approach to data management, Cottingham described the large and complex nature of today’s data environments, the problem of data errors being propagated throughout a firm, the high cost of fixing those errors and the attribution of 50% of trade breaks to poor data quality. These problems, coupled with the fact that many firms are carrying out the same operations on the same reference data, suggest, Cottingham said, the need to move towards data utilities that not only ease the problems, but also support the regulatory burden, deliver cost reductions, improve operational control and standardise enterprise data.

The business case for data utilities includes efficient, cost-effective delivery of data management, a focus on source-agnostic data processing, and provision of improved data quality. Cottingham explained: “The parameters of a data management business case show that utilities can do more for a firm than just direct data processing at a reduced cost. The utility approach can also improve data operations, provide downstream remediation and support risk and financial reporting.”

Cottingham went on to describe principles of effective data management that can be encapsulated in reference data utilities. These include recognition of data as a corporate asset, data standardisation upstream, promotion of a common data definition, provision of business links and controls, multiple orientations on a single view of the truth and, last but not least, the ability to trace data changes.

Turning to data governance, Cottingham noted the need to cover the complete data management landscape and make data changes in line with market events such as corporate actions. Detailing the importance of flexibility and standardisation in the development and use of reference data utilities, he concluded: “It should be relatively easy to move towards a data utility approach to data management as utilities provide data standardisation and flexibility, giving users the data they want and the ability to slice and dice it to meet their needs. Utilities also build best practice, continually improve data quality and accelerate data delivery. For heavy data users, mutualised reference data services provide significant potential and the opportunity to move towards industrialising data processing.”

