The leading knowledge platform for the financial technology industry

A-Team Insight Blogs

Data Management Summit: Regulation and Cost Drive Adoption of Data Utilities

The potential of reference data utilities to cut data management costs and improve data quality was a hot topic at last week’s A-Team Group Data Management Summit in New York. Most speakers and delegates agreed that data utilities will become part of the capital markets landscape, but exactly how they will operate and how the market will pan out was less certain.

Moderating a panel session entitled Towards a Utility Model for Reference Data, A-Team Group editor Sarah Underwood started the discussion by questioning how utilities can be defined and what they aim to deliver. Kurt Eldridge, head of sales, North America, at SmartStream, explained: “Data utilities are about mutualisation. Compared to business process outsourcing and managed services, a utility adds mutualisation and consolidation of data. For us, a utility is more than technology, it is a combination of technology and data operations. Business process outsourcers look at a 25% cost reduction in data management, but a utility needs to get closer to 50%.”

While the utility concept proposes multiple users of a centralised reference data management solution, perhaps suggesting a lack of flexibility for each user, panel members pointed out that this is not the case. Pramod Achanta, partner in Financial Markets at IBM, said: “The clients we work with are sophisticated, so a utility cannot be based on a one-size-fits-all model.” Turning to the issue of what utilities can deliver, he added: “A big part of what data utilities deliver is cost savings, but they also need to offer a high degree of data quality and be a means of enhancing best practices.”

Martijn Groot, director at Euroclear, which partners SmartStream in providing the Central Data Utility service, said: “Data quality is a real issue, so a utility needs the right tooling below the right platform. Missing something like a corporate action can have enormous repercussions. In terms of cost, there are savings in using a utility, as well as in change management, managing impending regulations and changing data feeds and products. A utility also makes flexible data sourcing and on-boarding new feeds much easier. While there is a lot of work for utility suppliers to do, we don’t see a lot of hurdles when it comes to market acceptance.”

Increasing regulatory scrutiny and the need to decrease the cost of reference data are driving financial firms towards data utilities, with data lineage a classic example of where a utility can provide benefit. Roy Williamson, global head of sales at Bloomberg Polarlake, explained: “Some firms just don’t have the resources to provide data lineage internally and prove to a regulator what changes have been made to specific data. This kind of data governance carried out by a utility on behalf of a firm can be very helpful.”
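The data lineage requirement Williamson describes amounts to keeping an audit trail of every change made to a reference data attribute, so it can be replayed for a regulator. A minimal sketch of that idea, assuming a simple record-per-security model (the class and field names here are illustrative, not any vendor's actual API):

```python
from dataclasses import dataclass, field
from datetime import datetime, timezone

@dataclass
class LineageEntry:
    """One audited change to a reference data attribute."""
    attribute: str
    old_value: object
    new_value: object
    source: str   # e.g. the vendor feed or operator that made the change
    timestamp: str = field(
        default_factory=lambda: datetime.now(timezone.utc).isoformat()
    )

class AuditedSecurity:
    """Reference data record that logs every attribute change it receives."""

    def __init__(self, identifier: str):
        self.identifier = identifier
        self.attributes: dict = {}
        self.lineage: list = []

    def update(self, attribute: str, value, source: str) -> None:
        old = self.attributes.get(attribute)
        if old != value:  # only genuine changes enter the lineage
            self.lineage.append(LineageEntry(attribute, old, value, source))
            self.attributes[attribute] = value

    def history(self, attribute: str) -> list:
        """Full change history for one attribute -- what a regulator would ask for."""
        return [e for e in self.lineage if e.attribute == attribute]
```

A utility performing this on a firm's behalf would do the same bookkeeping at scale, across all feeds, so each client inherits a provable change history without building the tooling itself.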

Adam Devine, head of product marketing at WorkFusion, added: “The legal entity identifier (LEI) is a natural fit for a reference data utility. Each newly issued LEI has an impact on reference data, so firms could use a utility to map their identification schemes to LEIs and make sure the data is kept fresh and up to date.”
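The mapping task Devine refers to can be pictured as a cross-reference table from a firm's internal entity codes to LEIs, together with a freshness check so stale mappings are re-validated. A hedged sketch, assuming a toy in-memory table (the codes and cut-off are invented for illustration; LEIs are 20-character ISO 17442 codes):

```python
from datetime import date, timedelta

# Hypothetical cross-reference: internal counterparty code -> (LEI, date last
# verified against the issuing source). Codes and LEIs below are examples only.
lei_map = {
    "CPTY-0001": ("529900T8BM49AURSDO55", date(2015, 3, 1)),
    "CPTY-0002": ("5493001KJTIIGC8Y1R12", date(2014, 11, 15)),
}

def lookup_lei(internal_id: str, max_age_days: int = 90) -> str:
    """Return the LEI for an internal code, refusing mappings past their refresh window."""
    lei, verified = lei_map[internal_id]
    if date.today() - verified > timedelta(days=max_age_days):
        # In a utility, a stale mapping would trigger re-validation against the feed.
        raise ValueError(f"Mapping for {internal_id} last verified {verified}; needs refresh")
    return lei
```

The point of delegating this to a utility is the second element of each entry: the utility, not each firm, carries the burden of re-verifying every mapping as new LEIs are issued.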

On a broader basis, Eldridge said: “The cost to meet regulatory requirements using current data models is too much for many firms. This is starting to drive utilities forward as revenue pressure is urging firms to find other ways to deliver data services.”

Answering an audience question on how multiple vendor data sources can be fed into a single platform and how utility platforms will be differentiated going forward, the panel returned a variety of views. Achanta said: “There are some concerns in the market about the independence of data that comes together within a common utility. We want to provide a vendor neutral utility, allowing clients to optimise how they take in data from different sources and to switch providers if they need to.”

Agreeing that utilities will be neither black box, nor one-size-fits-all solutions, the panel considered their technology components. Williamson suggested semantic technology could underpin changes in data requirements over the next few years, Devine proposed a software-as-a-service model as the basis for utilities, and Groot highlighted the need for platforms that are able to control and track licensing within the data flow, as well as cater for client deviations.

Speaking from experience of having a utility in production for four years, SmartStream’s Eldridge said: “We take a programmatic approach to the data. On the distribution side, the need is to enrich the data, create a consolidated file for a single asset class and understand the profiles of customers. That is what a utility should provide, one file for each asset class and flexible distribution.”
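The “one file per asset class” distribution Eldridge describes is, at its simplest, a group-by over enriched records followed by serialisation. A minimal sketch under that assumption (the record fields are invented for illustration):

```python
import csv
import io
from collections import defaultdict

# Toy enriched records after consolidation; real utility records carry many more fields.
records = [
    {"asset_class": "equity", "id": "AAPL", "currency": "USD"},
    {"asset_class": "bond", "id": "DE0001102309", "currency": "EUR"},
    {"asset_class": "equity", "id": "VOD.L", "currency": "GBP"},
]

def consolidate_by_asset_class(rows):
    """Group enriched records and render one CSV payload per asset class."""
    grouped = defaultdict(list)
    for row in rows:
        grouped[row["asset_class"]].append(row)
    files = {}
    for asset_class, group in grouped.items():
        buf = io.StringIO()
        writer = csv.DictWriter(buf, fieldnames=["asset_class", "id", "currency"])
        writer.writeheader()
        writer.writerows(group)
        files[asset_class] = buf.getvalue()  # in production, written per subscriber
    return files
```

Flexible distribution then becomes a matter of which of these per-asset-class payloads each customer profile subscribes to, and in what format.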

Looking forward, opinions differed on how many utilities the market will accommodate. Devine argued in favour of one global reference data utility, Groot suggested there may be a couple, and Achanta predicted four or five major players offering two or three different utility models over the next few years. He said: “Different models and multiple data infrastructures are being developed in the marketplace. Some of the models will be better suited to particular types of client, so we will know in a few years which design works best. I think a simple and flexible model will be the most successful.”

Providing advice to practitioners thinking about using a utility, Groot summed up the recommendations of the panel members, concluding: “Question what you are looking for in a utility and consider underlying quality metrics. Then do your homework, decide how you want to interact with a supplier and work out what is important in terms of your own future development.”
