
A-Team Webinar Discusses Adoption of the Utility Model for Reference Data Management


The concept of a utility model for reference data management is beginning to turn into a reality, but views diverge on exactly how a utility should be built and the business benefits it can deliver. A recent A-Team Group webinar discussed adoption of the utility model, considering not only early solutions in the market, but also how best they can be implemented by financial institutions. Andrew Delaney, chief content officer at A-Team Group, moderated the webinar and was joined by Julia Sutton, group customer data strategy at HSBC; Nick Taplin, data management services director at iGate; and Joseph Turso, vice president of product management at SmartStream.

If you missed the webinar, you can hear a full recording here.

Delaney set the scene by describing how the need for financial institutions to cut costs has led to a re-evaluation of operating models across business activities, particularly in areas where data management offers no competitive advantage. On this point, he noted that the management of reference data for regulatory purposes tends to be duplicated across the industry, leading to the emergence of the utility, or shared service, model that aims to help firms cut costs. Protagonists of the model, he said, also cite other benefits, including improved data quality and more streamlined operations.

Delaney outlined two utility approaches: one offering enterprise-level reference data management and the other vertical, line-of-business reference data management, such as Know Your Customer (KYC) data management. He then asked the panel what progress is being made around utilities after a couple of years of talk about their potential.

A new approach to data management

Taplin answered: “Over the past two years, firms have started to emerge from the financial crisis. They are talking about finding new solutions that they didn’t previously have the money to buy. This, coupled with the need to comply with increasing regulation, is changing the industry’s approach to data management. Some vendors, including iGate, have started to deliver managed services, but whether these are utilities is up to others to decide.”

Turso said: “Tier one banks are driving the utility concept, not vendors, and utilities will only be successful if driven by big banks. The banks are getting together, talking and forming utilities as, coming out of the crisis, regulatory demands include improved data quality. At the same time, there is pressure on revenues and the need to reduce costs. Financial institutions are looking beyond the business process outsourcing model to see if they can reduce costs in a different way.”

Sutton, who is responsible for an ambitious project at HSBC that aims to bring data together from across the bank’s businesses to provide one cohesive view of each customer, said one breakthrough for banks in terms of starting to think about the utility concept is the Legal Entity Identifier (LEI), a standard entity identifier. She explained: “The LEI could present an opportunity for banks to manage data in a different way. It has got us talking to each other and accepting that in some areas we all do the same thing. If we don’t collaborate, we won’t move forward. It took time to get through these initial steps, but we now agree that a utility can be a good thing, although there is still work to do on internal structures and compliance.”
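The LEI Sutton refers to is a 20-character alphanumeric code defined by ISO 17442, ending in two check digits computed under the ISO 7064 MOD 97-10 scheme (the same family used for IBAN validation). As a minimal illustration, outside the webinar discussion itself, a few lines of Python are enough to verify those check digits:

```python
# Minimal sketch: validating an LEI's ISO 7064 MOD 97-10 check digits.
def is_valid_lei(lei: str) -> bool:
    lei = lei.strip().upper()
    # An LEI is exactly 20 alphanumeric characters, ending in two check digits.
    if len(lei) != 20 or not lei.isalnum():
        return False
    # Map each character to its base-36 value (0-9 stay as-is, A=10 ... Z=35)
    # and concatenate the results into one long decimal string.
    numeric = "".join(str(int(c, 36)) for c in lei)
    # The whole string, read as an integer, must leave remainder 1 modulo 97.
    return int(numeric) % 97 == 1

# Example using GLEIF's own published LEI.
print(is_valid_lei("506700GE1G29325QX363"))  # True
```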

Benefits of the utility model

Reflecting growing interest in the utility model, Delaney asked how utilities could deliver benefits beyond cost. Turso responded: “The SmartStream reference data utility performs a task once and everyone who uses the utility shares the benefits and realises cost savings. The utility process includes both centrally managed data to achieve mutualisation and unique client processing.

“Central management covers data acquisition and data onboarding, which normalises and cross-references data so that it can be matched. From a data management perspective, corporate actions, data quality and data enrichment, which are usually handled by individual financial institutions, can be done once for many institutions. Unique client processing includes the creation of a golden copy of data, flexible data distribution, a user interface to view data and client specific checks. Overall, clients are seeing 25% to 40% cost reductions and have 24×7 operational support.”
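To make those mechanics concrete, the sketch below shows in simplified Python how records from two vendor feeds might be cross-referenced on a common identifier and merged into a golden copy. The field names and the first-non-empty survivorship rule are illustrative assumptions, not a description of SmartStream’s actual implementation:

```python
# Hypothetical sketch of cross-referencing and golden copy creation.
# All field names and the survivorship rule are invented for illustration.
from dataclasses import dataclass

@dataclass
class VendorRecord:
    vendor: str
    isin: str                    # common identifier used for matching
    name: str
    currency: str | None = None  # not every feed carries every attribute

def build_golden_copy(records: list[VendorRecord]) -> dict[str, dict]:
    """Group records by ISIN and merge attributes, keeping the first
    non-empty value seen (a stand-in for real survivorship rules)."""
    golden: dict[str, dict] = {}
    for rec in records:
        entry = golden.setdefault(rec.isin, {"isin": rec.isin, "sources": []})
        entry["sources"].append(rec.vendor)  # track contributing vendors
        entry.setdefault("name", rec.name)
        if rec.currency and "currency" not in entry:
            entry["currency"] = rec.currency
    return golden

feeds = [
    VendorRecord("vendor_a", "US0378331005", "APPLE INC", "USD"),
    VendorRecord("vendor_b", "US0378331005", "Apple Inc."),
]
print(build_golden_copy(feeds))
```

Doing this matching and enrichment once, centrally, is what allows the cost of the work to be shared across every institution that consumes the result.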

Taplin challenged SmartStream’s approach, saying: “The problem with this utility approach is that it is too focussed on how data is acquired and normalised. It is not very different to what data vendors already do and takes a top-down integration approach that doesn’t focus on how business uses data. A more structured approach around business strategy is needed. A utility should be built based on what people want out of it, rather than what people put into it. The difficulty is in making data useful.”

Commenting on this approach, Sutton said: “I agree that we need to understand how people use data and, at least, need to explain how best it can be used, but conversations around these issues are difficult as there is a lot of red tape in the bank around customer privacy and it is often misused. I also agree with the principle that data is only useful when we know what it is used for. So, instead of putting data in one central place for all comers, we could use the principle to drive how data is held, distributed and maintained, although I am not yet clear how we could make this work.”

Incremental implementation

Returning to today’s understanding of utilities, Sutton added: “Most banks are coming around to thinking that the utility model is a good idea as it can provide a single source of the truth, while allowing banks to maintain ownership of the data even though it is outsourced. But utilities have to be developed incrementally. For example, the LEI can provide one view of corporates, but we also deal with individuals. We can’t jump directly to a utility, but need to build in stages.”

Turso agreed, commenting: “A big bang approach doesn’t work. Clients need to prioritise areas where they have issues around operational quality or cost and then consider a utility, perhaps for derivatives that are a high touch point for banks.”

Taplin added: “The LEI and KYC are low hanging fruit, but the imperative, which is much more difficult, is to provide overall reference data management. We have looked at a number of banks’ processes and harmonised those that are the same to begin to develop a utility.”

Answering an audience question on data privacy and security in the utility model, Turso explained: “A utility acquires, manages and distributes data on the basis of a client’s data licensing. Client data is kept in isolation and when the client views data from different vendors it is not seen side by side. Data vendors have to be comfortable with how utilities use data and how it is licensed.”
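As a purely hypothetical illustration of that isolation, a distribution step might filter each golden record against the requesting client’s vendor licences, so data from a vendor the client has not licensed is never returned. All names here are invented:

```python
# Hypothetical sketch: licence-aware distribution of a golden record.
def entitled_view(record: dict, licensed_vendors: set[str]) -> dict | None:
    sources = set(record.get("sources", []))
    licensed = sources & licensed_vendors
    if not licensed:
        return None  # client holds no licence for any contributing vendor
    view = dict(record)
    view["sources"] = sorted(licensed)  # expose only licensed contributions
    return view

golden = {"isin": "US0378331005", "sources": ["vendor_a", "vendor_b"]}
print(entitled_view(golden, {"vendor_a"}))  # only vendor_a's contribution
print(entitled_view(golden, {"vendor_c"}))  # None: nothing licensed
```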

Looking ahead

Finally, Delaney asked the panel what the utility model and its adoption might look like in the next year or so. Taplin said: “We are starting to see banks driving this and having a more coherent view of what they want. With a couple of providers and solutions in the market we will see more people moving to shared services. iGate is optimistic that we will have the critical mass to create something that is genuinely mutualised and shared across the industry within the next 18 months.”

Turso said: “SmartStream will make some large announcements that will progress the utility concept. Meantime, large banks are the early adopters of utilities, but as the model matures the buy side will look at the utility space and start to leverage what has been done on the sell side. Firms want to reduce data integration costs, yet get more data, so they will learn how to leverage the utility model.”
