The leading knowledge platform for the financial technology industry

A-Team Insight Blogs

A-Team Webinar Discusses Adoption of the Utility Model for Reference Data Management

The concept of a utility model for reference data management is beginning to turn into a reality, but views diverge on exactly how a utility should be built and the business benefits it can deliver. A recent A-Team Group webinar discussed adoption of the utility model, considering not only early solutions in the market, but also how best they can be implemented by financial institutions. Andrew Delaney, chief content officer at A-Team Group, moderated the webinar and was joined by Julia Sutton, group customer data strategy at HSBC; Nick Taplin, data management services director at iGate; and Joseph Turso, vice president of product management at SmartStream.

If you missed the webinar, you can hear a full recording here.

Delaney set the scene by describing how the need for financial institutions to cut costs has led to a re-evaluation of operating models across business activities, particularly in areas where data management offers no competitive advantage. On this point, he noted that the management of reference data for regulatory purposes tends to be duplicated across the industry, leading to the emergence of the utility, or shared service, model that aims to help firms cut costs. Proponents of the model, he said, also cite other benefits, including improved data quality and more streamlined operations.

Delaney mentioned two utility approaches: one offering enterprise-level reference data management, the other vertical, line-of-business reference data management, such as Know Your Customer (KYC) data management. He then asked the panel what progress is being made around utilities after a couple of years of talk about their potential.

A new approach to data management

Taplin answered: “Over the past two years, firms have started to emerge from the financial crisis. They are talking about finding new solutions that they didn’t previously have the money to buy. This is coupled to the need to comply with increasing regulation and is changing the industry’s approach to data management. Some vendors, including iGate, have started to deliver managed services, but whether these are utilities is up to others to decide.”

Turso said: “Tier one banks are driving the utility concept, not vendors, and utilities will only be successful if driven by big banks. The banks are getting together, talking and forming utilities as, coming out of the crisis, regulatory demands include improved data quality. At the same time, there is pressure on revenues and the need to reduce costs. Financial institutions are looking beyond the business process outsourcing model to see if they can reduce costs in a different way.”

Sutton, who is responsible for an ambitious project at HSBC that aims to bring data together from across the bank’s businesses to provide one cohesive view of each customer, said one breakthrough for banks in terms of starting to think about the utility concept is the Legal Entity Identifier (LEI), a standard entity identifier. She explained: “The LEI could present an opportunity for banks to manage data in a different way. It has got us talking to each other and accepting that in some areas we all do the same thing. If we don’t collaborate, we won’t move forward. It took time to get through these initial steps, but we now agree that a utility can be a good thing, although there is still work to do on internal structures and compliance.”
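The LEI Sutton refers to is a 20-character identifier defined by ISO 17442, whose final two characters are check digits computed with the ISO 7064 MOD 97-10 scheme (the same scheme IBANs use). The webinar did not go into the mechanics, but as a rough illustration of why the LEI makes cross-bank matching tractable, validation is a simple deterministic check; the 18-character prefix used below is purely illustrative.

```python
import re

def lei_to_number(lei: str) -> int:
    # Map letters to numbers (A=10 ... Z=35), keep digits as-is,
    # then read the whole string as one large integer.
    return int("".join(str(int(c, 36)) for c in lei))

def is_valid_lei(lei: str) -> bool:
    # An LEI is 20 alphanumeric characters; the last two are numeric
    # check digits, and the full identifier taken mod 97 must equal 1.
    if not re.fullmatch(r"[0-9A-Z]{18}[0-9]{2}", lei):
        return False
    return lei_to_number(lei) % 97 == 1

def append_check_digits(base18: str) -> str:
    # Derive the two check digits for an 18-character base:
    # append "00", take mod 97, subtract from 98, zero-pad.
    check = 98 - lei_to_number(base18 + "00") % 97
    return f"{base18}{check:02d}"
```

Because the check is deterministic, any two institutions holding the same LEI can agree they are looking at the same legal entity without reconciling proprietary identifiers first.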

Benefits of the utility model

Reflecting growing interest in the utility model, Delaney asked how utilities could deliver benefits beyond cost. Turso responded: “The SmartStream reference data utility performs a task once and everyone who uses the utility shares the benefits and realises cost savings. The utility process includes both centrally managed data to achieve mutualisation and unique client processing.

“Central management covers data acquisition and data onboarding, which normalises and cross-references data so that it can be matched. From a data management perspective, corporate actions, data quality and data enrichment, which are usually handled by individual financial institutions, can be done once for many institutions. Unique client processing includes the creation of a golden copy of data, flexible data distribution, a user interface to view data and client-specific checks. Overall, clients are seeing 25% to 40% cost reductions and have 24×7 operational support.”
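The normalisation and cross-referencing step Turso describes can be sketched roughly as follows: vendor records are keyed on a shared identifier (an ISIN here) and merged into one golden copy, with disagreements resolved by source precedence. The vendor names, field names and precedence rules below are hypothetical for illustration, not SmartStream’s actual model.

```python
# Hypothetical vendor precedence: when vendors disagree on a field,
# the highest-ranked source that supplies a value wins.
PRECEDENCE = ["vendor_a", "vendor_b", "vendor_c"]

def build_golden_copy(records: list[dict]) -> dict[str, dict]:
    """Cross-reference vendor records on a shared identifier and
    merge them into one golden-copy record per instrument."""
    golden: dict[str, dict] = {}
    # Apply lower-precedence vendors first so higher-precedence
    # vendors overwrite them field by field.
    for rec in sorted(records, key=lambda r: -PRECEDENCE.index(r["source"])):
        merged = golden.setdefault(rec["isin"], {})
        for field, value in rec.items():
            if field != "source" and value is not None:
                merged[field] = value
    return golden

records = [
    {"source": "vendor_b", "isin": "US0378331005", "name": "Apple Inc", "currency": None},
    {"source": "vendor_a", "isin": "US0378331005", "name": "APPLE INC", "currency": "USD"},
]
golden = build_golden_copy(records)
```

The point of mutualisation is that this merge, and the data-quality checks around it, run once centrally rather than once per institution.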

Taplin challenged SmartStream’s approach, saying: “The problem with this utility approach is that it is too focussed on how data is acquired and normalised. It is not very different to what data vendors already do and takes a top-down integration approach that doesn’t focus on how business uses data. A more structured approach around business strategy is needed. A utility should be built based on what people want out of it, rather than what people put into it. The difficulty is in making data useful.”

Commenting on this approach, Sutton said: “I agree that we need to understand how people use data and, at least, need to explain how best it can be used, but conversations around these issues are difficult as there is lots of red tape in the bank around customer privacy and it is often misused. I also agree with the principle that data is only useful when we know what it is used for, so instead of putting data in one central place for all comers, we could use the principle to drive how data is held, distributed and maintained, although I am not yet clear how we could make this work.”

Incremental implementation

Returning to today’s understanding of utilities, Sutton added: “Most banks are coming around to thinking that the utility model is a good idea as it can provide a single source of the truth, while allowing banks to maintain ownership of the data even though it is outsourced. But utilities have to be developed incrementally. For example, the LEI can provide one view of corporates, but we also deal with individuals. We can’t jump directly to a utility, but need to build in stages.”

Turso agreed, commenting: “A big bang approach doesn’t work. Clients need to prioritise areas where they have issues around operational quality or cost and then consider a utility, perhaps for derivatives that are a high touch point for banks.”

Taplin added: “The LEI and KYC are low hanging fruit, but the imperative, which is much more difficult, is to provide overall reference data management. We have looked at a number of banks’ processes and harmonised those that are the same to begin to develop a utility.”

Answering an audience question on data privacy and security in the utility model, Turso explained: “A utility acquires, manages and distributes data on the basis of a client’s data licensing. Client data is kept in isolation and when the client views data from different vendors it is not seen side by side. Data vendors have to be comfortable with how utilities use data and how it is licensed.”
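The isolation Turso describes amounts to entitlement filtering at the distribution layer: each client only ever receives data sourced from feeds it has licensed, so data from different vendors is never exposed side by side to an unentitled client. The client names and entitlement table below are hypothetical, purely to illustrate the principle.

```python
# Hypothetical entitlements: each client has licensed specific
# vendor feeds, and distribution is filtered accordingly.
ENTITLEMENTS = {
    "client_1": {"vendor_a"},
    "client_2": {"vendor_a", "vendor_b"},
}

def distribute(client: str, records: list[dict]) -> list[dict]:
    # A client with no entitlements receives nothing; records from
    # unlicensed feeds are filtered out entirely, keeping each
    # client's view isolated from data it has not paid for.
    allowed = ENTITLEMENTS.get(client, set())
    return [r for r in records if r["source"] in allowed]
```

This keeps the utility aligned with vendor licensing terms: mutualised processing happens centrally, but each client’s distributed view reflects only its own contracts.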

Looking ahead

Finally, Delaney asked the panel what the utility model and its adoption might look like in the next year or so. Taplin said: “We are starting to see banks driving this and having a more coherent view of what they want. With a couple of providers and solutions in the market we will see more people moving to shared services. iGate is optimistic that we will have the critical mass to create something that is genuinely mutualised and shared across the industry within the next 18 months.”

Turso said: “SmartStream will make some large announcements that will progress the utility concept. In the meantime, large banks are the early adopters of utilities, but as the model matures the buy side will look at the utility space and start to leverage what has been done on the sell side. Firms want to reduce data integration costs, yet get more data, so they will learn how to leverage the utility model.”
