The knowledge platform for the financial technology industry

A-Team Insight Blogs

Talking Reference Data with Sarah Underwood: Solutions to Solve the Data Management Conundrum


As in-house reference data management models near breaking point, talk is turning to hosted solutions, managed services and data utilities. These solutions could deliver some of the cost reductions that firms must find, but they bring problems of their own: data quality, responsibility for data, timely collection and distribution of data and, in the case of utilities, the question of how to build and operate a shared reference data solution.

These issues and more were discussed at this week’s A-Team Group webinar, Hosted/Managed Reference Data Services and the Emerging Utility. A-Team Group editor-in-chief Andrew Delaney moderated the webinar and was joined by panellists Nick Murphy, senior consultant of data managed services at iGate Global Solutions; Paul Kennedy, a sales and marketing executive at 1 View Solutions; and Nick Taplin, a senior pre-sales executive at SmartStream.

Delaney set the scene with a description of today’s data management difficulties, including the data implications of new regulatory requirements, onsite data management systems that are reaching breaking point, and constantly squeezed data and data management resources. Considering whether a hosted, managed or utility approach could resolve these difficulties, he questioned whether a reference data utility could be the holy grail of data management.

Taplin answered: “A utility should offer standard services, with users taking as much or as little data as they need. The service should be as neutralised as possible, but the provider must recognise the need for some client specificity.”

How such a utility, or a managed service, should be built and who should own it was a moot point. Murphy described iGate’s approach, saying: “We are trying to create more standardisation in the market to reduce data management costs, so we are working with participants on this. We have a core set of clients termed design partners. They will define elements of the utility, so, in this sense, the industry has intellectual ownership of the utility.”

Taplin concurred, saying: “A utility needs to be a community operation rather than a service ‘from us to you’. Users of a utility could be both data providers and recipients, and could help us design data management methodologies. We would then run the utility for the benefit of all participants.”

Addressing the absolute need to change today’s data management model, Murphy said: “Cost is a big driver. We need to improve the whole data lifecycle and be more selective. A utility, managed service or whatever you call it could help clients react quickly and at lower cost to regulatory change. A utility could be faster to react than banks, be leaner and cleaner and offer better quality data and data management.”

Picking up on the issue of quality, Kennedy said: “The definition of quality is accurate and timely data. There has to be a way to share information in the same way as public data is shared. If data was shared collaboratively, it would be possible to change the way the data is managed.”

With some ideas of how a utility could operate on the table, Delaney questioned what types of data could be shared and which organisations would supply the data. The panellists agreed that a utility would not own data, but simply manage it, with data being provided by market data vendors and perhaps banks that could publish their own data through a utility. Master data, pricing data and regulatory data such as Legal Entity Identifiers were cited for possible inclusion in a utility, with the emphasis on improving data quality and access, rather than piling it high to provide a one-size-fits-all solution.

As the utility model begins to take shape, Delaney considered how it would continue to develop over the next 12 to 18 months. He suggested a utility-style data solution will only succeed if there is a shift in mindset about data management in the market.

Kennedy agreed, saying: “We need a radical change in mindset. The utility debate will continue and as there is more regulatory change, there will need to be more data management change, but this will be hampered by cost and time. Whatever the solution, if it can overcome these issues and improve data quality, the industry will adopt it.”

Pressing the case for a utility, Taplin said: “It’s 2014 and firms are being fined left, right and centre. We can’t go forward like this. I think firms would like to start using utilities over the next 12 to 18 months.” Murphy concluded: “As the market gets used to data utilities we will see a mindset change. In two years’ time we will see a very different data landscape, with utilities providing business focused data frameworks.”

