The leading knowledge platform for the financial technology industry

A-Team Insight Blogs

Data Standards Hobgoblins

Alongside the discussions around entity identification, this month’s FIMA featured a number of serious debates about the benefits of a data utility approach at an individual firm and industry level. Panellists and speakers generally fell into two camps: those wary of adopting such an approach and those fully convinced of its benefits at both firm and industry level. Firmly in the former camp, Ian Webster, global head of data management for UBS Asset Management, argued that a utility approach is not always the most sensible choice and that standardisation should be applied only where necessary, calling standards the “hobgoblins of small minds”.

Webster, who also elaborated on UBS’ own approach to the data management challenge, contended that if the regulatory community hopes to enact the changes it has planned over the coming months related to data standardisation (including getting the Office of Financial Research up and running), then it will have to “change the laws of physics”. At an industry-wide level, he believes the imposition of standards must be handled with caution, given the differing requirements of the various business functions. Certain data standards may make sense for Basel II or III compliance, he noted, but those same standards could not serve all other purposes.

At a firm level, Webster asked: “Why build an aircraft hangar when all you need is a helicopter?” Data management projects therefore do not need to be enterprise-wide in scope; they need to meet the requirements of the business without being prohibitively expensive. Fellow panellist Chris Johnson, head of product management, market data services at HSBC Securities Services, seconded the notion of keeping data management projects business focused rather than charging ahead with standalone data programmes.

Webster highlighted the example of meeting the needs of the trading desk versus the finance function to illustrate his point. “These two functions have two different sets of requirements and speeds at which they work, which means that trying to meet all their data needs would cause schizophrenia,” he said. “This does not mean you should throw out the enterprise structure for data management, just apply it where it makes sense to.”

Many speakers agreed that rationalising vendor data feeds across an organisation is a quick win for bringing down data costs (a theme that has been raised persistently at nearly every event over the last few years). But, unlike UBS’ David Berry, Webster was not convinced that a utility approach will rid the community of proprietary formats. Johnson, for his part, likened data standards to barnacles embedded within systems: once in place, they are very difficult to remove.

Webster contended that the goal of one set of standards and one “golden copy” is unachievable: “There doesn’t need to be a single view, it just needs to be transparent. Get over the idea of a golden copy; we can’t do away with reconciliation, it is a utopian view.” Cross-referencing between standards is therefore the way forward, according to Webster, who noted that a “skinny” identifier might work in a utility context, but would still likely run into challenges.

Political barriers are standing in the way of a global reference data utility that could provide this “skinny” identifier, which would be limited to identifying an instrument rather than rolling up to provide any other details. Webster pointed to the “massive amount of silence” coming from the G20 in Korea as proof that regulators are not on the same page.

Liability is also an issue, he said: “That is what is holding back the commoditisation of a relatively easily outsourced function. It is also a tough sell for the vendor community.”

These are certainly similar themes to those that have been raised time and time again over the last year or so on the topic of a reference data utility. And it is unlikely these conversations will move much further until progress is made in the US with the developments around the Office of Financial Research. Let’s hope by the time our own conference – Data Management for Risk, Analytics and Valuations in New York City on 17 May 2011 – rolls around, there is much more progress to discuss.
