Data Standards Hobgoblins

As well as the discussions around entity identification, this month’s FIMA featured a number of serious debates about the benefits of a data utility approach at an individual firm and industry level. Panellists and speakers generally fell into two camps: those wary of adopting such an approach and those fully convinced of its benefits. Falling into the former camp, Ian Webster, global head of data management for UBS Asset Management, argued that a utility approach is not always the most sensible choice and that standardisation should be applied only where necessary, calling standards the “hobgoblins of small minds”.

Webster, who also elaborated on UBS’ own approach to the data management challenge, contended that if the regulatory community hopes to enact the changes it has planned over the coming months related to data standardisation (including getting the Office of Financial Research up and running), it will have to “change the laws of physics”. At an industry-wide level, he believes the imposition of standards must be handled with caution, given the differing requirements of various business functions. He noted that certain data standards may make sense for Basel II or III compliance, but these could not be used for all other purposes.

At a firm level, Webster asked: “Why build an aircraft hangar when all you need is a helicopter?” Data management projects do not need to be enterprise-wide in scope; they need to meet the requirements of the business without being prohibitively expensive. Fellow panellist Chris Johnson, head of product management, market data services at HSBC Securities Services, seconded the notion of keeping data management projects business-focused rather than charging ahead with standalone data programmes.

Webster illustrated his point with the example of the trading desk versus the finance function. “These two functions have two different sets of requirements and speeds at which they work, which means that trying to meet all their data needs would cause schizophrenia,” he said. “This does not mean you should throw out the enterprise structure for data management, just apply it where it makes sense to.”

Many speakers agreed that rationalising vendor data feeds across an organisation is a quick win for bringing down data costs (a theme that has been persistently raised at nearly every event over the last few years). But, unlike UBS’ David Berry, Webster was not convinced that a utility approach will rid the community of proprietary formats. Johnson, for his part, compared data standards to barnacles embedded within systems: once in place, they are very difficult to remove.

Webster contended that the goal of one set of standards and one “golden copy” is unachievable: “There doesn’t need to be a single view, it just needs to be transparent. Get over the idea of a golden copy; we can’t do away with reconciliation, it is a utopian view.” Cross-referencing between standards is therefore the way forward, according to Webster, who noted that a “skinny” identifier might work in a utility context, but would still likely run into challenges.
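By way of illustration (this sketch is ours, not something presented at FIMA), the cross-referencing model Webster describes can be pictured as a thin mapping table: each identifier scheme keeps its own code for an instrument, and a “skinny” internal key ties them together, so translating between schemes never requires a single golden copy. All identifiers below are invented.

```python
# Illustrative sketch only. Each (scheme, code) pair maps to a thin
# internal "skinny" identifier; the skinny ID carries no reference data
# of its own, it merely ties the schemes together.

# Hypothetical instrument: every code here is made up for illustration.
XREF = {
    ("ISIN", "XS0000000001"): "SKINNY-0001",
    ("CUSIP", "000000AA1"): "SKINNY-0001",
    ("SEDOL", "B000001"): "SKINNY-0001",
}

# Reverse index: skinny ID -> {scheme: code}
BY_SKINNY: dict[str, dict[str, str]] = {}
for (scheme, code), skinny in XREF.items():
    BY_SKINNY.setdefault(skinny, {})[scheme] = code


def resolve(scheme: str, code: str) -> str | None:
    """Map any known scheme/code pair to the skinny identifier."""
    return XREF.get((scheme, code))


def translate(scheme: str, code: str, target_scheme: str) -> str | None:
    """Cross-reference a code from one scheme to another via the skinny ID."""
    skinny = resolve(scheme, code)
    if skinny is None:
        return None  # unmapped code: falls back to reconciliation
    return BY_SKINNY[skinny].get(target_scheme)


print(translate("ISIN", "XS0000000001", "SEDOL"))  # -> B000001
```

The design choice is the point: unmapped codes simply fall back to reconciliation rather than being forced into a single view, consistent with Webster’s argument that reconciliation cannot be done away with.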

Political barriers stand in the way of a global reference data utility that could provide this “skinny” identifier, which would be limited to identifying an instrument rather than carrying any further reference data. Webster pointed to the “massive amount of silence” coming from the G20 in Korea as evidence that regulators are not on the same page.

Liability is also an issue, he said: “That is what is holding back the commoditisation of a relatively easily outsourced function. It is also a tough sell for the vendor community.”

These themes are certainly similar to those raised time and again over the last year or so on the topic of a reference data utility. And these conversations are unlikely to move much further until progress is made in the US on the Office of Financial Research. Let’s hope that by the time our own conference – Data Management for Risk, Analytics and Valuations in New York City on 17 May 2011 – rolls around, there is much more progress to discuss.
