In addition to the discussions around entity identification, this month’s FIMA featured a number of serious debates about the benefits of a data utility approach at both the individual firm and industry level. Panellists and speakers generally fell into two camps: those wary of adopting such an approach and those fully convinced of its benefits. Falling into the former camp, Ian Webster, global head of data management for UBS Asset Management, noted that a utility approach is not always the most sensible choice and said that standardisation should only be applied where necessary, calling standards the “hobgoblins of small minds”.
Webster, who also elaborated on UBS’ own approach to the data management challenge, contended that if the regulatory community hopes to enact the changes it has planned around data standardisation over the coming months (including getting the Office of Financial Research up and running), it will have to “change the laws of physics”. At an industry-wide level, he believes the imposition of standards must be handled with caution, given the different requirements of various business functions. Certain data standards may make sense for Basel II or III compliance, he noted, but those same standards would not suit every other purpose.
At a firm level, Webster asked: “Why build an aircraft hangar when all you need is a helicopter?” Data management projects therefore do not need to be enterprise-wide in scope; they need to meet the requirements of the business without being prohibitively expensive. Fellow panellist Chris Johnson, head of product management, market data services at HSBC Securities Services, seconded the notion of keeping data management projects business-focused rather than charging ahead with standalone data programmes.
To illustrate his point, Webster highlighted the differing needs of the trading desk and the finance function. “These two functions have two different sets of requirements and speeds at which they work, which means that trying to meet all their data needs would cause schizophrenia,” he said. “This does not mean you should throw out the enterprise structure for data management, just apply it where it makes sense to.”
Many speakers agreed that rationalising vendor data feeds across an organisation is a quick win for bringing down data costs (a theme that has been raised persistently at nearly every event over the last few years). But, unlike UBS’ David Berry, Webster was not convinced that a utility approach will rid the community of proprietary formats. Johnson, for his part, compared data standards embedded within systems to barnacles: very difficult to remove once attached.
Webster contended that the goal of one set of standards and one “golden copy” is unachievable: “There doesn’t need to be a single view, it just needs to be transparent. Get over the idea of a golden copy; we can’t do away with reconciliation, it is a utopian view.” Cross-referencing between standards is therefore the way forward, according to Webster, who noted that a “skinny” identifier might work in a utility context, but would still likely run into challenges.
Political barriers stand in the way of a global reference data utility that could provide this “skinny” identifier, which would be limited to identifying an instrument rather than rolling up any other details. Webster pointed to the “massive amount of silence” coming from the G20 meeting in Korea as evidence that regulators are not on the same page.
Liability is also an issue, he said: “That is what is holding back the commoditisation of a relatively easily outsourced function. It is also a tough sell for the vendor community.”
These themes are certainly similar to those raised time and again over the last year or so on the topic of a reference data utility. And it is unlikely these conversations will move much further until progress is made in the US on the Office of Financial Research. Let’s hope that by the time our own conference – Data Management for Risk, Analytics and Valuations in New York City on 17 May 2011 – rolls around, there is much more progress to discuss.