Progress with industry-level efforts to create a universal standard for entity identification may be going nowhere fast, but this is not preventing financial institutions from undertaking projects to get their counterparty data in order. As a result, counterparty data is fast losing its reputation as the poor relation of the reference data family.
As Mike Destein, director of industry solutions for Siperian, provider of master data management solutions, observes: “Firms have traditionally not been forced for one reason or another to store and track information by the legal entity and its associated global hierarchy. Without a requirement there hasn’t been much rigour on managing counterparty data. That’s changed over the last few years and the reason for the change is either for compliance, risk management or just general good housekeeping practices.”
Although firms are still taking an individual approach to conquering existing problems with counterparty data, the industry as a whole seems to be heading in the same direction, with most participants believing that centralised solutions and “golden copy” data sources are the way to go.
Some elements of business entity data must be managed centrally if the firm wants a true global picture of its business, according to Ken Price, CEO of counterparty data provider Avox. He believes the benefits of centralised data management include not only regulatory compliance but also major efficiency gains, improved understanding of cost, credit netting, a single client view and “strategy enablement”.
“One of our clients penned this term,” he says. “Their experience has proven that a central data management function improves consistency, quality, completeness and timeliness of business entity data. This data feeds almost every system in the bank. Every new system taps into this central database making implementation faster, cheaper and less risky.”
Many people believe a “golden copy” approach is required at the core of a centralised solution, because the history of data inconsistencies and inaccuracies has led to incomplete (or wrong) consolidations, out-of-date information and, consequently, exposure to regulatory risk, trade and settlement risk and reputational risk. Traditionally, counterparty data has been managed by individual business units, creating data conflicts and duplications. As the drivers discussed above have put increasing pressure on firms to centralise, the need for a golden copy approach has become ever clearer.
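The consolidation step behind a golden copy can be sketched simply: duplicate records for the same legal entity, held by different business units, are merged into a single record with a rule for resolving conflicts. The record shapes, field names and merge rule below are hypothetical, purely to illustrate the idea:

```python
# Illustrative "golden copy" consolidation: duplicate counterparty records
# from different business units are merged into one record, with the most
# recently updated non-empty value winning each field. All fields invented.
from datetime import date

def build_golden_record(records):
    """Merge duplicate records for one legal entity into a golden copy."""
    golden = {}
    for rec in sorted(records, key=lambda r: r["last_updated"]):
        for field, value in rec.items():
            if field == "last_updated" or value in (None, ""):
                continue
            golden[field] = value  # later (fresher) records overwrite earlier ones
    return golden

unit_records = [
    {"legal_name": "Acme Holdings Ltd", "country": "GB",
     "lei": None, "last_updated": date(2007, 1, 15)},
    {"legal_name": "Acme Holdings Limited", "country": "GB",
     "lei": "ACME-001", "last_updated": date(2007, 6, 1)},
]

print(build_golden_record(unit_records))
# {'legal_name': 'Acme Holdings Limited', 'country': 'GB', 'lei': 'ACME-001'}
```

Real entity masters apply far richer survivorship rules (source precedence, field-level trust scores), but the principle is the same: conflicts are resolved once, centrally, rather than independently in each downstream system.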
Richard Stumm, vice-president of business development, Broadridge Financial Solutions (which is extending its managed reference data solution from instrument data into counterparty data, Reference Data Review, August 2007), says: “The golden copy approach ensures that the data being used across the enterprise is consistent, but more importantly enables firms to demonstrate that the controls and processes for gathering and updating counterparty information are consistent across the enterprise as well.”
Adds Martin Cole, head of product management for counterparty data specialist CounterpartyLink: “I think that organisations are starting to realise that legal entity data in its widest sense is used in so many downstream systems that if you can collect it once and keep it clean then you’re going to get significant efficiencies later on.”
Having adopted this approach at Bank of Scotland, David Miller, director of credit systems for the bank, agrees that you need one source of the truth. “If you don’t have one source of the truth then potentially you are not compliant in relation to Basel II. Also, without a single source you have increased inefficiencies because you have more people doing similar tasks. If you have one source of the truth whereby you can even pre-populate information you can reduce the amount of handlers.”
Paul Kennedy, vice president, product management, GoldenSource, agrees a centralised solution is vital. “I don’t know how you get business insight if you don’t have a centralised approach; the paradigm of data is about context as well as a number of other things that you always need – completeness, conformity, consistency, how up to date it is, duplication and integrity. All of those requirements to get high quality data are better achieved with a centralised solution.
“High quality data is not just a costly regulatory requirement – it’s actually a competitive advantage and I think lots of people now are beginning to see that and hence there is a groundswell towards this whole concept of managing your data across the enterprise,” continues Kennedy.
Many people in the industry agree that regulatory compliance is a key driver of the current boom in investment for counterparty data, but a growing number of additional benefits are driving an increased level of senior management focus and investment.
Michael Smethurst, principal consultant and director for Adsatis, joint venture partner in CreditDimensions, a provider of data management services for reference data and counterparty hierarchy, sees clear ways that counterparty data can be a source of higher net revenue: fewer failed trades, reduced interest charges, reduced costs, increased client satisfaction and more business. “There is a potential virtuous circle here. It is also becoming an increasingly important component of the client management function, that is, understanding the value of client relationships,” he adds.
So what will help organisations realise these benefits? Some would say that outsourcing is the way to go, especially for those that have yet to establish a complete central data management function, as this process can be costly and there is risk associated with not doing it right. This is now a section of the reference data spectrum that is becoming more mature, so the requirements and standards are well understood and the range of options for outsourcing all or part of the business requirement is widening.
“Outsourcing the supply and maintenance of legal entity data can bring a range of advantages,” believes Cole. “Specifically, that you’re going to get better consistency in data collection and hopefully higher quality because the people that are doing the job are generally going to be more experienced and absolutely focused on doing it. The cost will be more quantified because the vendors concerned will agree to specific quantity and quality for a fixed cost to a fixed SLA.”
Mike Atkin, managing director of the EDM Council, a not-for-profit organisation addressing the business strategies and practical implementation realities of enterprise data management, thinks that outsourcing cleansing and validation makes sense. “Avox’s model for example is a good one, what CounterpartyLink is doing is outstanding so it makes sense to outsource these aspects of an organisation’s counterparty data. What I’m unclear about is whether it makes sense to outsource your hierarchical relationships and integration with your internal applications.”
There are also clear disadvantages to this approach for some companies. For instance, Miller at Bank of Scotland warns outsourcing can leave a firm “open to operational risk, because what you’re doing is passing out your key data or golden source to another organisation who may be treating it like any other third party data, but they don’t have the same drivers as the organisation itself. Also you then get into the technicalities and communication difficulties, not just verbal communication but between different sites, different timescales.”
Industry-wide standards are also an element to consider. Atkin, a long-time advocate of standards, believes there is absolutely a requirement for a standard to achieve consistency across the industry. “Standards help promote consistency, transparency and comparability. That’s why they are useful and important. When you cut right down to it, data management starts with your ability to uniquely and precisely identify instruments, legal entities and data elements themselves, and those three areas are essential,” he explains.
“A standard identification scheme for legal entities allows you to manage your hierarchies in a consistent way throughout your organisation, allows you to integrate multiple internal and external feeds in a consistent way and gives you high quality rich data. All we’re talking about is an identifier so that you can build valuable hierarchical relationships from that in a consistent fashion. It is absolutely critical that we do this.”
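The value of a single identifier per legal entity is easiest to see in hierarchy management: once every record carries the same ID, parent/child links can be walked consistently, for example to roll exposure up to an ultimate parent. The identifiers, links and figures below are invented for illustration:

```python
# Hypothetical entity hierarchy keyed on consistent entity identifiers.
# With one ID per legal entity, parent links can be traversed the same
# way everywhere in the organisation.
parents = {"acme-uk": "acme-group", "acme-us": "acme-group"}  # child -> parent
exposures = {"acme-uk": 40.0, "acme-us": 25.0, "acme-group": 10.0}  # per entity

def ultimate_parent(entity_id):
    """Walk parent links until the top of the hierarchy is reached."""
    while entity_id in parents:
        entity_id = parents[entity_id]
    return entity_id

def group_exposure(root_id):
    """Sum exposure across every entity whose ultimate parent is root_id."""
    return sum(v for k, v in exposures.items() if ultimate_parent(k) == root_id)

print(group_exposure("acme-group"))  # 75.0
```

Without a shared identifier, the same roll-up requires fuzzy name matching between feeds, which is exactly where the inconsistencies described above creep in.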
But this may be easier said than done on an industry-wide basis, warns Tony Spensieri, vice president of financial services for Siperian. “There have been a lot of different standards organisations formed in the various entity data sets. Everybody talks about it, and everybody thinks it’s important to get everybody on the same page to drive towards a corporate standard, but it’s a very difficult challenge.”
Indeed, the mixed fortunes of different industry-level activities to create a single standard for entity identification, thoroughly explored in earlier issues of Reference Data Review, demonstrate the difficulty of the challenge. Wendy Kassel, vice president, program management at Cicada, reckons that centralised standards are noble and ideal goals, “but have rarely met with practical or timeless success”. “At the risk of promulgating data management heresy, no single institution will ever improve its bottom line because it adhered to an industry standard created by a trade association,” she adds.
Suggesting a pragmatic way to proceed in the absence of a universal standard for counterparty data, Kassel continues: “What is required is for data vendors, as well as technology providers of solutions such as counterparty masters, to provide clear data dictionaries of the data they provide or support,” she says. “Also needed are tools that support normalisation and provide data modeling flexibility, as well as user-oriented rules; these can allow institutions to acquire third-party data and transform the data according to the specific business requirements of any of its consuming applications. Such tools exist today, and can be used to great effect while waiting for an industry standard to be defined, implemented and adopted.”
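The normalisation Kassel describes amounts to a per-vendor data dictionary: a mapping from each vendor’s field names onto the firm’s internal model, applied as a rule-driven transform. The vendor names and fields below are invented purely to sketch the pattern:

```python
# Hypothetical per-vendor data dictionaries mapping external field names
# onto a firm's internal counterparty model.
VENDOR_DICTIONARIES = {
    "vendorA": {"EntityName": "legal_name", "Ctry": "country"},
    "vendorB": {"name_full": "legal_name", "iso_country": "country"},
}

def normalise(vendor, record):
    """Rename a vendor record's fields into the internal model."""
    mapping = VENDOR_DICTIONARIES[vendor]
    return {internal: record[external]
            for external, internal in mapping.items() if external in record}

print(normalise("vendorA", {"EntityName": "Acme Ltd", "Ctry": "GB"}))
# {'legal_name': 'Acme Ltd', 'country': 'GB'}
```

Because the mapping sits in data rather than code, adding a new vendor or retargeting a consuming application is a configuration change, which is why such tools work as a stopgap while an industry standard is still pending.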
This article is extracted from The rise and rise of counterparty data, which appears in the Q4 issue of Reference Data Review’s sister publication A-Team IQ, available free to all subscribers. Email firstname.lastname@example.org to request your copy.