Bob Cumberbatch, European business lines director at Interactive Data, speaks to A-Team Insight about his views on the developments within the legal entity identification space, including the lessons learned in the wake of the fall of Lehman. In the third of our series of talking heads on the challenges surrounding entity data management in the current market, Cumberbatch elaborates on the regulatory developments that are impacting the management of entity data, such as Basel III and MiFID.
Cumberbatch has often spoken about the data challenges underlying the risk management function and he sees this as a key area for investment going forward. Here, he also discusses how regulatory change and business imperatives are lending credence to the data management cause, nowhere more so than in the management of legal entity data.
How has the regulatory attention directed at the legal entity identification challenge in the post-Lehman environment impacted financial institutions’ practices with regard to this data?
I believe that regulatory concerns have impacted firms’ practices and there is now a growing understanding that by optimising reference data management – and thereby having an in-depth knowledge of the millions of financial instruments and thousands of related entities flowing through its systems – a firm can take a much more holistic approach to effectively managing risk.
A Committee of European Securities Regulators (CESR) report (The Lehman Brothers default: an assessment of the market impact, CESR, March 2009) stated that the Lehman Brothers group comprised 2,985 entities globally, spanning numerous jurisdictions, with some regulated and others unregulated. CESR noted that global regulatory responses and effective global coordination between supervisors are therefore essential when dealing with such cross border groups. This serves to highlight the complexity of identifying a firm’s exposure, which is especially difficult where subsidiaries do not share the same name as the parent entity. The CESR report also states that, “In the future, it is possible that market participants will be more wary of structures where the valuation agent is an affiliate of the issuer: some notes include terms stating that pay off will be determined by a named valuation agent, which poses a particular problem where the agent was a Lehman Brothers affiliate, as the terms didn’t cover the possibility of both issuer and valuation agent going into administration.”
Although no announcement has been made as yet regarding the exact nature of a legal identifier, we know in principle that the intention is to move forward. And of course, until we know what that identifier will be, market participants won’t be able to understand the operational impact of integrating it into their systems. That said, and judging from the interest we have seen in our Business Entity Service, clients certainly seem to have moved to prioritising the cleansing of their entity data across the enterprise.
Which regulations and compliance requirements are having the biggest impact on this area?
A whole raft of regulations and directives will have a big impact, given the need for full disclosure of the details behind counterparties, issuing entities and their instrument issuance. These include the UK Financial Services Authority’s (FSA) liquidity risk and stress and scenario testing regime, Basel III (implemented in Europe through the Capital Requirements Directive), the single customer view, the large exposures regime, UCITS IV, MiFID, and know your customer (KYC) and anti-money laundering (AML) rules, to name some. Many of these overlap, but it is clear that there needs to be a better understanding of the relationships between entity exposures across the enterprise.
In order to do so, firms need to move away from data silos built around specific business functions and combine the disparate data sources they hold on customers, counterparties, issuers and their underlying issues to create a shared single view of entity data and associated risks. Full disclosure of the details behind a firm’s assets and liabilities – and of the ‘high risk’ overlap between them – is essential.
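The ‘shared single view’ described above can be illustrated with a minimal sketch, merging departmental silos on a common entity key. All field names, identifiers and figures here are hypothetical, and a real consolidation would involve matching, cleansing and survivorship rules far beyond this:

```python
# Minimal sketch: merging siloed records into a single entity view.
# All field names, identifiers and sample data are hypothetical.

def build_entity_view(*silos):
    """Merge records from several departmental silos, keyed by a
    shared entity identifier, into one consolidated record per entity."""
    view = {}
    for silo in silos:
        for record in silo:
            entity_id = record["entity_id"]
            merged = view.setdefault(entity_id, {"entity_id": entity_id})
            # Later silos fill gaps but do not overwrite existing values,
            # so the first (most trusted) source wins on conflicts.
            for key, value in record.items():
                merged.setdefault(key, value)
    return view

counterparty_silo = [
    {"entity_id": "E123", "name": "Example Bank plc", "otc_exposure": 5_000_000},
]
issuer_silo = [
    {"entity_id": "E123", "name": "Example Bank plc", "bonds_held": 2_000_000},
]

view = build_entity_view(counterparty_silo, issuer_silo)
# The consolidated record now shows both exposures for the same entity.
```

The point of the sketch is simply that once every silo carries the same entity key, the overlap between a firm’s assets and liabilities against one entity becomes visible in a single record.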
Given there is currently no industry standard legal entity identifier and the US regulator is looking at mandating its introduction as part of the OFR, what impact will this likely have on the US market? And the rest of the world?
I believe that there is a recognised need for an identifier on a global basis and the industry should be in favour of someone specifying a standard and driving its adoption. The same debate could well have taken place when the ISIN was developed. This will just be the first step and should provide advantages to the wider industry globally, not just in the US. It will enable end users to cross-map content from various vendors and sources – a form of ‘shorthand’ to identify a company. Sharing a common identifier will benefit processing, automation and reporting.
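The ‘shorthand’ role of a common identifier can be sketched as a simple join across vendor feeds. The vendor codes, field names and the mapping itself are invented for illustration:

```python
# Illustrative sketch: using a common legal entity identifier to
# cross-map records from two hypothetical vendor feeds.
# Vendor names, codes and field layouts are invented for illustration.

vendor_a = {"A-0001": {"lei": "LEI123", "rating": "AA"}}
vendor_b = {"B-9876": {"lei": "LEI123", "sector": "Banking"}}

def cross_map(*feeds):
    """Index every vendor record by its common identifier so that
    content from different sources can be joined on one key."""
    by_lei = {}
    for feed in feeds:
        for vendor_code, record in feed.items():
            entry = by_lei.setdefault(record["lei"], {})
            entry[vendor_code] = record
    return by_lei

merged = cross_map(vendor_a, vendor_b)
# merged["LEI123"] now links the records both vendors hold on the company.
```

Without the shared key, the same join would require fuzzy name matching between each pair of vendor symbologies, which is exactly the cost a standard identifier removes.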
Interactive Data essentially supports the concept of a standard legal entity identifier, as long as it is administered by a not-for-profit organisation that could not benefit from its position as administrator to the detriment of other market participants.
A number of options are on the table for such an identifier. What is your feeling for which will be selected as the most appropriate option, and why?
As previously mentioned, the organisation offering such an identifier would need to be a not-for-profit registration authority that does not create an advantageous position for itself and does not restrict usage and redistribution. The organisation selected would need to own and maintain the standard – and, most importantly, the allocated identifiers. The identifier would have to meet a broad set of needs and would almost certainly require very wide coverage, as well as being reliable, timely and highly available. I believe that use of the identifier should be free of charge, but value added maintenance should be chargeable. It’s a complex issue as not all entities are issuers or guarantors – the requirements will therefore be granular and demanding.
It is clear that the majority of firms would prefer an existing identifier, as the assumption would be that the impact on operational processes would be minimal. However, firms will want to ensure that this identifier can be used to support global entity reporting requirements, as simply introducing another standard in itself will not resolve the issue.
How will all of this impact the vendor community?
I think it’s a huge opportunity for the vendor community and should not be considered as a threat. It will greatly help in automation and help to meet users’ requirements for regulatory reporting – but in the end, it will hinge on the quality, accuracy and timeliness of the data services delivered using the identifier.
Interactive Data takes a neutral position on symbology and we fully support industry initiatives to make instrument identifiers available on a free and unrestricted basis. We believe that this helps to generate greater efficiency across the industry – this is backed up by client feedback. The challenge for data vendors is to supply the symbology that their clients want.
How have counterparty risk management concerns impacted the underlying data management systems within systemically important financial institutions? What level of maturity is the market at with regard to the management of this data?
Exposure risk management concerns are leading firms to integrate and manage their data more closely. Maintaining an accurate and consistent view of entity data available across the organisation is key and many firms are now seeing the need to bring together and integrate the disparate data sources that they hold on customers, counterparties, issuers and underlying issues in order to create a shared single view of entity data and associated risks.
They understand the mission critical connection between central data management strategies and business entity data management, and are integrating their issue to issuer links with the securities master where they can be used for trading analysis, risk assessment and compliance. In addition, internal auditors and risk managers are engaging actively with the data maintained internally, exploring its extent and limitations.
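The issue-to-issuer links mentioned above are what let a firm roll instrument-level holdings up to group-level exposure – the problem the Lehman case made so visible. The following is a hedged sketch under invented identifiers and figures; real entity hierarchies are many levels deep rather than the single parent step shown here:

```python
# Hedged sketch: rolling instrument-level holdings up to group-level
# exposure via issue-to-issuer links and a parent hierarchy.
# All identifiers and figures are invented for illustration.

issue_to_issuer = {"ISIN-AAA": "E-SUB1", "ISIN-BBB": "E-SUB2"}
parent_of = {"E-SUB1": "E-GROUP", "E-SUB2": "E-GROUP"}  # subsidiary -> group

holdings = {"ISIN-AAA": 3_000_000, "ISIN-BBB": 1_500_000}

def group_exposure(holdings):
    """Aggregate holdings to the ultimate parent entity."""
    totals = {}
    for isin, amount in holdings.items():
        issuer = issue_to_issuer[isin]
        # Walk one level up the hierarchy; a production system would
        # resolve the full chain to the ultimate parent.
        group = parent_of.get(issuer, issuer)
        totals[group] = totals.get(group, 0) + amount
    return totals

exposure = group_exposure(holdings)
# Two positions against differently named subsidiaries surface as one
# consolidated exposure to the group.
```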
I would say that the overall level of maturity enterprise-wide is still quite low, but within individual business functions the maturity is quite high.
Are firms largely opting for a centralised approach towards dealing with this data or are the vertical silos across the different parts of an institution persisting?
Many firms see the need for a consistent starting point and a centralised master approach, but it depends on individual circumstances. Having the data all in one place is good for central control, but if data is distributed across an organisation that doesn’t necessarily mean that copies of that data will be ‘different’ – provided that mechanisms exist for ensuring that copies stay synchronised to preserve accuracy.
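One simple form of the synchronisation mechanism mentioned above is to compare content fingerprints of distributed copies against the central master. This is only a sketch – the record layout is hypothetical, and real implementations layer versioning and publish/subscribe distribution on top:

```python
# Sketch: detecting drift between a master record and a distributed
# copy by comparing content hashes. Record layout is hypothetical.

import hashlib
import json

def fingerprint(record):
    """Stable hash of a record, so copies can be compared cheaply."""
    payload = json.dumps(record, sort_keys=True).encode("utf-8")
    return hashlib.sha256(payload).hexdigest()

master = {"entity_id": "E123", "name": "Example Holdings plc", "domicile": "GB"}
copy = dict(master)

in_sync = fingerprint(master) == fingerprint(copy)

copy["domicile"] = "US"  # a local, unsynchronised edit...
drifted = fingerprint(master) != fingerprint(copy)  # ...is now detectable
```

A check like this is what allows distributed copies to stay ‘not different’ in practice: drift is caught and reconciled rather than assumed away.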
Firms can work across departmental silos, integrating both internal and external data sources, enriching this data as needed and filtering external data according to internal rules and definitions to eliminate irrelevant data. A wide range of areas need to be addressed: from organisational security – how information is acquired, classified and used within an organisation – to using trusted data sources across all functions within a financial institution.
Is there a degree of disparity in these practices between the buy side and the sell side? Large and small firms?
I don’t think that the need to manage risks in a holistic and enterprise-wide way is sell side or buy side specific. Firms need to manage risks that are commensurate with the business services that they provide. But requirements of scale and timeliness will differ, depending on the firm.
There is a relationship between risk and reward. Understanding and managing the risks involved in the pursuit of higher rewards is not the same as being unaware of the risks. It seems inevitable that the higher the risks involved then the higher the sophistication that is needed in pursuit of these higher returns.
Although different divisions of a firm will have different needs, the goal is to achieve a more effective way of using the data to make informed and timely decisions. This is best tackled by the division or department considering exactly what they need by way of data collection and storage, and how they want this data to be made available to their internal applications. The needs of other departments, and of the broader organisation as a whole, also need to be considered and integrated. It’s the traditional approach of understanding what your data requirements are and then deciding whether to build or buy a solution.
What trends do you expect to see over 2011 in terms of market practices in this space?
In 2011, I think we will see more automation, a growing acceptance of a common identifier and a more holistic approach to data management across the enterprise. I also think that there will be more strategic thinking about the processes – building a business case for significant enterprise change.