If there was one main message to be gleaned from our counterparty data and risk management seminar this week, it was that the current regulatory and risk management focus on entity data is proving to be of benefit to the data management community as a whole. The scrutiny of firms’ counterparty data management as a result of the ongoing MiFID review, new risk reporting requirements and the potential introduction of new living wills legislation, among other things, has catapulted this space to the forefront of senior management’s minds, agreed panellists from Citi, Avox, GoldenSource and Interactive Data.
In the post-crisis world and in the wake of the collapse of institutions as systemically important as Lehman Brothers, the market is hyper-aware of the threat that counterparty risk poses to firms’ bottom lines. Lehman’s collapse in particular brought home to many the data challenge inherent in assessing a firm’s counterparty risk exposure. After all, it took months rather than weeks or days to get a full picture of how far firms had been impacted by the failure of such a large institution.
Geert Berlanger, director in charge of data quality at Citi, noted that living wills legislation in particular would entail more spending on the maintenance of counterparty data, with a view to ensuring that the unwinding process is orderly in the event of a default. Regulators will, in fact, require firms to report regularly on their capability to conduct a smooth unwinding in the event of their demise. As such, the C-level is well apprised of the benefits of implementing a counterparty data project, and such projects are a much easier sell than ever before, said Berlanger.
Gert Raeves, senior vice president of strategic business development at EDM vendor GoldenSource, added that the very language being used in the regulatory discussions is helping to raise the profile of the data challenge. The seriousness of the systemic risk impact of the fall of a financial institution cannot be ignored, after all.
Regulators are also keen for a greater level of standardisation to be introduced at a global, industry-wide level in order to support their own assessments of counterparty risk. The ISO process has thus far failed to achieve consensus on the subject of an International Business Entity Identifier (IBEI), and the regulatory community is turning to other areas for inspiration. The CESR MiFID review in particular is looking at possibly mandating the use of Bank Identifier Codes (BICs) for entity identification in transaction reports, for example.
Some financial institutions have been keen on leveraging investments that have already been made within their organisations with regards to these codes, and the BIC is seen by many as a possible solution to the IBEI challenge. However, panellists at our event were not sure how suitable the BIC would be as a legal entity identifier in the long run. Berlanger and Kate Young, chief data architect at counterparty data specialist Avox, both expressed their reservations about using the BIC in such a way. Young indicated that the BIC may be suitable in the payments context over the Swift network, but extending it beyond that remit may not be achievable or desirable for the market. Panellists agreed that the BIC as it stands would need significant work to incorporate extra data elements before it could function as an entity identifier.
This subject is likely to be a big fixture at this year’s Sibos and it will be up to Swift to convince the market of the suitability of the BIC in its reworked form.
Across the pond, the US Senate is debating a bill that contains something resembling a reference data utility: a data collection agency housed within the proposed Office of Financial Research. The idea of a utility in the reference data space is a subject that I have blogged about and covered extensively over the last year or so, and many readers have also indicated their concerns about the proposals (from the US and from the European Central Bank). Panellists at the event reiterated many of these concerns and indicated that regulatory consensus on the subject needs to be achieved at a global level before progress can be made. The US should not charge ahead regardless.
GoldenSource’s Raeves was sceptical that such a utility would ever get off the ground, given the amount of investment and effort that would be required. Darren Marsh, business manager for risk management and compliance services at data vendor Interactive Data, added that the appetite for public money to be spent on such an endeavour was likely to be low.
However, both Young and Berlanger suggested that should regulators decide to push ahead with the idea, a very basic approach to a utility would be most appropriate. Berlanger indicated that this should be built on standards and protocols that already exist in the market and should be conducted in cooperation with vendors in the space.