Following on from last week’s holiday party and end of the year meeting, EDM Council managing director Mike Atkin has produced a 2009 update report on the data industry group’s progress this year. The work of the group has proved sometimes controversial, but the EDM Council certainly cannot be accused of resting on its laurels.
Atkin has led the industry body resolutely down the road of data standardisation at a technical level and has engaged the regulatory community in the discussions to this end. Whether you agree with the group’s methods and rationale or not, it has managed to raise the profile of data management across the industry as a whole. The EDM Council is set for more of the same next year, but in the meantime, here is Atkin’s update for this year:
“It certainly has been an “interesting” year. And while there has been some cause for consternation, the spectrum of activities has been (on the whole) very good for the objectives of data management. We’ve seen the inside of a large number of regulatory agencies in both the US and Europe. We’ve learned a lot about risk and especially how to look at risk from a system-wide perspective. We’ve seen the requirements for supply chain management emerge as a viable global regulatory reform concept. We’re deep in the midst of a proof of concept to demonstrate the viability of tagging source documents at the point of issuance. We completed a large portion of our Semantics Repository and are engaged in a number of modelling exercises across the industry. The objective of a single entity identifier is moving in the right direction. And we launched an initiative to model data management maturity in partnership with Carnegie Mellon University.
Below is a brief summary of the activities that are currently underway. I’d like to take this opportunity to express our gratitude for the strong show of support from the industry and encourage you to offer rants, raves and alternative points of view on any of the activities outlined below.
Regulatory Activities
A multitude of regulatory and legislative hearings are currently underway both in Washington DC and Brussels. The precise outcomes of the heated political debates are still uncertain, but the spectrum of topics under review is broad and sweeping. Top of the list is reorganisation of the structure of the regulatory environment and the rise of “financial stability agencies” to deal with the objectives of systemic risk analysis. And while there is agreement on the need for, and core principles of, both regulatory restructuring and system-wide oversight, the political differences over how they will be implemented have not been reconciled in either the US or Europe.
Embedded in all these discussions is a plethora of large topics. Among them are: (1) the push for transparency and accountability in derivatives – including the concepts of centralised clearing; (2) the future of the securitisation process and how underlying instruments are linked to both derived instruments and related business entities; (3) the nature of the resolution authority (and other tools) to dissolve or dissect systemically important financial entities; (4) the expectation of enhanced liquidity, margin and capital reserve requirements; (5) the harsh reality of restrictions on executive compensation; (6) changes to accounting standards and other methods of improving the valuation of assets; (7) oversight and adjustment of the credit ratings process; (8) timing of exit from the stimulus program and the onward implications for monetary policy; and most important (9) global data harmonisation and standards for data comparability.
This is a pivotal time for the global financial industry because the outcome of these debates will have significant and lasting implications. I’m pleased to report that the data issues and the objectives of source tagging are squarely in the mix and are being recognised by legislators, regulators and market authorities as the foundation of regulatory reporting and systemic risk analysis. As we move toward the conclusion of the calendar year, the operative concept for the data objectives is to sit tight and let the global political reorganisation and oversight process sort itself out. We have every indication that the importance of getting the underlying data right (our humble objective) will not be lost in the shuffle.
Standards Activities
This has been a positive year for standards. There is an emerging recognition by both industry and market authorities of our core message: that the foundation of data management and systemic risk analysis is the unique and precise identification of instruments, legal entities, classification schemes and data attributes. We made significant progress in building cooperative relationships with XBRL, Swift, ISO and ISITC. The Council will be applying for voting membership within ANSI X9D and ISO TC68 in 2010 and has the support of ISO to guide us through the process. And we are watching (along with the rest of the industry) the evolution of the debates within the EU and elsewhere on the future of the commercial framework for core identifiers.
Entity Identification
The industry’s goal of a single, unique entity identification standard took a positive step forward. And even though the findings of our business case study with Swift were somewhat sobering, regulatory interest in systemic oversight and single name exposure analysis brought the involved parties to the table – resulting in the likelihood of coordinated development within the ISO process.
For those of you who have not been following the discussion: ISO Working Group 8 has narrowed its scope and has been proposing the development of an “Issuer and Guarantor” Identifier (IGI). Identification of issuers and guarantors is clearly needed – but the prospect of “yet another identification code” did not create a happy pathway toward implementation or adoption. At the same time, Swift has been engaged in an internal process to update the BIC (flag the parent BIC, enrich the BIC directory, validate against official lists, and unravel the multiple BIC situation). Not to mention the proactive work of a few wise vendors who were seeking to solve a real and present business challenge for their customers. From our perspective, it seemed as if the potential for both confusion and fragmentation was growing.
The Council got involved at the request of our members and acted as a facilitation catalyst to bring ISO WG8, Swift and a few regulators/central banks to the table at a November meeting in Frankfurt. The preliminary result was conceptual agreement that both ISO WG8 and Swift were seeking to address the same business requirements and that it would make more sense to work in cooperation with each other. Of course, the fact that IGI and BIC were both ISO standards helped propel the discussion in the right direction.
The participants in the meeting are now bringing the cooperation recommendation back to their respective constituencies for consideration. We are feeling positive about the prospect, have confidence that cooler heads will prevail and are encouraged about leveraging the ISO process toward a common solution to this critical objective.
Semantics Repository
I am thrilled to acknowledge the progress that Mike Bennett and his (growing) team of dedicated industry experts have made toward the completion of the EDM Council’s Semantics Repository. I have no qualms admitting that I believe this to be the single most important initiative for both enterprise data management and systemic risk analysis.
Just to put the activity into perspective, the goal of the Semantics Repository is a formal and factual representation of the structural reality of financial instruments (things, facts, definitions, and relationships). The Repository is a complete taxonomy and standard data dictionary defining the legal and contractual facts for all types of financial instruments (from the point of issuance) including all relevant business entity structures and relationships.
The Repository’s value lies in the precision of meaning from a business perspective. It is intended as a common language to help financial entities integrate their fragmented data resources or metadata repositories into a single (standard) view across the industry. It helps with the translation between business requirements and technical implementation. It helps firms with the challenge of integrating data sources into processing applications. It provides a common business language for data comparability and reduces the need for costly and error-prone data transformations. And it is a free and open resource, funded and developed by the members of the EDM Council for use by the industry at large.
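To give a flavour of what such a common business language can mean in practice, here is a minimal, hypothetical sketch in Python. The terms, definitions and relationships below are invented for illustration and are not taken from the Repository itself; the point is simply that each business term carries a precise definition and explicit relationships, so that firms map their internal field names onto shared meaning rather than onto each other.

```python
from dataclasses import dataclass, field

# Hypothetical illustration only; not the Repository's actual format or content.
# Each business term carries a precise definition and explicit relationships
# to other terms, so different internal field names can resolve to one meaning.

@dataclass
class Term:
    name: str                                          # business term, e.g. "Coupon Rate"
    definition: str                                    # precise, legally grounded definition
    relationships: dict = field(default_factory=dict)  # links to related terms

semantic_dictionary = {
    "Fixed Rate Bond": Term(
        name="Fixed Rate Bond",
        definition="A debt instrument paying interest at a rate fixed at issuance.",
        relationships={"is_a": "Debt Instrument", "issued_by": "Legal Entity"},
    ),
    "Coupon Rate": Term(
        name="Coupon Rate",
        definition="The annual interest rate payable on the instrument's face value, as stated in the prospectus.",
        relationships={"is_property_of": "Fixed Rate Bond"},
    ),
}

# A firm's internal field names are mapped onto the shared terms once,
# instead of being reconciled pairwise against every counterparty's names.
internal_field_map = {"cpn_rt": "Coupon Rate", "instr_type": "Fixed Rate Bond"}

for field_name, term_name in internal_field_map.items():
    term = semantic_dictionary[term_name]
    print(f"{field_name} -> {term.name}: {term.definition}")
```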
As it now stands, the industry has completed the semantics for static data for all exchange traded instruments. We are now putting the finishing touches on “time and date” terms (market data) and will start reviewing the structure of OTC derivatives early in the New Year. We estimate completion of the core Repository by the end of 1Q2010. As part of the validation process, the Council is currently engaged in a number of application mapping projects to verify the use of the Repository for MBS risk analysis, for translation into logical data models, to capture semantic concepts for messaging, and as the basis for internal metadata mapping.
The Council invites you to evaluate the Semantics Repository and run it through its (technical, logical, business structure) paces. We have a standing offer to any company interested in a technical/structural demo (one-hour sessions over GoToMeeting). Those interested in a demo of the repository should contact Carole Mahoney. Those interested in help with mapping of the Repository to internal applications or in talking about integrating a semantics model into your environment should contact Mike Bennett.
Instrument identification and CFI
The EDM Council has had limited involvement (this year) with the challenges of instrument ID and classification – but there is activity underway. We note (for example) the positive work on the identification of asset-backed securities (both mortgage and cash) from MERS and the American Securitisation Forum; the conclusion of the second round of testing for the Options Symbology Initiative (the final implementation date is February 12, 2010); the continued promulgation of the Financial Instrument Short Name (ISO 18773) standard; and discussions about revisions to the standard for the Classification of Financial Instruments (ISO 10962).
The CFI revision process is emerging as an agenda item for the Council. A number of our members have approached us with concerns about the complexity of the proposed revisions to cover new types of instruments versus the practical use of the CFI code within financial entities. We have opened up a cooperative discussion with ISITC to help ensure that the requirements of the financial institutions for CFI are captured and validated as part of the revision process. We’ll keep you updated as this conversation evolves.
Supply Chain Management
Management of the financial information supply chain from the initial point of legal/contractual issuance (when the prospectus, term sheet, corporate action or tender offer is created) has been a common thread among many of our activities over the course of this year.
The premise is straightforward. Much of the data we store in master files, use as inputs to models and exchange between systems is the result of legal and contractual processes. It is negotiated by, agreed to and documented among the involved parties. In many ways, this data is “perfect” when it is created. After creation, multiple parties independently acquire, rename and transform the data. They sell it to various business units, who engage in their own forms of transformation. And those business units reconcile this data into processing applications, transaction instructions and reports.
It is this “curse of transformation” at the various stages of the transaction lifecycle that leads to non-compares, DKs, expensive trade repairs and manual reconciliation. With better management of the supply chain, the industry could save significant money by reducing the amount of transformation that takes place, and regulators would be in a better position to compare reports from multiple financial entities in support of their systemic oversight objectives.
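As a rough, invented illustration of the cost of that transformation, consider one issuance fact that two downstream sources have independently renamed and reformatted. Each copy has to be normalised back and reconciled against the original, and any attribute that fails to compare becomes a break to repair by hand. All names and values below are made up for the example.

```python
# Invented example data: the "perfect" record as created at issuance, and two
# downstream copies that have each been independently renamed and transformed.
issuance_record = {"coupon_rate": 0.0525, "maturity_date": "2019-06-15", "issuer": "ABC Corp"}

vendor_a_copy = {"cpn": 5.25, "maturity": "15/06/2019", "issuer_name": "ABC Corp"}          # percent, DD/MM/YYYY
vendor_b_copy = {"coupon": 0.0525, "maturity_dt": "2019-06-15", "issuer_name": "ABC Corporation"}

def normalise_a(rec):
    """Undo vendor A's renaming and unit/format transformations."""
    day, month, year = rec["maturity"].split("/")
    return {"coupon_rate": rec["cpn"] / 100,
            "maturity_date": f"{year}-{month}-{day}",
            "issuer": rec["issuer_name"]}

def normalise_b(rec):
    """Undo vendor B's renaming."""
    return {"coupon_rate": rec["coupon"],
            "maturity_date": rec["maturity_dt"],
            "issuer": rec["issuer_name"]}

# Reconciliation: every attribute that fails to compare is a potential DK or trade repair.
for source, copy in [("vendor A", normalise_a(vendor_a_copy)), ("vendor B", normalise_b(vendor_b_copy))]:
    breaks = {k: (v, copy[k]) for k, v in issuance_record.items() if copy[k] != v}
    print(source, "non-compares:", breaks or "none")
```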
We would suggest that achieving trust and confidence that the underlying data for business processing and regulatory oversight is “fit for purpose” is the prime directive of the financial information industry. We would further suggest that data supply chain management is possible to achieve in the near term and is an objective that the industry (in partnership with regulators and market authorities) should embrace. The standards are the purview of the industry and are well underway. Participation by issuers is the domain of regulators/market authorities and needs to be harnessed. EDM Council members are working to demonstrate the feasibility of this objective.
Reference Data POC
The EDM Council has teamed up with IBM Research in a Proof of Concept (POC) initiative designed to demonstrate the viability of collecting critical reference data attributes required for systemic risk analysis from the point of financial instrument issuance. The POC supports the objective of supply chain management.
The project focuses on a narrow subset of the financial system – specifically mortgage backed securities, their underlying mortgage assets and their issuing legal entities. It takes advantage of the Council’s Semantic Repository and builds cross-domain links to demonstrate the feasibility of the prime directive at the applications/modelling level.
The Semantics Repository provides the ontology of the legal and contractual structure of financial instruments. We have assembled the process flow diagram of the steps in the securitisation process for mortgage backed securities and have documented identification links and gaps. We are creating a semantic map of the reference data attributes to define mortgage backed securities and to map them into risk applications. We are working with various members to create a standard reference data template for MBS (data acquisition interface) and to build the database of MBS data including linkages to the underlying instruments and entities involved in the securitisation process. IBM Research is managing the ad hoc query access and reporting interfaces for extracting required data attributes into standard message formats. And we are testing the process against various real world applications to ensure that the data can be used for systemic risk analysis.
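To make the idea of such linkages concrete, the sketch below shows, in Python and with invented field names that are not the POC’s actual template, how a security record can carry explicit links to its underlying pools and to the issuing legal entity, so that a systemic-risk query becomes a traversal of identifiers rather than a manual matching exercise.

```python
from dataclasses import dataclass
from typing import List

# Hypothetical sketch of a standard reference data template for an MBS, with
# explicit links to its underlying pool/loans and the legal entities in the
# securitisation chain. Field names are illustrative, not the POC's template.

@dataclass
class LegalEntity:
    entity_id: str         # unique entity identifier
    name: str
    role: str              # "originator", "issuer", "trustee", ...

@dataclass
class MortgagePool:
    pool_id: str
    loan_ids: List[str]    # links down to the underlying mortgage assets

@dataclass
class MBSecurity:
    security_id: str       # instrument identifier
    issuer_id: str         # link to the issuing legal entity
    pool_ids: List[str]    # links to the underlying pools
    coupon_rate: float
    issue_date: str

# With the linkages captured as data, exposure to a given entity is a lookup.
entities = {"E-1": LegalEntity("E-1", "Example Issuing Trust", "issuer")}
pools = {"P-1": MortgagePool("P-1", ["L-1001", "L-1002"])}
securities = [MBSecurity("S-1", "E-1", ["P-1"], 0.045, "2009-06-01")]

exposure_to_e1 = [s.security_id for s in securities if s.issuer_id == "E-1"]
print("Securities issued by E-1:", exposure_to_e1)
```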
The POC is a limited test step towards developing more complete data repositories for risk oversight. Because the POC will link mortgage, pool, security and legal entity data from different sources, it will provide some insight into how to construct a data repository for a selected area of financial information. It should therefore be viewed as simply one component of a larger repository system.
Note: I have been involved in a lot of initiatives over the years and am impressed with the way this POC is unfolding. Upon its completion we will be able to demonstrate that it is possible to capture critical reference data at the point of creation using standard semantics, store it in various data management platforms, retrieve it via standard messaging interfaces, and use it (with confidence) in risk management applications. We are currently working with 10 partners as part of the POC. Any member interested in a full brief (and access to our governance documents) should contact Michael Atkin.
National Institute of Finance
The Council has been providing consultation and advice to the “Committee to Establish the National Institute of Finance” (CE-NIF) on the issues and challenges of providing trusted and comparable data to support systemic risk analysis. We’ve helped create a number of position papers; brought participants up to speed on the status of data standards; advocated supply chain management; described the business case dilemma; and generally tried to demystify the data issue.
We’re impressed with the energy and commitment of the CE-NIF especially with their legislative work to educate policy makers on the importance of data precision (and make it matter) in the midst of political and economic turmoil. Their efforts and accomplishments have been extraordinary.
From my perspective, the NIF goal of getting legislation passed as part of regulatory reform would be icing on the cake. Their real value is in helping to facilitate an important consensus among industry, academia and regulators on the core objectives of data management. Their success in getting the National Academy of Sciences to investigate and report on gaps in data and analytics available to financial regulators is a case in point. So, regardless of what happens on the legislative front, I believe the CE-NIF should be congratulated for helping to raise awareness on the importance of getting the underlying data right. And when the dust settles and the systemic regulator is established, they will still need accurate, comparable and trusted data to perform their function. That’s when the real work begins.
Data Management Maturity
Of all the projects on our 2009 plate, none has attracted more interest and enthusiasm than the initiative we’re doing in partnership with the Software Engineering Institute (SEI) of Carnegie Mellon University (CMU) to model data management maturity.
To set the stage… the Council has long been engaged in the pursuit of both the ROI for EDM and the collection of best practices on its implementation. Two issues quickly emerged in the process. The first is that (in many respects) we have business case travails because data management is not viewed in an integrated and composite way. We view and fund it as a series of unrelated and discrete steps (based on short term measurement criteria), rather than as a comprehensive and integrated activity. The second is that managing data as content at the enterprise level is not something we know how to do all that well. Most of those charged with EDM are “figuring it out as they go” and doing so while dealing with a host of operational brushfires. In essence, EDM is both hard to justify and hard to accomplish.
These two realities suggest that EDM is not a very mature activity (something we all know from experience). And in talking about the concepts of data management maturity, I was plugged into something called the Capability Maturity Model (CMM) developed for the Department of Defense by the brains at Carnegie Mellon. The Council opened up a dialogue with CMU and both sides quickly agreed to the importance and opportunity of building a data management maturity model based on a modified (less rigorous) version of the CMM methodology and using the data management knowledge and expertise of the financial industry. And so the partnership was born.
The goal of the Data Management Maturity Model (DMM) is to define the components of data management at the specific business-process level so that financial organisations can assess themselves against documented best practices and upgrade their management of essential data resources. It can also be used to provide a consistent and comparable benchmark for regulatory authorities (and financial entities) in their efforts to control operational risk.
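As a purely illustrative sketch of what such a benchmark could look like (the process areas, levels and scoring below are invented placeholders, not the DMM’s actual content), a self-assessment might assign a maturity level to each data management process area and roll the results up into a comparable profile:

```python
# Illustrative sketch only: the process areas and maturity levels below are
# invented placeholders, not the DMM's actual categories or scoring method.

MATURITY_LEVELS = {1: "ad hoc", 2: "repeatable", 3: "defined", 4: "managed", 5: "optimised"}

# A self-assessment assigns a level (1-5) to each data management process area.
self_assessment = {
    "data governance": 2,
    "metadata management": 3,
    "data quality measurement": 2,
    "reference data supply chain": 1,
}

for area, level in self_assessment.items():
    print(f"{area}: level {level} ({MATURITY_LEVELS[level]})")

# An overall profile (here a simple average) gives a consistent benchmark that
# can be compared across firms or tracked over time against documented practices.
overall = sum(self_assessment.values()) / len(self_assessment)
print(f"overall maturity: {overall:.1f}")
```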
Over the past few months we have set up our organisational structure, areas of delivery and funding requirements. We have produced a base model defining the categories, components and business processes of EDM. We have created a baseline document defining the characteristics of the various levels of data management maturity. And we have opened up discussions with various regulatory agencies about the funding requirements and have started to engage with leading research universities on collaborative activities. We are currently working on a revised version of the base model that will be used as the basis for development work with our members in the initial stages of the project. We expect to have the revised model ready for release in January 2010.”