The EDM Council has this month indicated that it is considering a new fee-based service that would provide members with data quality rules that firms could embed in their quality review processes. The idea has gone to the association’s board for approval and, if it receives the green light, will be discussed with the membership at large. The association is also in the final stages of its proof of concept for interest rate swaps (IRS), which aims to demonstrate the benefits of a semantics repository to the US regulatory community for its data standards agenda.
The data quality initiative builds on the work that the industry association has been engaged in around benchmarking the state of the data management industry. Mike Atkin, managing director of the group, explains: “We are looking to see the appetite of the market for the development of shared data quality rules as part of our new initiative in that space. The idea is that financial institutions and vendors will be able to embed verified data quality business rules into their quality review processes on a consistent basis.”
He continues: “Our starting point is around 300 core attributes and we are contracting with a senior data quality expert to document and verify the relevant rules. The core attributes will be synchronised with the definitions within our semantics repository. The final data quality rulebook will be a fee-based service that is licensed to members of the EDM Council and vendors.”
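The kind of verified, embeddable data quality rules Atkin describes can be pictured as a small library of checks run against each record during a quality review. The sketch below is purely illustrative: the Council's actual rulebook and attribute names are not public, so these rule and field names are hypothetical.

```python
# Illustrative sketch only: rule names and attribute names are hypothetical,
# not the EDM Council's actual rulebook.

def check_isin(record):
    """ISIN must be 12 characters: 2-letter country prefix,
    9 alphanumerics, and a final check digit."""
    isin = record.get("isin", "")
    return (len(isin) == 12
            and isin[:2].isalpha()
            and isin[2:11].isalnum()
            and isin[11].isdigit())

def check_maturity_after_issue(record):
    """Maturity date must not precede the issue date
    (ISO-8601 strings compare correctly as text)."""
    return record["maturity_date"] >= record["issue_date"]

RULES = [check_isin, check_maturity_after_issue]

def run_quality_review(record):
    """Return the names of any rules the record fails."""
    return [rule.__name__ for rule in RULES if not rule(record)]

record = {"isin": "US0378331005",
          "issue_date": "2010-01-15",
          "maturity_date": "2020-01-15"}
print(run_quality_review(record))  # → []
```

The point of a shared rulebook is that every firm and vendor embeds the same verified checks, so a record judged clean by one participant is judged clean by all.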
Is there an appetite for such a service, given the tension between restricted budgets and the pressure to put adequate data quality rules in place? For now, the EDM Council’s board is tasked with determining whether the project gets the go-ahead; should the green light be granted, the group will then consult its membership at large on the subject.
In the meantime, the group has plenty to be getting on with. The proof of concept work is tied into the efforts of the Commodity Futures Trading Commission (CFTC) and the Securities and Exchange Commission (SEC) to determine the feasibility of adopting new algorithmic codes to identify complex and standardised derivatives. The findings of a study conducted by the regulators earlier this year indicate that they believe current technology is capable of representing derivatives using a common set of computer-readable descriptions, and that these descriptions are precise enough to identify “at least a broad cross section of derivatives,” but that a few other items, such as standardised entity identification, must be tackled first.
Atkin explains: “The proof of concept work is to demonstrate to the regulators that semantics is the right route to go down rather than point solutions. It will allow for the better integration of standards and is not tied to a particular technology. We have now focused solely on IRS due to time pressures on the proof of concept and are using real data inputs. The 30 June is the deadline for this work and we are currently converting the OTC derivatives part of the repository into RDF/OWL.”
In order to formalise much of the group’s work and be able to convert the semantics repository into this format, earlier this year it joined forces with the Object Management Group (OMG). The tie-up was also aimed at ensuring the work remains sustainable by providing a formal governance framework in which the repository can sit. The semantics repository has also been given a new name to mark the formalisation of this process: the Financial Industry Business Ontology (FIBO).
To this end, Atkin explains: “EDM Council is continuing to work with the OMG on the FIBO, which we believe is key to managing the data supply chain. The OMG is helping us to turn the semantics repository into a technical meta-model and we are aligning the work with that of the International Swaps and Derivatives Association (ISDA) and its XML schemas.”
ISDA has been particularly vocal over recent months about the potential of its own Financial products Markup Language (FpML) standard to identify derivatives instruments in a centralised product registry. By tying its efforts to those of ISDA, the EDM Council is therefore hoping to increase its chances of success within the regulatory sphere.
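FpML represents derivative products as XML documents, so identifying a product in a registry amounts to reading structured fields out of the document. The fragment below is a simplified, FpML-style illustration (a real FpML document is namespaced and schema-validated, which this sketch omits), parsed with Python's standard library:

```python
import xml.etree.ElementTree as ET

# Simplified, FpML-style fragment for illustration only; element names
# loosely follow FpML conventions but this is not a schema-valid document.
doc = """
<dataDocument>
  <trade>
    <swap>
      <productType>InterestRateSwap</productType>
      <notional currency="USD">10000000</notional>
    </swap>
  </trade>
</dataDocument>
"""

root = ET.fromstring(doc)
product = root.find("./trade/swap/productType").text
notional = root.find("./trade/swap/notional")
print(product, notional.get("currency"), notional.text)
# → InterestRateSwap USD 10000000
```

Aligning FIBO's OWL definitions with ISDA's XML schemas would mean that fields like these map onto the same formally defined concepts in the ontology.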
However, Atkin notes that the regulatory environment itself is a tough place to navigate at the moment. “Individual regulators in the US and in other national markets are not yet coordinated and the overlapping deadlines of many of these regulations are causing confusion. For example, the CFTC’s objectives are about improving market transparency, not facilitating systemic analysis (unlike the Office of Financial Research, or OFR), and therefore involve different priorities. The regulators are now trying to align those objectives.”
There has also been an impact from regulatory politics and the fact that those working on these standards have to ‘sell’ everything they do to the commissioners in charge, which puts pressure on deadlines, adds Atkin. “There is the curse of the regulators’ short-term view, which is putting pressure on coordination efforts and on the balance between choosing the right solution versus a quick solution. Some of these activities are also new and regulators are facing political pressure to choose the ‘safe’ option,” he explains.
For now, US OTC market reforms have the most immediate deadlines and are “driving the train” with regards to data standards, he says. But what happens if this train gets derailed?
Atkin prefers to retain his positive outlook with regard to the progress of these reforms: “The EDM Council’s advice is for firms to prepare by adopting the new standards as soon as possible and integrating them into their data repositories, processes and workflows. This is likely to involve a lot of reengineering for some firms, which will also need to prepare for ad hoc reporting across data silos.”
He adds that the practice of data management will be the next item under scrutiny by the regulators, including checks of firms’ data quality validation practices. Firms will therefore need to demonstrate provenance over the data they are using and its input into the decisions they are making (hence the group’s endeavours with its data quality rulebook).
Rather than acting alone, the regulators are looking to work with industry to develop the correct criteria for assessment of these practices. The legal entity identification work of the OFR is therefore a first step in this process, and Atkin is confident that it will not be delayed (in spite of regulatory trials and tribulations in the background). “The LEI will definitely be introduced on a phased basis and the deadline of 8 July for the delivery of the decision on the standard is still in place. Over 20 firms have responded to the solicitation of interest and the group of 65 industry representatives led by Sifma is currently engaged in quantitative and qualitative assessments of these proposals,” he says.
The OFR has also been adding new faces to the team and six positions have now been filled, although a director is yet to be appointed. Atkin reckons this will happen before the end of this month. Watch this space…