Data management has progressed beyond its infancy and is now well into its awkward teenage years: the industry is standing at the point of delivery, moving from planning to execution, according to Mike Atkin, managing director of data management industry group the EDM Council. As part of its recent industry benchmarking exercise, which gathered feedback from 61 financial institutions, the group found that although 80% of respondents had an executive sponsor in place for their data management projects, a silo mentality remains prevalent within the market and data scrubbing continues to be commonplace.
“The majority have centralised the governance of their data management and have executive sponsors in place, most of which are named executives at the C level. However, there are still a lot of challenges to be overcome including a silo mentality within these firms and a lack of cooperation on an enterprise-wide basis,” explains Atkin. “We have not yet managed to achieve data management processes without reconciliation and there is still a lot of scrubbing going on due to the fragmented data supply chain.”
Atkin reckons the industry has gone a long way towards moving from decentralised data processes to centralisation over the last three years. He also notes that the buy side and the universal banks are considerably further ahead of the sell side in this respect, a fact also highlighted by the recent benchmarking survey.
“Standards are in vogue but not yet a reality in terms of implementation,” he continues. “There is an understanding that standards must be at the foundation but the infrastructure is not yet in place to fully realise these new standards. This will continue to be a big problem for the industry as it comes under fire from the regulators and financial institutions themselves.”
Atkin is confident, however, that the “perfect storm” of regulatory change focused on tracking systemic risk and an internal financial institution focus on identifying risk and opportunity in the market will combine to drive the industry agenda forward. “The top line is that there is increased focus on data issues in order to be able to perform systemic risk analysis, meet reporting requirements and respond to business requirements,” he contends.
Another contributing factor to this focus has been the shift from data management being owned by IT to it residing in operations. Atkin reckons there is now general understanding that moving from planning to execution of a new data management strategy will require a partnership between IT, operations and the business.
However, given that the respondents to the benchmarking survey were handpicked by the EDM Council, element 22 and Headstrong because of their prior engagement in data management projects, the results must be viewed in that context. Atkin also notes: “Sometimes the data did not match the actual circumstances that we knew to be the case within the firm due to different interpretation of certain questions, so we had to eliminate some of these responses.”
Although the results might not be fully representative of firms that have yet to embark down the data management road in earnest, statistics of this nature are valuable to some degree in providing a general guide to how far others have come. Given that industry participants also provided feedback on which questions should be asked, the survey likewise highlights where the industry is keen to hear more.
Integration, in particular, seems to be a key area of concern, notes Atkin. “The main challenges remaining are the problems of a silo mentality, a lack of standard data definitions, and the need for centralisation processes beyond vanilla instruments and counterparty and entity data. Policies and procedures around EDM have often not been formalised and this poses a big problem,” he adds.
The EDM Council itself has been keen to get involved in tackling some of these standardisation challenges, most notably with its semantics repository and the Data Management Maturity Model (DMM), which is currently seeking funding. In fact, Atkin indicates that the survey was aligned with the DMM in order to provide “finger on the pulse” feedback into the model so that it stays consistent with market best practices.
“The DMM is still engaged in fundraising at the moment and progress will be incremental until we get the resources in place to move forward with full scale implementation. We are talking to regulators, academics, systems integrators and vendors about the DMM and I am feeling good about where we have managed to get to so far,” elaborates Atkin. “There are a lot of things to pull together as the DMM is just an outline for definitions of the underlying (albeit detailed) components of data management at the moment and there are no formal best practices defined yet.”
However, he indicates that members are using the DMM as a guideline for evaluating their internal EDM strategies. “They are also taking advantage of the semantics repository to help them manage their standards alignment processes. For example, it helps them to reconcile the way in which different parts of their business look at the components of various instruments and ensure the consistency of these definitions,” he adds.
A number of members are therefore using the repository as a basis on which to structure the upgrading of their own metadata repositories, the logic being that they can more easily deploy common data models within their institutions, or consolidate their existing data resources, by using the attributes contained within the semantics repository itself.
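The alignment exercise described above can be sketched in miniature: each business unit maps its local field names onto canonical attributes drawn from a shared repository, and any local field whose target definition is missing from the repository is flagged for review. This is only an illustrative sketch; all names and definitions below are hypothetical and do not reflect the EDM Council's actual repository schema.

```python
# Hypothetical sketch of reconciling local metadata against a shared
# semantics repository. Attribute names and definitions are illustrative.

# Canonical attributes and their agreed definitions (the "repository").
canonical = {
    "maturity_date": "Date on which the instrument's principal is repaid",
    "coupon_rate": "Annual interest rate paid by the instrument",
}

# Two business units describe the same attributes under different local names,
# each mapping local field -> canonical attribute.
desk_a = {"mat_dt": "maturity_date", "cpn": "coupon_rate"}
desk_b = {"maturity": "maturity_date", "coupon": "coupon_rate"}
# A third mapping points at an attribute the repository does not define.
desk_c = {"mat": "redemption_date"}

def reconcile(local_map, repository):
    """Return local fields whose canonical target is missing from the repository."""
    return [local for local, target in local_map.items()
            if target not in repository]

print(reconcile(desk_a, canonical))  # no gaps
print(reconcile(desk_c, canonical))  # 'mat' needs review
```

Here desks A and B reconcile cleanly because their local names resolve to the same canonical definitions, while desk C surfaces a field that must either be added to the repository or remapped; this is the consistency check that, per Atkin, the semantics repository is meant to support at enterprise scale.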