

EDM Council Plans January Release of Version 1.0 of its Data Management Capability Model


The Enterprise Data Management (EDM) Council will release version 1.0 of its Data Management Capability Model (DCAM) on January 15, 2015. The model defines the scope of capabilities needed to establish, enable and sustain a mature data management discipline, including strategy, organisational structure, technology architecture and operational best practices.

The council’s DCAM builds on the Data Management Maturity Model, which was released to EDM Council members as a draft in April 2013. That model was initially developed in conjunction with the Software Engineering Institute, a research centre funded by the US Department of Defense, and later with the Capability Maturity Model Integration (CMMI) Institute. Since the release of the Data Management Maturity Model, the EDM Council and the CMMI Institute have gone their separate ways, with the CMMI Institute continuing to administer the Data Management Maturity Model and the EDM Council developing DCAM.

In the wake of this change, the EDM Council is mapping the April 2013 draft of the Data Management Maturity Model to DCAM to help members that have been using the draft as the benchmark for their data management activities migrate to the new model. It says migration should not be difficult as the models use consistent concepts; only the orientation of DCAM has changed, to reflect industry interest in the capabilities required for data management, which were not covered in the previous model. To ensure support for migration, one of the EDM Council work streams developing DCAM is dedicated to bridging any gaps between the models.

While DCAM joins other data management models in the market, including those developed in-house by finance firms and by consultants, the council says it will not add to the confusion, but will instead help firms deal with the realities of data management.

Essentially, the model comprises eight components covering the spectrum of requirements for effective data management. The headline components are: data management strategy, business case and funding model, data management programme, governance management, data architecture, data quality, technology architecture and data operations. Each component includes a short narrative describing its meaning, along with specific capabilities and objectives. Scoring of outcomes is based on three core concepts: degree of engagement, including who is involved and at what level; formalisation of practices, such as documentation and alignment with budget cycles; and level of activity, essentially evidence-based output.
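To make that structure concrete, the short sketch below models a component holding capabilities that are scored on the three concepts described above. It is purely illustrative: the class and field names, the numeric scales and the simple averaging are assumptions made for this example, not part of the EDM Council’s published model.

```python
from dataclasses import dataclass, field
from statistics import mean


@dataclass
class Capability:
    """One capability within a DCAM-style component (illustrative only)."""
    objective: str
    engagement: int      # degree of engagement: who is involved and at what level
    formalisation: int   # formalisation of practices, e.g. documentation, budget alignment
    activity: int        # level of activity: evidence-based output

    def score(self) -> float:
        # Simple average across the three scoring concepts -- an assumption,
        # not the council's published scoring method.
        return mean([self.engagement, self.formalisation, self.activity])


@dataclass
class Component:
    """One of the eight headline components, e.g. data quality."""
    name: str
    narrative: str
    capabilities: list[Capability] = field(default_factory=list)

    def score(self) -> float:
        return mean(c.score() for c in self.capabilities) if self.capabilities else 0.0


# Example usage with one of the eight components named in the article.
dq = Component(
    name="Data quality",
    narrative="Ensures data is fit for purpose",
    capabilities=[
        Capability("Profile critical data elements", engagement=3, formalisation=2, activity=2),
    ],
)
print(dq.name, round(dq.score(), 2))  # -> Data quality 2.33
```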

Development of the model is being led by John Bottega, senior advisor to the EDM Council and a former chief data officer at the Federal Reserve. He leads regular working sessions with council members and explains: “We have reoriented our work to identify the capabilities that are needed to show governance, control and provenance of data. DCAM is not prescriptive, but based on principles, and it is not just a book of best practices, but a working model that includes objectives and measures of their achievement. It also challenges data management pain points such as the complexity of legacy systems, inconsistent definitions and duality.”

The council has tested DCAM against a number of regulations, including BCBS 239, and found that the model met all the data management requirements of BCBS 239. It could also provide a viable framework for firms to fulfil the requirement to assess the capability of their data management processes.

Considering the adoption of the DCAM model, Mike Atkin, managing director of the EDM Council, says: “Since we made a draft of DCAM available in September 2014, it has been downloaded 800 times by our membership of 140 companies.” Looking forward to 2015 and the release of DCAM version 1.0, Bottega notes that the council is talking to sectors other than financial services about using the model and is continuing to work with members to deliver version 1.1 in the second quarter of the year.

