
A-Team Insight Blogs

EDM Council Plans January Release of Version 1.0 of its Data Management Capability Model


The Enterprise Data Management (EDM) Council will release version 1.0 of its Data Management Capability Model (DCAM) on January 15, 2015. The model defines the scope of capabilities needed to establish, enable and sustain a mature data management discipline, including strategy, organisational structure, technology architecture and operational best practices.

The council’s DCAM model builds on the Data Management Maturity Model that was released to EDM Council members as a draft in April 2013. This model was initially developed in conjunction with the Software Engineering Institute, a US Department of Defense-sponsored research centre at Carnegie Mellon University, and then with the Capability Maturity Model Integration (CMMI) Institute. Since the release of the Data Management Maturity Model, the EDM Council and the CMMI Institute have gone their separate ways, with the CMMI Institute continuing to administer the Data Management Maturity Model and the EDM Council developing DCAM.

In the wake of this change, the EDM Council is mapping the April 2013 draft of the Data Management Maturity Model to DCAM to help members that have been using the draft as a benchmark for their data management activities migrate to the new model. It says migration should not be difficult as the models use consistent concepts; only the orientation of DCAM has changed, reflecting industry interest in the capabilities required for data management, which were not covered in the previous model. To support migration, one of the EDM Council work streams developing DCAM is dedicated to bridging any gaps between the models.

While DCAM joins other data management models on the market, as well as those developed in-house by finance firms and by consultants, the council says it will not cause confusion, but will instead help firms deal with the realities of data management.

Essentially, the model comprises eight components covering the spectrum of requirements for effective data management. The headline components are: data management strategy, business case and funding model, data management programme, governance management, data architecture, data quality, technology architecture and data operations. Each component includes a short narrative describing its meaning, along with specific capabilities and objectives. Scoring of outcomes is based on three core concepts: degree of engagement, including who is involved and at what level; formalisation of practices, such as documentation and alignment with budget cycles; and level of activity, essentially evidence-based output.
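To make that structure concrete, the sketch below represents the eight headline components and the three scoring concepts as simple data types. This is an illustrative assumption only: the class names, field names, scoring scale and example values are hypothetical and are not taken from the DCAM specification itself.

```python
# Illustrative sketch only: a minimal representation of the eight DCAM
# components and the three scoring concepts described above. All names and
# values here are hypothetical, not part of the EDM Council's model.
from dataclasses import dataclass, field

DCAM_COMPONENTS = [
    "Data management strategy",
    "Business case and funding model",
    "Data management programme",
    "Governance management",
    "Data architecture",
    "Data quality",
    "Technology architecture",
    "Data operations",
]

@dataclass
class CapabilityScore:
    """Scores an objective against the three core concepts (scale assumed 1-5)."""
    engagement: int      # degree of engagement: who is involved and at what level
    formalisation: int   # formalisation of practices, e.g. documentation, budget alignment
    activity: int        # level of activity: evidence-based output

@dataclass
class Component:
    name: str
    narrative: str = ""                                           # short description of meaning
    objectives: dict[str, CapabilityScore] = field(default_factory=dict)

# Hypothetical usage: score one objective within the first component.
strategy = Component(
    name=DCAM_COMPONENTS[0],
    narrative="Defines the data management vision and goals.",
)
strategy.objectives["Strategy is documented and approved"] = CapabilityScore(
    engagement=3, formalisation=2, activity=2
)
```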

Development of the model is being led by John Bottega, senior advisor to the EDM Council and a former chief data officer at the Federal Reserve. He leads regular working sessions with council members and explains: “We have reoriented our work to identify the capabilities that are needed to show governance, control and provenance of data. DCAM is not prescriptive, but based on principles, and it is not just a book of best practices, but a working model that includes objectives and measures of their achievement. It also challenges data management pain points such as the complexity of legacy systems, inconsistent definitions and duality.”

The council has pitted DCAM against a number of regulations, including BCBS 239. The model met all the data management requirements of BCBS 239 and could also be a viable framework for firms to fulfil the requirement to assess the capability of their data management processes.

Considering the adoption of the DCAM model, Mike Atkin, managing director of the EDM Council, says: “Since we made a draft of DCAM available in September 2014, it has been downloaded 800 times by our membership of 140 companies.” Looking forward to 2015 and the release of DCAM version 1.0, Bottega notes that the council is talking to sectors other than financial services about using the model and is continuing to work with members to deliver version 1.1 in the second quarter of the year.

