
EDM Council Continues Campaign for Regulatory Recognition of Data’s Importance

The EDM Council has been a veritable hive of activity this year, campaigning to get the regulatory community engaged in the issues surrounding data management. To this end, Mike Atkin, managing director of the industry body, has made countless visits to US, UK and European legislators to ensure that data standardisation is part of the global regulatory overhaul agenda.

The focus is on making the “data imperative” part of the movement toward regulatory restructuring and market oversight, according to Atkin. These endeavours have ranged from concrete legislative agreements to simply raising the profile of data management in the market at large.

One example of a tangible project win has been the work with the fledgling National Institute of Finance (NIF) in the US to establish a new Federal agency for standardisation. The EDM Council has joined forces with a coalition of “interested parties”, known catchily as the Committee to Establish the NIF, to create and pass US legislation designed to implement a data collection and analytic processing centre within the Federal government.

The agency would be charged with the task of implementing standards (semantic tags, identifiers and classification schemes); building and maintaining reference and legal entity databases; constructing a system-wide transaction and position data warehouse; and providing analytical capabilities to help the agencies that will become part of the Financial Services Oversight Council monitor risk and oversee systemic vulnerabilities.

The partners have been successful in getting the idea to the table, and legislation defining the creation of the “Institute” has been drafted and is now being evaluated by staff supporting the Senate Banking Committee, says Atkin. “The standards objectives (as you would expect) are based on our semantics and entity identification activities. The initiative is gaining traction. The pace is frenetic. No predictions yet, but we are moving closer to having the principles of EDM specified as part of global regulatory reform,” he adds.

The EDM Council has also formally agreed to work with another party, the Software Engineering Institute (SEI) of Carnegie Mellon University, with a view to raising the profile of data. The goal of this strategic partnership is to create an auditable mechanism for modelling data management maturity based on the established Capability Maturity Model Integration (CMMI) methodology. According to Atkin, this model will help financial institutions become more proficient in their management of data and will provide a consistent and comparable benchmark for regulatory authorities in their efforts to mitigate operational risk.

“The model itself will be based on documented best practices and can be used as an operational route map for evaluating the efficiency of data management practices and as a benchmark for evaluating the status of operational integration,” explains Atkin. The EDM Council has already drafted a core document proposing the characteristics of the levels of data maturity and an outline of the primary business components involved in data management, which will soon be available on its website for review.

Of course, as noted in the last issue of Reference Data Review, the EDM Council is also continuing its work with the European Central Bank (ECB) to establish a global reference data utility. This work is aligned with the NIF initiative, says Atkin, and has already resulted in positive discussions with the European Commission, the International Monetary Fund (IMF) and the Bank for International Settlements (BIS).

“This is fundamentally about standards. Once the standards are in place, market authorities like the ECB are in position to implement national law to compel issuers to mark up their issuance documents when they are created (using standard semantics) – and require them to be submitted to a central repository. That way they (and industry participants) have access to consistent and unambiguous data from the source. The ultimate goal is elimination of the multiple levels of data transformation that characterise the way our industry manufactures reference data,” says Atkin.
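
To make the data flow Atkin describes a little more concrete, a minimal sketch follows; the record layout, the semantic tag names and the repository.example.org endpoint are hypothetical illustrations, not the EDM Council’s or the ECB’s actual schema or interface.

# Illustrative sketch only: the tag names, record layout and repository endpoint
# below are hypothetical stand-ins for the idea of tagging issuance data at
# source and submitting it to a central repository.
import json
import urllib.request

# The issuer marks up the key terms of a new issue with standard semantic tags
# at the point of creation, so downstream users read the same unambiguous data.
issuance_record = {
    "vocabulary_version": "0.1",                 # hypothetical shared-vocabulary version
    "issuer_id": "LEI-HYPOTHETICAL-0001",        # standard legal entity identifier
    "instrument_class": "debt.bond.fixed_rate",  # term from a classification scheme
    "terms": {
        "currency": "EUR",
        "coupon_rate": 0.045,
        "maturity_date": "2019-06-30",
    },
}

def submit_to_repository(record, url="https://repository.example.org/issuance"):
    """POST the tagged record to a (fictional) central repository so regulators
    and market participants can consume consistent data from the source."""
    request = urllib.request.Request(
        url,
        data=json.dumps(record).encode("utf-8"),
        headers={"Content-Type": "application/json"},
        method="POST",
    )
    with urllib.request.urlopen(request) as response:
        return response.status

# submit_to_repository(issuance_record)  # not called here: the endpoint is fictional

The point of the sketch is simply that tagging happens once, at issuance, rather than being re-derived by every downstream consumer along multiple layers of transformation.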

The EDM Council’s own semantics repository has similar goals and, over recent months, has seen the finalisation of its static data component. Atkin is keen for industry participants to start putting it through its paces by testing it against their internal data environments. The current version covers: common instrument terms, equities, debt, rights, traded options and futures, collective investment vehicles, indices and indicators, OTC derivatives, component terms, dated terms, issuance process terms and global terms.

“We are about to engage in two use cases to map our Semantics Repository to internal data warehouses and test its applicability as a common business vocabulary reference key for internal metadata reconciliation,” adds Atkin. “We have been in active discussions with Swift and have agreed to map our repository to the ISO 20022 data dictionary and to explore its use as both a semantics layer for their dictionary and as an interface to financial messages.”
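
As a rough illustration of how a shared vocabulary can serve as a reference key for internal metadata reconciliation, the sketch below maps two sets of invented warehouse column names onto invented vocabulary terms; none of the term IDs or columns come from the Council’s repository or the ISO 20022 dictionary.

# Illustrative sketch only: term IDs and column names are invented; the idea is
# that columns from different systems reconcile because they resolve to the
# same shared vocabulary term.
VOCABULARY = {
    "term:MaturityDate": {"label": "maturity date", "datatype": "date"},
    "term:CouponRate": {"label": "coupon rate", "datatype": "decimal"},
}

# Each internal warehouse maps its own column names to the shared terms.
WAREHOUSE_A = {"mat_dt": "term:MaturityDate", "cpn_pct": "term:CouponRate"}
WAREHOUSE_B = {"maturity": "term:MaturityDate", "coupon": "term:CouponRate"}

def reconcile(mapping_a, mapping_b):
    """Pair columns from two systems that resolve to the same vocabulary term."""
    by_term = {term: column for column, term in mapping_b.items()}
    return {
        column_a: by_term[term]
        for column_a, term in mapping_a.items()
        if term in by_term and term in VOCABULARY
    }

print(reconcile(WAREHOUSE_A, WAREHOUSE_B))
# -> {'mat_dt': 'maturity', 'cpn_pct': 'coupon'}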

The industry group has also gained further traction in the market with the addition of six new member firms: CPP Investment Board, Fannie Mae, the Federal Reserve Bank of New York, InvestTech Systems, Northern Trust and Wipro.
