
EDM Council Continues Campaign for Regulatory Recognition of Data’s Importance


The EDM Council has been a veritable hive of activity this year with its active campaign to get the regulatory community engaged in the issues surrounding data management. To this end, Mike Atkin, managing director of the industry body, has been on countless visits to US, UK and European legislators to ensure that data standardisation is part of the global regulatory overhaul agenda.

The focus is on making the “data imperative” part of the movement toward regulatory restructuring and market oversight, according to Atkin. These endeavours have ranged from actual legislative agreements to simply raising the profile of data management in the market at large.

One example of a tangible project win has been the work with the fledgling National Institute of Finance (NIF) in the US to establish a new Federal agency for standardisation. The EDM Council has joined forces with a coalition of “interested parties”, known catchily as the Committee to Establish the NIF, to create and pass US legislation designed to implement a data collection and analytic processing centre within the Federal government.

The agency would be charged with implementing standards (semantic tags, identifiers and classification schemes); building and maintaining reference and legal entity databases; constructing a system-wide transaction and position data warehouse; and providing analytical capabilities to help agencies that will become part of the Financial Services Oversight Council monitor risk and oversee systemic vulnerabilities.
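To make the standards component more concrete, the sketch below illustrates in Python one hypothetical shape such standardised records might take. The record types, field names and the consistency check are illustrative assumptions, not anything specified in the draft legislation.

```python
# Hypothetical sketch only: the kind of standardised reference and position
# records a data collection agency might maintain. All names are assumptions.
from dataclasses import dataclass, field


@dataclass
class LegalEntityRecord:
    """A reference record keyed by a standard entity identifier."""
    entity_id: str                       # standard identifier (scheme unspecified)
    legal_name: str
    jurisdiction: str
    classification: str                  # code from a standard classification scheme
    semantic_tags: dict = field(default_factory=dict)  # standard semantic tags


@dataclass
class PositionRecord:
    """A system-wide position entry linking back to entity reference data."""
    entity_id: str
    instrument_id: str
    position: float
    as_of: str                           # ISO 8601 date


def unresolved_positions(positions, entities):
    """A regulator-side check: flag positions that do not resolve to a known entity."""
    known = {e.entity_id for e in entities}
    return [p for p in positions if p.entity_id not in known]
```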

The partners have been successful in getting the idea to the table and legislation defining the creation of the “Institute” has been drafted and is now being evaluated by staff supporting the Senate Banking Committee, says Atkin. “The standards objectives (as you would expect) are based on our semantics and entity identification activities. The initiative is gaining traction. The pace is frenetic. No predictions yet, but we are moving closer to having the principles of EDM specified as part of global regulatory reform,” he adds.

The EDM Council has also formally agreed to work with another party, the Software Engineering Institute (SEI) of Carnegie Mellon University, with a view to raising the profile of data. The goal of this strategic partnership is to create an auditable mechanism for modelling data management maturity based on the established Capability Maturity Model Integration (CMMI) methodology. According to Atkin, this model will help financial institutions become more proficient in their management of data and will provide a consistent and comparable benchmark for regulatory authorities in their efforts to mitigate operational risk.

“The model itself will be based on documented best practices and can be used as an operational route map for evaluating the efficiency of data management practices and as a benchmark for evaluating the status of operational integration,” explains Atkin. The EDM Council has already drafted a core document proposing the characteristics of the levels of data maturity and an outline of the primary business components involved in data management, which will soon be available on its website for review.
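As a rough illustration of how such a benchmark might be applied, the sketch below scores data management components against CMMI-style maturity levels. The level names follow the generic CMMI scale and the components are invented for the example; the levels and components in the EDM Council's draft may well differ.

```python
# Hypothetical sketch: benchmarking data management components against
# CMMI-style maturity levels. Level names follow the generic CMMI scale;
# the EDM Council's draft model may define different levels and components.
from enum import IntEnum


class MaturityLevel(IntEnum):
    INITIAL = 1
    MANAGED = 2
    DEFINED = 3
    QUANTITATIVELY_MANAGED = 4
    OPTIMIZING = 5


# Illustrative business components only; not the Council's actual outline.
assessment = {
    "data governance": MaturityLevel.MANAGED,
    "reference data operations": MaturityLevel.DEFINED,
    "data quality measurement": MaturityLevel.INITIAL,
}


def overall_maturity(scores):
    """A simple benchmark: the weakest component sets the overall level."""
    return min(scores.values())


print(overall_maturity(assessment).name)  # -> INITIAL
```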

Of course, as noted in the last issue of Reference Data Review, the EDM Council is also continuing its work with the European Central Bank (ECB) to establish a global reference data utility. This work is aligned with the NIF initiative, says Atkin, and has already resulted in positive discussions with the European Commission, the International Monetary Fund (IMF) and the Bank for International Settlements (BIS).

“This is fundamentally about standards. Once the standards are in place, market authorities like the ECB are in position to implement national law to compel issuers to mark up their issuance documents when they are created (using standard semantics) – and require them to be submitted to a central repository. That way they (and industry participants) have access to consistent and unambiguous data from the source. The ultimate goal is elimination of the multiple levels of data transformation that characterise the way our industry manufactures reference data,” says Atkin.
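A minimal sketch of what marking up issuance data at source could look like follows. The tag names, the document shape and the submission function are purely illustrative assumptions, not the ECB's or the Council's actual scheme.

```python
# Hypothetical sketch: an issuer tags the key terms of an issuance document
# with standard semantics before submitting it to a central repository.
# Tag names and the submit function are illustrative assumptions.
import json

issuance_document = {
    "issuer_id": "ISSUER-0001",           # standard entity identifier (assumed)
    "instrument_type": "debt",            # standard classification term (assumed)
    "terms": {
        "coupon_rate": 0.0425,
        "maturity_date": "2019-06-30",
        "currency": "EUR",
    },
}


def submit_to_repository(document, endpoint="central-repository"):
    """Stand-in for submission: the document is tagged once, at source, so
    downstream consumers read the same unambiguous data."""
    payload = json.dumps(document, sort_keys=True)
    print(f"submitting {len(payload)} bytes to {endpoint}")
    return payload


submit_to_repository(issuance_document)
```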

The EDM Council’s own semantics repository has similar goals and, over recent months, its static data component has been finalised. Atkin is keen for industry participants to start putting it through its paces and testing it against their internal data environments. The current version covers common instrument terms, equities, debt, rights, traded options and futures, collective investment vehicles, indices and indicators, OTC derivatives, component terms, dated terms, issuance process terms and global terms.

“We are about to engage in two use cases to map our Semantics Repository to internal data warehouses and test its applicability as a common business vocabulary reference key for internal metadata reconciliation,” adds Atkin. “We have been in active discussions with Swift and have agreed to map our repository to the ISO 20022 data dictionary and to explore its use as both a semantics layer for their dictionary and as an interface to financial messages.”
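The sketch below illustrates, under assumed names, what using the repository as a common business vocabulary reference key for internal metadata reconciliation might involve: internal column names from two imaginary warehouses are mapped to shared repository terms so that gaps and overlaps can be surfaced. The terms, mappings and reconciliation logic are assumptions for illustration only.

```python
# Hypothetical sketch: reconciling internal metadata against a common business
# vocabulary. Repository terms and internal column names are assumptions.
repository_terms = {"coupon_rate", "maturity_date", "issuer_name", "currency"}

# Internal metadata from two imaginary warehouses, mapped to repository terms.
warehouse_a = {"cpn_rt": "coupon_rate", "mat_dt": "maturity_date", "issr": "issuer_name"}
warehouse_b = {"coupon": "coupon_rate", "maturity": "maturity_date", "ccy": "currency"}


def reconcile(*mappings):
    """Report which repository terms the warehouses share and where each falls short."""
    coverage = [set(m.values()) & repository_terms for m in mappings]
    common = set.intersection(*coverage)
    gaps = [repository_terms - c for c in coverage]
    return common, gaps


common, gaps = reconcile(warehouse_a, warehouse_b)
print("shared vocabulary:", sorted(common))
print("per-warehouse gaps:", [sorted(g) for g in gaps])
```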

The industry group has also gained further traction in the market with the addition of six new member firms: CPP Investment Board, Fannie Mae, the Federal Reserve Bank of New York, InvestTech Systems, Northern Trust and Wipro.

