A-Team Insight Blogs

ECB’s Francis Gross Makes the Case for Data Standardisation and Shared Data Infrastructure


Technology has added complexity to capital markets, shifted the human-machine interface, and will soon deliver automation beyond human understanding. It could also go wrong, causing an unplanned and unwanted crisis on a scale far larger and more damaging than that of 2008. To counter this criticality and ‘the data mess that is getting worse all the time’, Francis Gross, senior adviser to the Directorate General Statistics at the European Central Bank, set out a vision of finance as a global network of standardised contracts among a global population of agents at last week’s TSAM conference.

Noting that the industry has opened a Pandora’s box that cannot be closed, and must therefore address its data management challenges and invest in data, Gross outlined steps that could be taken to achieve his vision. He talked about the need to move from operational to analytical systems, to measure the speed of global systems in real time, and to develop flexible analytics that can address sudden surprises in global markets. He also underlined the importance of extremely granular data and of standardising data globally, starting with identifiers such as the Legal Entity Identifier (LEI). Additionally, every contract should be represented in a universal language.

Taking a step back, Gross described data today as an obstacle to technology, and technology as a catalyst of increasing social complexity, as it connects ever more diverse people across countries. He noted that the ‘data mess’ is getting worse all the time and that technology is adding to complexity rather than complementing financial markets.

He commented: “The problem is deep, global, growing fast and potentially critical. The implementation of standards has become urgent, but harder and slower to reach on a global basis.”

While Gross called for global standards as a means to solve problems caused by data and technology, he questioned who would make the standards. He cited the LEI as a working standard made at a global level, but said the issuance of 1.3 million LEIs is not good enough. He commented: “We need public leadership to make global standards and an infrastructure that holds data that is used by all.”
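As an aside on what a working global identifier standard looks like in practice: the LEI defined by ISO 17442 is a 20-character alphanumeric code whose last two characters are check digits computed with ISO 7064 MOD 97-10, the same scheme IBANs use. A minimal Python sketch of generation and validation (the example prefix below is invented, not a real issued LEI):

```python
def _to_digits(s: str) -> int:
    # Map each character to its numeric value: '0'-'9' -> 0-9, 'A'-'Z' -> 10-35,
    # then read the concatenated digits as one large integer
    return int("".join(str(int(c, 36)) for c in s))

def lei_check_digits(prefix18: str) -> str:
    # ISO 7064 MOD 97-10: append "00", take mod 97, subtract from 98
    return f"{98 - _to_digits(prefix18 + '00') % 97:02d}"

def is_valid_lei(lei: str) -> bool:
    # A valid LEI is 20 uppercase alphanumeric characters whose
    # numeric value is congruent to 1 modulo 97
    return len(lei) == 20 and lei.isalnum() and lei == lei.upper() \
        and _to_digits(lei) % 97 == 1

# Invented 18-character prefix for illustration only
prefix = "549300EXAMPLE00018"
lei = prefix + lei_check_digits(prefix)
print(is_valid_lei(lei))  # True
```

Because any single-character corruption changes the code’s value by an amount not divisible by 97, the check digits catch common transcription errors.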

Fleshing out his vision of a network of global contracts, Gross said standards will need to be agreed by law and that law should mandate standardised digital contract representation. Distributed ledgers would also be needed to represent populations of diverse contracts in a single language.
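Gross does not specify what a ‘single language’ for contracts would look like, but the idea of standardised digital contract representation can be illustrated with a minimal schema sketch. All field names here are hypothetical, chosen only to show the principle that every contract carries the same machine-readable structure:

```python
import json
from dataclasses import dataclass, asdict

@dataclass(frozen=True)
class StandardContract:
    """Hypothetical universal representation of a financial contract."""
    contract_id: str    # globally unique contract identifier
    party_a_lei: str    # LEI of the first counterparty
    party_b_lei: str    # LEI of the second counterparty
    contract_type: str  # e.g. "interest_rate_swap"
    terms: dict         # standardised, machine-readable economic terms

    def to_record(self) -> str:
        # One canonical serialisation, so every system reads the same bytes
        return json.dumps(asdict(self), sort_keys=True)
```

The point of such a schema is not the specific fields but the uniformity: once every contract is expressed this way, any system, whether a bank’s or a regulator’s, can parse every contract without bilateral mapping work.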

With global standardisation in place, Gross described the possibilities: industry participants and regulators working with a shared data infrastructure; banks having a single identifier for every object, leading to safer operations; and machine-executed reporting. He concluded: “Building data infrastructure underpinned by law has to be a public mission. It will create more freedom for markets, lower the costs and risks of our industry, and automate reporting.”
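To see why standardisation would make reporting machine-executable: if every contract carried the same machine-readable fields, a regulatory report would reduce to a mechanical query over shared records rather than a bespoke IT project at each firm. A toy sketch, with all field names and values invented:

```python
# Toy contract records in a shared, standardised schema (all values invented)
contracts = [
    {"type": "swap", "counterparty_lei": "LEI_A", "notional_eur": 1_000_000},
    {"type": "swap", "counterparty_lei": "LEI_B", "notional_eur": 2_500_000},
    {"type": "repo", "counterparty_lei": "LEI_A", "notional_eur": 750_000},
]

def total_exposure(records, contract_type):
    # With one schema, a regulatory aggregate is a one-line query
    return sum(r["notional_eur"] for r in records if r["type"] == contract_type)

print(total_exposure(contracts, "swap"))  # 3500000
```

The aggregation itself is trivial; the hard part, which is Gross’s point, is getting every participant onto the same identifiers and schema in the first place.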

One Reply to “ECB’s Francis Gross Makes the Case for Data Standardisation and Shared Data Infrastructure”

  1. Check out DataChemist and their use case in this space as they have confronted the lack of consistency of data standards and would have preferred to be starting from a more level playing field of data.

