Financial institutions are absorbing ever-greater volumes of data and applying it to a surging number of use cases, putting strain on their data management processes.
Taking the friction out of those workflows can improve performance substantially. But the absence of a unified international set of standards to ensure that all data used by organisations is clean, accurate and usable by disparate systems is militating against that and threatening to create “data chaos”.

In A-Team Group’s next Data Management Insight webinar, a panel of industry experts will look at how those hurdles can be overcome while highlighting the importance of getting such projects right. The webinar, entitled “Streamlining trading and investment processes with data standards and identifiers”, will be held on 3 June.
“Data chaos isn’t just a nuisance—it’s a risk,” Adriana Ennab, Executive in Residence at Global Digital Finance, a non-profit that promotes the creation of standards in digital assets markets, told Data Management Insight.
“In today’s capital markets, data flows faster and further than ever before. But without strong standards and identifiers, much of that data ends up fragmented, duplicated, or even unusable. Data without structure is noise—and in regulated markets, noise can be costly.”
More Data, More Challenges
The growing importance of data to financial institutions can be gauged by the amount of money they have spent on it in recent decades. Annual spend on market data alone grew 6.4 per cent year-on-year in 2024 to US$44.3 billion, according to subscription-spend tracker TRG Screen. Spending on data infrastructure climbed to almost US$80 billion in the same year, up from US$12 billion in 2018.
The enormous increase in the amount of data ingested into institutions’ systems has come at a cost beyond that of buying it. It has amplified the challenges of data management in companies that still oversee fragmented architectures; it has required greater vigilance in security and governance protocols; and it has clogged data pipelines, slowing data-led processes and decision-making.
Standards and identifiers help reduce these pain points by ensuring that the data is more manageable and exchangeable, taking away the inconsistencies that slow data-dependent processes. Common standards bring uniformity to data structure and identifiers harmonise the way users and systems recognise assets and entities and the data associated with them.
Combined, standards and identifiers also make it easier for analytics packages, including those built on artificial intelligence, to interrogate data and derive more accurate outputs and forecasts.
“The amount of data that financial services firms are engaging with throughout their processes is growing exponentially, therefore the need for data standards and identifiers is growing alongside this,” Laura Stanley, Director of Entity Data and Symbology at LSEG, told Data Management Insight.
Benefits Abound
Stephan Dreyer, Managing Director at ANNA, the membership organisation of national numbering agencies, said the benefits of standardisation are numerous.
“Consistent data and identification standards reduce friction, cut costs and enhance transparency across financial markets,” Dreyer told Data Management Insight. “They lay the groundwork for greater efficiency, better risk management and more confident decision-making.”
Data consistency brings order to processes, and in capital markets, where billions of dollars of trade in assets are at stake every day, getting that data right is crucial. A handful of standards already exist. Trading venues and regulators, for instance, have their own sets of data reporting standards, as do government agencies and companies’ internal data governance policies. However, few are applied globally, meaning that institutions must knit together a patchwork quilt of standards if they wish to operate internationally.
The same can be said of identifiers, which are crucial for matching data to companies, people and assets. The Global Legal Entity Identifier Foundation (GLEIF) has created the closest thing to a worldwide identification system, mostly for listed companies. Dun & Bradstreet’s DUNS Numbers are similarly used for private companies. However, while GLEIF’s LEIs are issued at no cost to the entity identified, DUNS Numbers must be bought.
To achieve interoperability between the two requires an added layer of data processes, which costs money to implement and maintain.
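At its simplest, such an interoperability layer is a cross-reference between identifier schemes that resolves different external identifiers to a single internal entity record. A minimal sketch in Python, where every identifier value and entity key is invented purely for illustration:

```python
# Illustrative cross-reference between identifier schemes.
# All identifier values and entity keys below are fabricated examples.

CROSS_REF = {
    # (scheme, identifier) -> firm's canonical internal entity key
    ("LEI", "5493001KJTIIGC8Y1R12"): "ENTITY-001",
    ("DUNS", "150483782"): "ENTITY-001",
    ("LEI", "529900T8BM49AURSDO55"): "ENTITY-002",
}

def resolve(scheme: str, identifier: str):
    """Map an external identifier to the canonical entity key, or None."""
    return CROSS_REF.get((scheme.upper(), identifier))

# Two different schemes resolve to the same underlying entity:
assert resolve("LEI", "5493001KJTIIGC8Y1R12") == resolve("DUNS", "150483782")
```

In practice this table must be sourced, validated and kept current as identifiers are issued, lapsed or transferred, which is where the ongoing maintenance cost arises.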
Expanding Need
ANNA’s Dreyer said that the increasing use of artificial intelligence and other new data processes means the need for standardised data and identifiers will become greater over time.
“In the coming years, continued advancements in digital assets, AI and global regulatory alignment will drive the evolution of data standards and identifiers, offering significant benefits in efficiency, transparency, and security,” he said. “Collaboration across industry players will be crucial to creating a cohesive, future-proof data ecosystem for the capital markets.”
A-Team Group’s “Streamlining trading and investment processes with data standards and identifiers” will bring together Stanley, Ennab and Linda Powell, Deputy Chief Data Officer at BNY, for a panel discussion moderated by Dreyer. The webinar will be held at 10:00am ET, 3:00pm London, 4:00pm CET. Click here to register.