About a-team Marketing Services
The knowledge platform for the financial technology industry

A-Team Insight Blogs

Andrew’s Blog – The Problem with Standards?


Every few years I feel the compulsion to write a rant – known these days as a blog – about the problem with standards, and in particular standards relating to financial information. Usually, this is sparked by some new initiative to get competing vendors to live together peacefully so the marketplace can reap the benefits of streamlined systems communication and reduced opportunity for price gouging.

The target of my angst in this department is recurring: too many standards means no standard at all.

But my recent and ongoing delving into the world of risk technology for our sister publication – Risk-Technology.net – has unveiled a new (to me) twist to the debate, which may go some way toward explaining why establishing a credible data standard for any aspect of the financial trade lifecycle is akin to pulling teeth.

The reason is quite simple and it’s this: the client community doesn’t want standards.

Banks, brokers, asset managers and other consumers of market, reference and other financial information services, this argument goes, believe they are better qualified to scrub and normalise vendor-supplied data than the suppliers of that data are themselves. And with some justification. Vendors of most types of financial information are not financial markets practitioners, and as such don’t have the downstream processes that would give them the expertise needed to scrub unclean data properly to make it fit for purpose.

Assuming this is, indeed, the case, most practitioners would prefer to receive unclean data from their suppliers, since they understand the processes involved in making that data fit for purpose and – crucially – see those processes as a competitive advantage.

In other words, these practitioners reckon they can clean the third-party data they need better than anyone else, and would prefer to keep this perceived advantage in place.

Now, detractors will rapidly point out – so, don’t feel you need to – that given the huge amount of resources they ‘waste’ on scrubbing external data sources, market practitioners will welcome any standards initiative aimed at streamlining their processes. Clearly, that will be the case for many, if not most financial institutions.

But this data-normalisation-as-competitive-advantage argument has some credence. Increasingly, applications providers are including a kind of data normalisation toolkit as an adjunct to their core system, perhaps in recognition of the fact that client organisations want more control over how they normalise data before it is consumed by key applications.

Certainly, 'garbage in, garbage out' is a widely heard mantra in the risk community. With so many data standards initiatives ongoing right now, perhaps the question is not whether there are too many standards but whether they are wanted at all.
