
A-Team Insight Blogs

Andrew’s Blog – The Problem with Standards?


Every few years I feel the compulsion to write a rant – known these days as a blog – about the problem with standards, and in particular standards relating to financial information. Usually, this is sparked by some new initiative to get competing vendors to live together peacefully so the marketplace can reap the benefits of streamlined systems communication and reduced opportunity for price gouging.

The target of my angst in this department is recurring: too many standards means no standard at all.

But my recent and ongoing delvings into the world of risk technology for our sister publication – Risk-Technology.net – have unveiled a new (to me) twist to the debate, which may go some way toward explaining why establishing a credible data standard for any aspect of the financial trade lifecycle is akin to pulling teeth.

The reason is quite simple and it’s this: the client community doesn’t want standards.

Banks, brokers, asset managers and other consumers of market, reference and other financial information services, this argument goes, believe they are better qualified to scrub and normalise vendor-supplied data than the suppliers of that data are themselves. And with some justification. Vendors of most types of financial information are not financial markets practitioners, and as such don’t have the downstream processes that would give them the expertise needed to scrub unclean data properly to make it fit for purpose.

Assuming this is, indeed, the case, most practitioners would prefer to receive unclean data from their suppliers, since they understand the processes involved in making that data fit for purpose and – crucially – see those processes as a competitive advantage.

In other words, these practitioners reckon they can clean the third-party data they need better than anyone else, and would prefer to keep this perceived advantage in place.

Now, detractors will rapidly point out – so, don't feel you need to – that given the huge amount of resources market practitioners 'waste' on scrubbing external data sources, they will welcome any standards initiative aimed at streamlining their processes. Clearly, that will be the case for many, if not most, financial institutions.

But this 'data normalisation as competitive advantage' argument has some credence. Increasingly, applications providers are including a kind of data normalisation toolkit as an adjunct to their core systems, perhaps in recognition of the fact that client organisations want more control over the way they normalise data before it is consumed by key applications.
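For illustration only, here is a minimal sketch of the sort of scrubbing step such a client-side toolkit might perform on vendor-supplied reference data. The field names, currency alias table and de-duplication policy are my own assumptions, not any vendor's actual schema or logic.

```python
# A hypothetical scrubbing pass over vendor-supplied instrument records:
# trim and upper-case identifiers, map non-standard currency aliases,
# drop records with no price, and de-duplicate by identifier.
from dataclasses import dataclass


@dataclass
class InstrumentRecord:
    identifier: str        # e.g. an ISIN as delivered by the vendor
    currency: str          # currency code, possibly in an inconsistent form
    price: float | None    # last price, possibly missing


# Assumed alias table -- real mappings would come from the firm's own reference data.
CURRENCY_ALIASES = {"STG": "GBP", "RMB": "CNY"}


def scrub(records: list[InstrumentRecord]) -> list[InstrumentRecord]:
    """Normalise identifiers and currencies, discard unusable records, de-duplicate."""
    cleaned: dict[str, InstrumentRecord] = {}
    for rec in records:
        ident = rec.identifier.strip().upper()
        ccy = rec.currency.strip().upper()
        ccy = CURRENCY_ALIASES.get(ccy, ccy)
        if rec.price is None:
            continue  # unusable without a price; a real process might enrich instead
        # Last record wins on duplicate identifiers -- one of many possible policies.
        cleaned[ident] = InstrumentRecord(ident, ccy, rec.price)
    return list(cleaned.values())


if __name__ == "__main__":
    raw = [
        InstrumentRecord(" gb0002634946 ", "stg", 612.4),
        InstrumentRecord("GB0002634946", "GBP", 613.0),  # duplicate, later price
        InstrumentRecord("US0378331005", "USD", None),   # missing price
    ]
    for rec in scrub(raw):
        print(rec)
```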

Certainly, 'garbage in, garbage out' is a widely heard mantra in the risk community. With so many data standards initiatives ongoing right now, perhaps the question is not whether there are too many standards, but whether they are wanted at all.
