About a-team Marketing Services
The knowledge platform for the financial technology industry

A-Team Insight Blogs

Andrew’s Blog – The Problem with Standards?


Every few years I feel the compulsion to write a rant – known these days as a blog – about the problem with standards, and in particular standards relating to financial information. Usually, this is sparked by some new initiative to get competing vendors to live together peacefully so the marketplace can reap the benefits of streamlined systems communication and reduced opportunity for price gouging.

The target of my angst in this department is recurring: too many standards mean no standard at all.

But my recent and ongoing delving into the world of risk technology for our sister publication – Risk-Technology.net – has unveiled a new (to me) twist to the debate, which may go some way toward explaining why establishing a credible data standard for any aspect of the financial trade lifecycle is akin to pulling teeth.

The reason is quite simple and it’s this: the client community doesn’t want standards.

Banks, brokers, asset managers and other consumers of market, reference and other financial information services, this argument goes, believe they are better qualified to scrub and normalise vendor-supplied data than the suppliers of that data are themselves. And with some justification. Vendors of most types of financial information are not financial markets practitioners, and as such don’t have the downstream processes that would give them the expertise needed to scrub unclean data properly to make it fit for purpose.

Assuming this is, indeed, the case, most practitioners would prefer to receive unclean data from their suppliers, since they understand the processes involved in making that data fit for purpose and – crucially – see those processes as a competitive advantage.

In other words, these practitioners reckon they can clean the third-party data they need better than anyone else, and would prefer to keep this perceived advantage in place.

Now, detractors will rapidly point out – so, don't feel you need to – that given the huge amount of resources they 'waste' on scrubbing external data sources, market practitioners will welcome any standards initiative aimed at streamlining their processes. Clearly, that will be the case for many, if not most, financial institutions.

But this 'data normalisation as competitive advantage' argument has some credence. Increasingly, applications providers are including a kind of data normalisation toolkit as an adjunct to their core system, perhaps in recognition of the fact that client organisations want more control over the way they normalise data before it is consumed by key applications.

Certainly, garbage in, garbage out is a widely heard mantra in the risk community. With so many data standards initiatives ongoing right now, perhaps the question is not whether there are too many standards, but whether they are wanted at all.

