
A-Team Insight Blogs

Andrew’s Blog – The Problem with Standards?


Every few years I feel the compulsion to write a rant – known these days as a blog – about the problem with standards, and in particular standards relating to financial information. Usually, this is sparked by some new initiative to get competing vendors to live together peacefully so the marketplace can reap the benefits of streamlined systems communication and reduced opportunity for price gouging.

The target of my angst in this department is recurring: too many standards means no standard at all.

But my recent and ongoing delvings into the world of risk technology for our sister publication – Risk-Technology.net – have unveiled a new (to me) twist to the debate, which may go some way toward explaining why establishing a credible data standard for any aspect of the financial trade lifecycle is akin to pulling teeth.

The reason is quite simple and it’s this: the client community doesn’t want standards.

Banks, brokers, asset managers and other consumers of market, reference and other financial information services, this argument goes, believe they are better qualified to scrub and normalise vendor-supplied data than the suppliers of that data are themselves. And with some justification. Vendors of most types of financial information are not financial markets practitioners, and as such don’t have the downstream processes that would give them the expertise needed to scrub unclean data properly to make it fit for purpose.

Assuming this is, indeed, the case, most practitioners would prefer to receive unclean data from their suppliers, since they understand the processes involved in making that data fit for purpose and – crucially – see those processes as a competitive advantage.

In other words, these practitioners reckon they can clean the third-party data they need better than anyone else, and would prefer to keep this perceived advantage in place.

Now, detractors will rapidly point out – so, don’t feel you need to – that given the huge amount of resources they ‘waste’ on scrubbing external data sources, market practitioners will welcome any standards initiative aimed at streamlining their processes. Clearly, that will be the case for many, if not most, financial institutions.

But this data-normalisation-as-competitive-advantage argument has some credence. Increasingly, applications providers are including a kind of data normalisation toolkit as an adjunct to their core system, perhaps in recognition of the fact that client organisations want more control over the way they normalise data before it is consumed by key applications. A rough sketch of what such a step looks like follows below.
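
To make the idea concrete, here is a minimal, purely illustrative sketch of the kind of scrubbing step such a toolkit might perform before data reaches a risk or trading application. The field names, vendor quirks and sentinel values are assumptions for the sake of the example, not any particular vendor’s format.

```python
# Hypothetical normalisation of one raw vendor record into an in-house
# canonical form. Field names and conventions are illustrative only.
from dataclasses import dataclass
from typing import Optional


@dataclass
class InstrumentRecord:
    identifier: str          # e.g. an ISIN, once normalised
    currency: str            # ISO 4217 code, upper case
    price: Optional[float]   # None when the vendor supplied no usable value


def normalise(raw: dict) -> InstrumentRecord:
    """Scrub a single raw vendor record into the canonical in-house shape."""
    # Different vendors label the same field differently.
    identifier = (raw.get("isin") or raw.get("ISIN") or raw.get("id") or "").strip()

    # Currency codes arrive in mixed case and with stray whitespace.
    currency = str(raw.get("ccy") or raw.get("currency") or "").strip().upper()

    # Prices may be strings, blanks or sentinel values such as -999.
    try:
        price = float(raw["price"])
        if price <= -999:
            price = None
    except (KeyError, TypeError, ValueError):
        price = None

    return InstrumentRecord(identifier=identifier, currency=currency, price=price)


print(normalise({"ISIN": " GB0002634946 ", "ccy": "gbp", "price": "254.7"}))
```

The point of the argument, of course, is that every firm has its own version of this logic, tuned to its own downstream processes – which is exactly why some are reluctant to hand it back to the vendor.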

Certainly, garbage in, garbage out is a widely heard mantra in the risk community. With so many data standards initiatives ongoing right now, perhaps the question is not whether there are too many standards but whether they are wanted at all.
