Andrew’s Blog – The Problem with Standards?

Every few years I feel the compulsion to write a rant – known these days as a blog – about the problem with standards, and in particular standards relating to financial information. Usually, this is sparked by some new initiative to get competing vendors to live together peacefully so the marketplace can reap the benefits of streamlined systems communication and reduced opportunity for price gouging.

The target of my angst in this department is a recurring one: too many standards means no standard at all.

But my recent and ongoing delving into the world of risk technology for our sister publication – Risk-Technology.net – has unveiled a new (to me) twist in the debate, which may go some way toward explaining why establishing a credible data standard for any aspect of the financial trade lifecycle is akin to pulling teeth.

The reason is quite simple, and it's this: the client community doesn't want standards.

Banks, brokers, asset managers and other consumers of market, reference and other financial information services, this argument goes, believe they are better qualified to scrub and normalise vendor-supplied data than the suppliers of that data are themselves. And with some justification. Vendors of most types of financial information are not financial markets practitioners, and as such don’t have the downstream processes that would give them the expertise needed to scrub unclean data properly to make it fit for purpose.

Assuming this is, indeed, the case, most practitioners would prefer to receive unclean data from their suppliers, since they understand the processes involved in making that data fit for purpose and – crucially – see those processes as a competitive advantage.

In other words, these practitioners reckon they can clean the third-party data they need better than anyone else, and would prefer to keep this perceived advantage in place.

Now, detractors will rapidly point out – so don't feel you need to – that given the huge amount of resources they ‘waste’ on scrubbing external data sources, market practitioners will welcome any standards initiative aimed at streamlining their processes. Clearly, that will be the case for many, if not most, financial institutions.

But this data-normalisation-as-competitive-advantage argument has some credibility. Increasingly, applications providers are including a kind of data normalisation toolkit as an adjunct to their core systems, perhaps in recognition of the fact that client organisations want more control over the way they normalise data before it is consumed by key applications.

Certainly, ‘garbage in, garbage out’ is a widely heard mantra in the risk community. With so many data standards initiatives ongoing right now, perhaps the question is not whether there are too many standards but whether any of them are wanted at all.
