
Is the Industry Focusing on the Right Things?


This year’s FIMA demonstrated one clear thing: not everyone agrees on the right way to approach data standardisation across the market.

There was a great deal of debate about whether regulatory compulsion is a desirable way of getting market participants to adopt standards. The European Central Bank’s (ECB) reference data utility proposals, for example, will likely be driven by the regulatory agenda, and there was some talk of the European Systemic Risk Board using the data for systemic risk monitoring. Beyond the ECB, some speakers came down firmly on the side of regulatory compulsion, while others favoured a business case approach to standardisation.

Andre Kelekis, head of global market data for BNP Paribas, expounded the benefits of a regulatory-driven approach to the delegation, citing the lack of buy-in from senior management as a challenge that regulatory compulsion could solve. In his eyes, finding a sponsor for data management projects would be much easier if regulators mandated some level of data standardisation across the industry.

The EDM Council has also been active in trying to push data standardisation higher up the regulatory community’s agenda over the last 12 months. Mike Atkin, managing director of the data industry group, has been travelling across the globe to meet with senior regulatory bodies such as the US Securities and Exchange Commission (SEC) and the European Commission with this mission in mind. At FIMA, he propounded the benefits of a reference data utility and elaborated on the EDM Council’s specific standards work, such as the semantic repository.

This approach, however, was challenged during a panel debate on the last day of the conference by PJ Di Giammarino, CEO of industry think tank JWG-IT. Di Giammarino argued that this view was too narrow and didn’t take into account the business of banking. He suggested instead that the data community should “back up” and come up with a “proper business case” before progressing down the road to standardisation.

Di Giammarino cautioned that the industry should be wary of “building standards upon a Tower of Babel”, where business drivers are not at the foundation of these standards. He concluded that the issues should be discussed with “people outside of this room”, namely those at the top of financial institutions rather than data managers, in order to get buy-in to the data standardisation process. It would seem that some readers agree with Di Giammarino’s suggestions: one was even inspired to write in with recommendations on how the ECB could improve its utility proposals by building a better business case.

Rather than imposing “unnecessary” regulation, the reader suggested that the ECB should meet the real requirements of the market.

Meanwhile, as the standardisation debate rages on, the industry faces a serious staffing challenge. According to our latest reader poll, 40% of you have seen your data management team headcounts fall this year, while the remaining 60% have had to cope with the same number of staff in spite of the increase in volatility and its impact on data volumes. Recessionary pressures have taken their toll on headcounts across the industry, and data managers are being asked to do considerably more work with fewer resources at hand. As the complexity and volume of data increase, this problem could soon become insurmountable. Does this mean that firms will invest in new data management systems to cope, or will they wait until their legacy applications (and staff) fall over from the strain?

