
A-Team Insight Blogs

Is the Industry Focusing on the Right Things?


This year’s FIMA demonstrated one clear thing: not everyone agrees on the right way to approach data standardisation across the market.

There was a great deal of debate about whether regulatory compulsion, that is, forcing market participants to adopt standards, is desirable for the market. The European Central Bank’s (ECB) reference data utility proposals, for example, will likely be driven by the regulatory agenda, and there was some talk of the European Systemic Risk Board using the data for systemic risk monitoring. Beyond the ECB, some speakers came down firmly on the side of regulatory compulsion, while others, on the other side of the coin, favoured a business case approach to standardisation.

Andre Kelekis, head of global market data for BNP Paribas, expounded to the delegation the benefits of a regulatory-driven approach, citing the lack of buy-in from senior management as an example of a challenge that regulatory compulsion could solve. In his eyes, finding a sponsor for data management projects would be much easier if regulators mandated some level of data standardisation across the industry.

The EDM Council has also been active over the last 12 months in pushing data standardisation higher up the regulatory community’s agenda. Mike Atkin, managing director of the data industry group, has been travelling across the globe to meet senior regulatory bodies such as the US Securities and Exchange Commission (SEC) and the European Commission with this mission in mind. At FIMA, he propounded the benefits of a reference data utility and elaborated on the specific standards work the EDM Council has been engaged in, such as its semantic repository.

This approach, however, was challenged during a panel debate on the last day of the conference by PJ Di Giammarino, CEO of industry think tank JWG-IT. Di Giammarino argued that this view was too narrow and didn’t take into account the business of banking. He suggested instead that the data community should “back up” and come up with a “proper business case” before progressing down the road to standardisation.

Di Giammarino cautioned that the industry should be wary of “building standards upon a Tower of Babel”, where business drivers are not at the foundation of those standards. He concluded that the issues should be discussed with “people outside of this room”, namely those at the top of financial institutions rather than data managers, in order to get buy-in to the data standardisation process. It would seem that some readers agree with Di Giammarino’s suggestions: one was even inspired to write in with recommendations on how the ECB could improve its utility proposals by building a better business case.

Rather than imposing “unnecessary” regulation, the reader suggested, the ECB should meet the real requirements of the market.

Meanwhile, as the standardisation debate rages on, the industry is facing a serious staffing challenge. According to our latest reader poll, 40% of you have experienced a drop in your data management team headcount this year. A further 60% of you have had to cope with the same number of staff in spite of the increase in volatility and its impact on data volumes. Recessionary pressures have taken their toll on headcounts across the industry, and data managers are being asked to do considerably more work with fewer resources at hand. As the complexity and volume of data increase, this problem could soon become insurmountable. Does this mean that firms will invest in new data management systems to cope, or will they wait until their legacy applications (and staff) fall over from the strain?
