About a-team Marketing Services
The knowledge platform for the financial technology industry

A-Team Insight Blogs

Is the Industry Focusing on the Right Things?


This year’s FIMA demonstrated one clear thing: not everyone agrees on the right way to approach data standardisation across the market.

There was a great deal of debate about whether regulatory compulsion is desirable as a means of forcing market participants to adopt standards. The European Central Bank's (ECB) reference data utility proposals, for example, will likely be driven by the regulatory agenda, and there was some talk of the European Systemic Risk Board using the data for systemic risk monitoring. Beyond the ECB, some speakers came down firmly on the side of regulatory compulsion, while others favoured a business case approach to standardisation.

Andre Kelekis, head of global market data for BNP Paribas, expounded the benefits of a regulatory driven approach to delegates, citing the lack of buy-in from senior management as a challenge that regulatory compulsion could solve. In his eyes, finding a sponsor for data management projects would be much easier if regulators mandated some level of data standardisation across the industry.

The EDM Council has also been active over the last 12 months in pushing data standardisation up the regulatory community's agenda. Mike Atkin, managing director of the data industry group, has been travelling the globe to meet senior regulatory bodies such as the US Securities and Exchange Commission (SEC) and the European Commission with this mission in mind. At FIMA, he propounded the benefits of a reference data utility and elaborated on the EDM Council's specific standards work, such as its semantic repository.

This approach, however, was challenged during a panel debate on the last day of the conference by PJ Di Giammarino, CEO of industry think tank JWG-IT. Di Giammarino argued that this view was too narrow and didn’t take into account the business of banking. He suggested instead that the data community should “back up” and come up with a “proper business case” before progressing down the road to standardisation.

Di Giammarino cautioned that the industry should be wary of "building standards upon a Tower of Babel", where business drivers are not at the foundation of those standards. He concluded that the issues should be discussed with "people outside of this room", namely those at the top of financial institutions rather than data managers, in order to win buy-in to the data standardisation process. Some readers seem to agree with Di Giammarino's suggestions: one was even inspired to write in with recommendations on how the ECB could improve its utility proposals by building a better business case.

Rather than imposing "unnecessary" regulation, the reader suggested, the ECB should focus on meeting the real requirements of the market.

Meanwhile, as the standardisation debate rages on, the industry is facing a serious staffing challenge. According to our latest reader poll, 40% of you have seen your data management team headcounts fall this year, while the remaining 60% have had to cope with static staff numbers despite increased market volatility and its impact on data volumes. Recessionary pressures have taken their toll on headcounts across the industry, and data managers are being asked to do considerably more work with fewer resources at hand. As the complexity and volume of data increase, this problem could soon become insurmountable. Will firms invest in new data management systems to cope, or wait until their legacy applications (and staff) fall over from the strain?

