The knowledge platform for the financial technology industry

A-Team Insight Blogs

Virginie’s Blog – Making Threats and Opportunities


This week’s FIMA conference in London saw industry practitioners airing their usual recurring bugbears – IP rights within the vendor community, a lack of budget for strategic data management projects (beyond those tied to compliance) and coping with the cost-cutting exercises sweeping the financial services community – but it also produced a number of interesting ideas about potential collaborative approaches in the data management space.

One such idea was raised during what I jokingly refer to as the annual vendor-bashing panel, where participants elaborate on their gripes about service levels and charging practices within the vendor community. IP rights and end user licences specifically have been a contentious issue for some time, and industry groups such as the Information Providers User Group (IPUG) have been particularly active in championing the views of the practitioner community. IPUG’s David Berry was on the FIMA panel discussing these very matters and noted an increasing trend within the user community for firms to arbitrage between data sources in order to push back on market data vendors and their sometimes aggressive approach to licensing.

Fellow panellist and UBS colleague Deborah McAdams indicated that the industry might be compelled to go further in future if vendors fail to heed these warnings. She said that a group of banks could band together to compete directly with vendors in providing data feeds. After all, banks are often sold back the very data they provided to the community in the first place, albeit in a cleansed, validated and packaged form. Those latter steps could therefore be tackled by a utility approach, with banks investing in new infrastructure to take the vendors out of the equation. Panellists did indicate, however, that such a move is currently viewed as a last resort, given the potential cost of setting up that infrastructure.

Of course, data utilities and their ilk continue to be a big discussion point within the industry at large, given the development of new trade repositories and the discussions about the Office of Financial Research (OFR) and legal entity identification. As in the last three years, the OFR and the legal entity identifier (LEI) therefore continued to dominate the panel sessions and the chatter in the exhibition halls. This year, however, many more people seemed optimistic about the developments.

The European Central Bank’s Francis Gross, a stalwart campaigner for a reference data utility, was therefore not alone in his positivity about the progress that could be achieved in the next year or so. His and others’ hopes currently reside with the US Commodity Futures Trading Commission’s (CFTC) efforts in the swaps space, as it is to this work that the initial LEI effort has been tied. Much was said about this initial phase, but rather less detail was offered on how it could be extended to the rest of the markets. Perhaps next year’s FIMA will see more elaboration on this topic?

In the meantime, no doubt, FIMA delegates and speakers alike will be intently watching the US regulatory arena over the coming months, ahead of the Q1 deadline for this work to kick off. If everything turns out as hoped, there will be threats and opportunities for vendors and practitioners alike to deal with.

