The knowledge platform for the financial technology industry

A-Team Insight Blogs

UBS, HSBC, BNY Mellon and RBS All Engaged in a Degree of Labour and Technology Arbitrage


The upshot of many of the discussions at last week’s reference data conference in London, organised by Marcus Evans, was that many firms are currently engaged in a degree of labour and technology arbitrage. Not only are banks negotiating with their vendors over contracts for data feeds and price points for particular solutions, they are also using new technology to gain efficiency and free up staff for redeployment in key areas, according to speakers from UBS, HSBC, BNY Mellon and RBS.

UBS, for example, is currently engaged in a data vendor strategy review and is using technology from Celoxica, traditionally deployed in the front office, to process reference data, in order to reduce its overall expenditure on data and increase efficiency, as noted last week by David Berry, the UBS executive in charge of market data sourcing and strategy. “If you have been forced to react to increased complexity in one area, then you need to find a way to bring down costs in other areas,” explained Berry.

There is therefore much potential for arbitrage in the data space by looking at staffing, data provision and technology in more creative ways. “The focus for saving costs should be in areas of highest impact such as big ticket wins by automating US fixed income data processing, for example,” he said.

Berry is a proponent of using “smart technology” in order to rationalise costs, but only after a proper evaluation of data processes has been conducted. Before the bank moved its corporate actions data processing onto the Celoxica solution, it set out a clear migration process in stages and defined key deliverables.

Vendor rationalisation is also a key area of focus, as UBS uses a large number of vendors for its data provision, according to Berry, and spends around 80% of its budget on data from the big two, namely Thomson Reuters and Bloomberg, and 20% on other data feeds. The bank is keen to rationalise these costs and has become more aware of these vendors’ changing charging practices, under which banks are billed for end usage of the data rather than for single licences. “These charging practices are becoming more complex and vendors are charging for usage of the data, which has meant we are paying 12 times for the same data across business lines,” said Berry.

Matthew Cox, head of securities data management for EMEA at BNY Mellon Asset Servicing, added that these costs can be significant in the pricing space because of the need for duplicative data: “We have to use multiple vendors because there is no single vendor that can cover all instrument types and this can increase costs significantly. In the pricing space we have also opted to use a primary and secondary vendor source in order to compare and validate that data. The dual vendor model is fairly typical for daily pricing requirements.”

HSBC is also reviewing its usage of vendor-provided data feeds and solutions across its various business lines to the same end. According to Dominique Pujol, senior project manager for EDM at the bank, HSBC carries out triple verification on some pricing feeds, which are also subject to different hierarchies, complicating the rationalisation process.

Bob Bishop, head of client data management for RBS, on the other hand, indicated that the client data space does not often need to buy in more than one data feed but stressed that the benchmarking of vendor data is important.

BNY Mellon and HSBC are also bringing their staffing costs down via strategic offshoring of various data roles. “The decision to move from 30 people in London to 25 in India in data processing roles means that the cost per transaction falls through the floor,” said Cox.
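Cox’s “falls through the floor” claim is simple unit-cost arithmetic: total staff cost divided by transaction volume. A minimal sketch of that calculation follows; the headcounts (30 London, 25 India) are from the article, while the fully loaded salary figures and transaction volume are purely illustrative assumptions, not figures reported by BNY Mellon.

```python
# Illustrative cost-per-transaction arithmetic for the offshoring move Cox
# describes (30 roles in London -> 25 in India). Salary and volume figures
# below are assumed for illustration only.

def cost_per_transaction(headcount, annual_cost_per_head, transactions):
    """Total annual staff cost divided by transactions processed."""
    return headcount * annual_cost_per_head / transactions

TRANSACTIONS = 1_000_000  # assumed annual processing volume, unchanged by the move

london = cost_per_transaction(30, 60_000, TRANSACTIONS)  # assumed £60k per head
india = cost_per_transaction(25, 15_000, TRANSACTIONS)   # assumed £15k per head

print(f"London: £{london:.3f} per transaction")   # £1.800
print(f"India:  £{india:.3f} per transaction")    # £0.375
print(f"Reduction: {1 - india / london:.0%}")     # 79%
```

Under these assumed inputs, the per-transaction cost falls roughly 79% even though headcount only drops by five, because the per-head cost differential dominates the calculation.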

So, strategic choices such as vendor reviews, offshoring, outsourcing and new technology investments are all on the cards for 2010 in the banking community in one form or another. A careful balance will need to be struck between investments made and rationalisations conducted if banks hope to lower the overall total cost of ownership of their data management systems.
