

UBS, HSBC, BNY Mellon and RBS All Engaged in a Degree of Labour and Technology Arbitrage

The upshot of many of the discussions at last week’s reference data conference in London, organised by Marcus Evans, was that many firms are currently engaged in a degree of labour and technology arbitrage. Not only are banks negotiating with their vendors over data feed contracts and price points for particular solutions, they are also using new technology to gain efficiency and to free staff for redeployment in key areas, according to speakers from UBS, HSBC, BNY Mellon and RBS.

UBS, for example, is currently conducting a data vendor strategy review and is using Celoxica’s technology, traditionally deployed in the front office, to process reference data in order to reduce its overall data expenditure and increase efficiency, noted David Berry, the UBS executive in charge of market data sourcing and strategy, last week. “If you have been forced to react to increased complexity in one area, then you need to find a way to bring down costs in other areas,” explained Berry.

There is therefore much potential for arbitrage in the data space by approaching staffing, data provision and technology more creatively. “The focus for saving costs should be on areas of highest impact, such as big ticket wins from automating US fixed income data processing, for example,” he said.

Berry is a proponent of using “smart technology” to rationalise costs, but only after a proper evaluation of data processes has been conducted. Before the bank moved its corporate actions data processing onto the Celoxica solution, it set out a clear, staged migration process and defined key deliverables.

Vendor rationalisation is also a key area of focus: UBS uses a large number of vendors for its data provision, according to Berry, and spends around 80% of its data budget with the big two, Thomson Reuters and Bloomberg, and 20% on other data feeds. The bank is keen to rationalise these costs and has become more aware of these vendors’ changing charging practices, under which banks are charged for end usage of the data rather than for single licences. “These charging practices are becoming more complex and vendors are charging for usage of the data, which has meant we are paying 12 times for the same data across business lines,” said Berry.
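The arithmetic behind that complaint is simple to sketch. In the Python snippet below the fee levels are invented for illustration; only the 12 business lines figure comes from Berry’s comment:

```python
# Illustrative comparison of single-licence vs usage-based vendor charging.
# Fee levels are hypothetical; only the 12 business lines figure is from the article.

annual_licence_fee = 100_000   # hypothetical enterprise licence for one feed
per_line_usage_fee = 100_000   # hypothetical fee per consuming business line
business_lines = 12            # equities, fixed income, FX, risk, and so on

single_licence_cost = annual_licence_fee
usage_based_cost = per_line_usage_fee * business_lines

print(f"Single licence: {single_licence_cost:,}")  # 100,000
print(f"Usage-based:    {usage_based_cost:,}")     # 1,200,000 -- 12x for the same data
```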

Matthew Cox, head of securities data management for EMEA at BNY Mellon Asset Servicing, added that these costs can be significant in the pricing space because of the need for duplicative data: “We have to use multiple vendors because there is no single vendor that can cover all instrument types and this can increase costs significantly. In the pricing space we have also opted to use a primary and secondary vendor source in order to compare and validate that data. The dual vendor model is fairly typical for daily pricing requirements.”
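A minimal sketch of what that dual-vendor validation can look like in practice is given below; the tolerance, status labels and function name are assumptions for illustration, not BNY Mellon’s actual process:

```python
from typing import Optional

def validate_price(primary: float,
                   secondary: Optional[float],
                   tolerance: float = 0.005) -> str:
    """Compare prices from a primary and a secondary vendor.

    Returns "VALIDATED" when the sources agree within `tolerance`
    (relative difference, 0.5% here), "EXCEPTION" when they diverge,
    and "UNCORROBORATED" when only the primary priced the instrument.
    """
    if secondary is None:
        return "UNCORROBORATED"
    if abs(primary - secondary) / abs(secondary) <= tolerance:
        return "VALIDATED"
    return "EXCEPTION"  # route to an analyst's exceptions queue

print(validate_price(101.25, 101.20))  # VALIDATED
print(validate_price(101.25, 99.00))   # EXCEPTION
```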

HSBC is also reviewing its use of vendor-provided data feeds and solutions across its various business lines to the same end. According to Dominique Pujol, senior project manager for EDM at the bank, HSBC applies triple verification to some pricing feeds, which are also subject to different source hierarchies, complicating the rationalisation process.
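One plausible reading of those source hierarchies is a per-asset-class “pricing waterfall”, sketched below with invented vendor names and ordering:

```python
# Hypothetical per-asset-class vendor hierarchy (pricing waterfall).
# Vendor names and their ordering are illustrative assumptions.

HIERARCHY = {
    "government_bond": ["VendorA", "VendorB", "VendorC"],
    "equity":          ["VendorB", "VendorA"],
}

def pick_price(asset_class: str, quotes: dict[str, float]) -> float:
    """Walk the waterfall for the asset class and return the quote
    from the highest-ranked vendor that actually priced the instrument."""
    for vendor in HIERARCHY.get(asset_class, []):
        if vendor in quotes:
            return quotes[vendor]
    raise LookupError(f"no price available for {asset_class}")

# VendorB did not quote, so the waterfall falls through to VendorA.
print(pick_price("equity", {"VendorA": 55.10, "VendorC": 55.30}))  # 55.1
```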

Bob Bishop, head of client data management at RBS, on the other hand, indicated that the client data function rarely needs to buy in more than one data feed, but stressed that benchmarking vendor data is important.

BNY Mellon and HSBC are also bringing their staffing costs down via strategic offshoring of various data roles. “The decision to move from 30 people in London to 25 in India in data processing roles means that the cost per transaction falls through the floor,” said Cox.
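The cost-per-transaction arithmetic is easy to reproduce. The headcounts below are Cox’s; the fully loaded staff costs and transaction volume are invented for illustration:

```python
# Back-of-envelope cost per transaction before and after offshoring.
# Headcounts are from the article; salaries and volume are assumptions.

transactions_per_year = 1_000_000

london_staff, london_cost_per_head = 30, 80_000  # assumed fully loaded GBP
india_staff,  india_cost_per_head  = 25, 20_000  # assumed fully loaded GBP

before = london_staff * london_cost_per_head / transactions_per_year
after  = india_staff * india_cost_per_head / transactions_per_year

print(f"London: £{before:.2f} per transaction")  # £2.40
print(f"India:  £{after:.2f} per transaction")   # £0.50
```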

So, strategic choices such as vendor reviews, offshoring, outsourcing and new technology investments are all on the cards for 2010 in the banking community in one form or another. A careful balance will need to be struck between investments made and rationalisations conducted if banks hope to lower the total cost of ownership of their data management systems overall.
