The leading knowledge platform for the financial technology industry
A-Team Insight Blogs

UBS Using Front Office Technology to Deal With Reference Data Processing, Says Berry

In a move that confirms the worlds of the front office and the back office are moving ever closer, UBS’ executive in charge of market data sourcing and strategy, David Berry, has indicated that the bank is using Celoxica’s hardware appliance-based data feed handlers to process its market and reference data. The rationale behind the decision was to speed up the processing of ever-increasing volumes of reference data, including corporate actions, and to increase efficiency, says Berry.

The increase in market volatility has meant that the volumes of data that must be processed have increased significantly, especially in the areas of corporate actions and pricing data, elaborates Berry, who is also a key member of the Information Providers User Group (IPUG). This, in turn, has meant that data management teams are struggling under the burden of dealing with more frequent and more granular data, hence the decision to use a technology capable of processing high volumes with the lowest latency.

“By using technology that is traditionally used to power algorithmic trading, we were able to reduce the number of PCs on which the reference data is processed from 50 down to one,” he explains. “That means a significant cost saving in terms of technology, staffing and even electricity costs.”

This technology can more quickly aggregate the summary data of the legal contracts underlying corporate actions, for example, and this gives UBS some degree of competitive advantage over its peers. Changes to corporate actions data are more frequent and more complex than ever before, says Berry, making the technological capability to deal with that data ever more important.

The pressure of more frequent risk assessments has exacerbated this challenge, as firms must now stress their models, especially around pricing, to assess P&L impacts. This also means firms cannot rely blindly on vendors to automatically provide golden copy; they need to set up more controls around data quality. “As the regulators push for more reporting, data becomes more of an intraday challenge and there are more parameters to take into account, including time limited parameters,” he explains.

Berry indicates that the cost pressures of the current market have meant that data managers are being challenged to find cost savings and efficiency gains where they can. The innovative use of technology is therefore just one part of a wider rationalisation programme under way at UBS, which involves evaluating currently implemented vendor solutions and determining whether they are fit for purpose. The bank has an inventory management system that allows the data team to see clearly what invoices it is receiving and how they are being allocated, and therefore to judge better whether they represent value for money.

“We do not always go for the big names on the supplier side, as some vendors have specialist knowledge of particular areas of the market,” says Berry. “However, some also have monopolies such as Markit with its Red codes and this is why the European Commission is right to be investigating this situation. The cost of creating valuations for products using Red identifiers needs to be swallowed by someone and it is often the bank. We need to make sure the end client is aware of the charging practices for this data.”

This is a similar issue to the current negotiations between industry lobbyists and Bloomberg over service provider agreement (SPA) contracts. The data community seems more willing than ever to push its vendors to provide more for the money, and vendors are now more aware of the structure of these agreements, creating a natural tension.

“Over the last two years, there is more awareness that data providers are changing their prices over the lifetime of a product and therefore moving the goalposts once they have sold the data to a firm. This change has not then been embedded in the structure of the product and this can mean a really tight squeeze for the profit margins of some products. This forces firms to seek out extra basis points to cover their costs and often it ends up that the COO wears these costs,” he elaborates.
