The leading knowledge platform for the financial technology industry

A-Team Insight Blogs

UBS Using Front Office Technology to Deal With Reference Data Processing, Says Berry

In a move that confirms that the worlds of the front office and the back office are moving ever closer, UBS’ executive in charge of market data sourcing and strategy, David Berry, has indicated that the bank is using Celoxica’s hardware appliance-based data feed handlers to process its market and reference data. The rationale behind the decision was to speed up the processing of ever-increasing volumes of reference data, including corporate actions, and to increase efficiency, says Berry.

The increase in volatility in the market has meant that the volumes of data that must be processed have increased significantly, especially in the areas of corporate actions and pricing data, elaborates Berry, who is also a key member of the Information Providers User Group (IPUG). This, in turn, has meant that data management teams are struggling under the burden of dealing with more frequent and more granular data, hence the decision to use a technology capable of processing high volumes with the least latency.

“By using technology that is traditionally used to power algorithmic trading, we were able to reduce the number of PCs on which the reference data is processed from 50 down to one,” he explains. “That means a significant cost saving in terms of technology, staffing and even electricity costs.”

This technology is able to aggregate the summary data of the legal contracts underlying corporate actions more quickly, for example, and this gives UBS some degree of competitive advantage over its peers. Changes to corporate actions data are more frequent and more complex than ever before, says Berry, making the technological capability to deal with that data ever more important.

The pressures of more frequent risk assessments have exacerbated this challenge, as firms must now stress their models, especially around pricing, to assess P&L impacts. This also means firms can’t just rely blindly on vendors to automatically provide golden copy; firms need to set up more controls around data quality. “As the regulators push for more reporting, data becomes more of an intraday challenge and there are more parameters to take into account, including time limited parameters,” he explains.
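To illustrate the kind of pricing stress test Berry alludes to, the sketch below applies hypothetical price shocks to a small book of positions and reports the P&L impact of each scenario. This is a minimal illustration, not UBS’ methodology: the position names, prices and shock sizes are invented for the example.

```python
# Hypothetical illustration of a simple pricing stress test: shock every
# position's price by a fixed percentage and measure the P&L impact.
# All instruments, prices and shocks below are invented for the example.

positions = {          # instrument -> (quantity, current price)
    "BOND_A": (1_000, 98.50),
    "EQ_B":   (5_000, 42.10),
}

scenarios = {          # scenario name -> fractional price shock
    "mild":   -0.02,   # all prices fall 2%
    "severe": -0.10,   # all prices fall 10%
}

def pnl_impact(positions, shock):
    """P&L change if every price moves by `shock` (e.g. -0.02 = -2%)."""
    return sum(qty * price * shock for qty, price in positions.values())

for name, shock in scenarios.items():
    print(f"{name}: {pnl_impact(positions, shock):,.2f}")
```

A real implementation would of course shock each instrument with its own scenario-specific move and revalue via proper pricing models rather than a flat percentage, but the control logic — run each scenario, compare the resulting P&L against the base case — follows this shape.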

Berry indicates that the cost pressures of the current market have meant that data managers are being challenged to find cost savings and efficiency gains where they can. The innovative use of technology is therefore just one part of a wider rationalisation programme going on at UBS in terms of evaluating currently implemented vendor solutions and determining whether they are fit for purpose. The bank has an inventory management system that allows the data team to see clearly what invoices it is receiving and how they are being allocated, and therefore to better judge whether they represent value for money.

“We do not always go for the big names on the supplier side, as some vendors have specialist knowledge of particular areas of the market,” says Berry. “However, some also have monopolies such as Markit with its Red codes and this is why the European Commission is right to be investigating this situation. The cost of creating valuations for products using Red identifiers needs to be swallowed by someone and it is often the bank. We need to make sure the end client is aware of the charging practices for this data.”

This is a similar issue to the negotiations currently going on between industry lobbyists and Bloomberg over service provider agreement (SPA) contracts. The data community seems more willing than ever to push its vendors to provide more for their money, and vendors are now more aware of how firms scrutinise the structure of these agreements, thus creating a natural tension.

“Over the last two years, there is more awareness that data providers are changing their prices over the lifetime of a product and therefore moving the goalposts once they have sold the data to a firm. This change has not then been embedded in the structure of the product and this can mean a really tight squeeze for the profit margins of some products. This forces firms to seek out extra basis points to cover their costs and often it ends up that the COO wears these costs,” he elaborates.
