About A-Team Marketing Services
The knowledge platform for the financial technology industry

A-Team Insight Blogs

OTC Derivatives Shake Up Data Management; Data Consistency Key

The growing use of OTC derivatives is shaking up data management strategies across the industry. Three-quarters of firms have re-evaluated or are planning to re-evaluate their data management processes as a direct result of the expansion in use of these instruments (see chart, below), according to a survey conducted by A-Team Group (publisher of Reference Data Review).

[Chart: proportion of firms re-evaluating their data management processes as a result of OTC derivatives use]
This is one in a series of findings in the first of three focused surveys of senior reference data managers, commissioned by enterprise data management specialist GoldenSource. The first survey explores the approach to and challenges of managing instrument reference data across the enterprise.

As well as OTC derivatives, structured products and fixed income data in general featured high on the list of instruments causing data managers problems. As one data manager from a large US broker/dealer said: “We have quality data, but it is the last one to two per cent that drains our resources. Of that small percentage, new OTC instruments need a lot of the focus.”

Another data manager suggested that 50 per cent of his firm’s manual data effort goes into sourcing and resolving issues in OTC products.

More generally, while the move towards centralised data management is well under way, firms still face significant challenges in ensuring the consistency and quality of instrument data consumed across the enterprise. “Applying a standardised data policy across the enterprise” scored highest as a problem area in maintaining reference data, with 85.7 per cent of respondents rating it as an average or above-average concern (48 per cent ranked it as the number one problem) (see chart below). Harmonising the data model also scored highly, with 73 per cent rating it average or above (50 per cent ranked it number one).

The need to ensure consistency of data is being driven by the overriding requirements of risk management and regulatory compliance, and is seen as necessary for a firm to get a true picture of where it stands. As a CIO at a major European bank put it: “We need to homogenise our reference data across operations and put standards for consistency in place. This helps manage risk and comply with regulation.” As another data manager said: “It’s a huge problem in getting consistency and common practices, but it’s down to business process, not technology.”

The quest for data consistency has led 77 per cent of respondents to put processes in place to ensure consistency across multiple security masters (see chart below). The majority of respondents maintain between four and six security master files, while a third maintain up to three and nine per cent maintain a challenging 10 or more.

Most respondents concur with the view that multiple master files will always exist. The difference now is that there is a real move away from the master files being maintained in a vacuum, leading to duplication of effort across the organisation on the same data, and towards central governance and standards for consistency. Several noted that the concept of distributed master files with central governance could redefine the notion of a “golden copy”.

To get your complimentary copy of the full results of this survey, visit www.a-teamgroup.com/research. The next survey in the series will focus on counterparty data, while the third and final will focus on positions data. To take part and make your views heard, email surveys@a-teamgroup.com.
