

OTC Derivatives Shake Up Data Management; Data Consistency Key


The growing use of OTC derivatives is shaking up data management strategies across the industry. Three-quarters of firms have re-evaluated or are planning to re-evaluate their data management processes as a direct result of the expansion in use of these instruments (see chart, below), according to a survey conducted by A-Team Group (publisher of Reference Data Review).

[Chart: proportion of firms re-evaluating data management processes as a result of growing OTC derivatives use]
This is one in a series of findings in the first of three focused surveys of senior reference data managers, commissioned by enterprise data management specialist GoldenSource. The first survey explores the approach to and challenges of managing instrument reference data across the enterprise.

As well as OTC derivatives, structured products and fixed income data in general featured high on the list of instruments causing data managers problems. As one data manager from a large US broker/dealer said: “We have quality data, but it is the last one to two per cent that drains our resources. Of that small percentage, new OTC instruments need a lot of the focus.”

Another data manager suggested that 50 per cent of his firm’s manual data effort goes into sourcing and resolving issues in OTC products.

More generally, while the move towards centralised data management is well under way, firms still face significant challenges in ensuring the consistency and quality of the instrument data consumed across the enterprise. “Applying a standardised data policy across the enterprise” scored highest as a problem area in maintaining reference data, with 85.7 per cent of respondents ranking it as an average or greater concern (48 per cent ranked it as the number one problem) (see chart below). Harmonising the data model also scored highly, with 73 per cent ranking it average or above (50 per cent ranking it number one).

The need to ensure data consistency is driven by the overriding requirements of risk management and regulatory compliance, and is seen as essential if a firm is to get a true picture of where it stands. As a CIO at a major European bank put it: “We need to homogenise our reference data across operations and put standards for consistency in place. This helps manage risk and comply with regulation.” As another data manager said: “It’s a huge problem in getting consistency and common practices, but it’s down to business process, not technology.”

The quest for data consistency has already led 77 per cent of respondents to put processes in place to ensure consistency across multiple security masters (see chart below). The majority of respondents maintain between four and six security master files, while a third maintain up to three and nine per cent maintain a challenging 10 or more.

Most respondents concur with the view that multiple master files will always exist. The difference now is that there is a real move away from the master files being maintained in a vacuum, leading to duplication of effort across the organisation on the same data, and towards central governance and standards for consistency. Several noted that the concept of distributed master files with central governance could redefine the notion of a “golden copy”.
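To make the idea of central governance across distributed master files more concrete, the sketch below shows one way a consistency check across multiple security masters might look. It is a minimal illustration only: the master names, field names and records are hypothetical, and real reconciliation processes involve far richer matching and exception-handling rules.

```python
from collections import defaultdict

# Hypothetical extracts from three security master files, keyed by ISIN.
# Master names, field names and values are invented for illustration.
masters = {
    "equities_master": {
        "US0378331005": {"issuer": "Apple Inc", "currency": "USD", "asset_class": "Equity"},
    },
    "derivatives_master": {
        "US0378331005": {"issuer": "Apple Inc.", "currency": "USD", "asset_class": "Equity"},
    },
    "risk_master": {
        "US0378331005": {"issuer": "Apple Inc", "currency": "USD", "asset_class": "Common Stock"},
    },
}

def find_inconsistencies(masters):
    """Return {isin: {field: {master: value}}} for fields whose values disagree across masters."""
    # Collect every value recorded for each (ISIN, field) pair.
    seen = defaultdict(lambda: defaultdict(dict))
    for master_name, records in masters.items():
        for isin, fields in records.items():
            for field, value in fields.items():
                seen[isin][field][master_name] = value

    # Keep only the fields where the masters do not agree.
    conflicts = {}
    for isin, fields in seen.items():
        disagreements = {f: sources for f, sources in fields.items()
                         if len(set(sources.values())) > 1}
        if disagreements:
            conflicts[isin] = disagreements
    return conflicts

if __name__ == "__main__":
    for isin, fields in find_inconsistencies(masters).items():
        for field, sources in fields.items():
            print(f"{isin}: '{field}' differs across masters: {sources}")
```

In practice, exceptions flagged in this way would feed a central governance workflow rather than a console, with agreed standards determining which source, if any, is treated as the golden copy for each attribute.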

To get your complimentary copy of the full results of this survey visit www.a-teamgroup.com/research. The next survey in the series will focus on counterparty data, while the third and final survey will cover positions data. To take part and make your views heard, email surveys@a-teamgroup.com.
