The knowledge platform for the financial technology industry

A-Team Insight Blogs

OTC Derivatives Shake Up Data Management; Data Consistency Key

The growing use of OTC derivatives is shaking up data management strategies across the industry. Three-quarters of firms have re-evaluated or are planning to re-evaluate their data management processes as a direct result of the expansion in use of these instruments (see chart, below), according to a survey conducted by A-Team Group (publisher of Reference Data Review).

This is one in a series of findings in the first of three focused surveys of senior reference data managers, commissioned by enterprise data management specialist GoldenSource. The first survey explores the approach to and challenges of managing instrument reference data across the enterprise.

As well as OTC derivatives, structured products and fixed income data in general featured high on the list of instruments causing data managers problems. As one data manager from a large US broker/dealer said: “We have quality data, but it is the last one to two per cent that drains our resources. Of that small percentage, new OTC instruments need a lot of the focus.”

Another data manager suggested that 50 per cent of his firm’s manual data effort goes into sourcing and resolving issues in OTC products.

More generally, while the move towards centralised data management is well under way, firms still face significant challenges in ensuring the consistency and quality of instrument data consumed across the enterprise. “Applying a standardised data policy across the enterprise” scored highest as a problem area in maintaining reference data, with 85.7 per cent of respondents rating it an average or greater concern, and 48 per cent ranking it as the number one problem (see chart below). Harmonising the data model also scored highly: 73 per cent rated it an average or greater concern, and 50 per cent ranked it number one.

The need for ensuring consistency of data is being driven by the overriding requirements of risk management and regulatory compliance, and is seen as necessary in order for a firm to get a true picture of where it stands. As a CIO at a major European bank put it: “We need to homogenise our reference data across operations and put standards for consistency in place. This helps manage risk and comply with regulation.” As another data manager said: “It’s a huge problem in getting consistency and common practices, but it’s down to business process, not technology.”

The quest for data consistency has led 77 per cent of respondents to put processes in place to ensure consistency across multiple security masters (see chart below). The majority of respondents maintain between four and six security master files, while a third maintain up to three, and nine per cent maintain a challenging 10 or more.

Most respondents concur with the view that multiple master files will always exist. The difference now is that there is a real move away from the master files being maintained in a vacuum, leading to duplication of effort across the organisation on the same data, and towards central governance and standards for consistency. Several noted that the concept of distributed master files with central governance could redefine the notion of a “golden copy”.

To get your complimentary copy of the full results of this survey, visit www.a-teamgroup.com/research. The next survey in the series will focus on counterparty data, while the third and final survey will focus on positions data. To take part and make your views heard, email surveys@a-teamgroup.com.
