The growing use of OTC derivatives is shaking up data management strategies across the industry. Three-quarters of firms have re-evaluated or are planning to re-evaluate their data management processes as a direct result of the expansion in use of these instruments (see chart, below), according to a survey conducted by A-Team Group (publisher of Reference Data Review).
This is one in a series of findings in the first of three focused surveys of senior reference data managers, commissioned by enterprise data management specialist GoldenSource. The first survey explores the approach to and challenges of managing instrument reference data across the enterprise.
As well as OTC derivatives, structured products and fixed income data in general featured high on the list of instruments causing data managers problems. As one data manager from a large US broker/dealer said: “We have quality data, but it is the last one to two per cent that drains our resources. Of that small percentage, new OTC instruments need a lot of the focus.”
Another data manager suggested that 50 per cent of his firm’s manual data effort goes into sourcing and resolving issues in OTC products.
More generally, while the move towards centralised data management is well under way, firms still face significant challenges in ensuring the consistency and quality of instrument data consumed across the enterprise. "Applying a standardised data policy across the enterprise" scored highest as a problem area in maintaining reference data, with 85.7 per cent of respondents rating it a concern of average importance or above, and 48 per cent ranking it the number one problem (see chart below). Harmonising the data model also scored highly, with 73 per cent rating it average or above and 50 per cent ranking it number one.
The need to ensure data consistency is being driven by the overriding requirements of risk management and regulatory compliance, and is seen as necessary for a firm to get a true picture of where it stands. As a CIO at a major European bank put it: "We need to homogenise our reference data across operations and put standards for consistency in place. This helps manage risk and comply with regulation." As another data manager said: "It's a huge problem in getting consistency and common practices, but it's down to business process, not technology."
The quest for data consistency has led 77 per cent of respondents to put in place processes to ensure consistency across multiple security masters (see chart below). The majority of respondents maintain between four and six security master files, while a third maintain up to three and nine per cent maintain a challenging 10 or more.
Most respondents concur with the view that multiple master files will always exist. The difference now is that there is a real move away from the master files being maintained in a vacuum, leading to duplication of effort across the organisation on the same data, and towards central governance and standards for consistency. Several noted that the concept of distributed master files with central governance could redefine the notion of a “golden copy”.
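The idea of distributed master files under central governance can be sketched in code. The snippet below is purely illustrative, not drawn from the survey: it assumes hypothetical master-file names and instrument attributes, and expresses the central governance rule as a simple source-priority order used to merge per-source records into a single "golden copy".

```python
# Illustrative sketch: consolidating records for one instrument from
# multiple security master files into a "golden copy". The source names,
# priority order, and field names are all hypothetical assumptions.

SOURCE_PRIORITY = ["central_master", "equities_master", "derivatives_master"]

def golden_copy(records):
    """Merge per-source records for a single instrument, field by field.

    `records` maps a master-file name to a dict of attribute values.
    For each attribute, the value from the highest-priority source
    that supplies a non-empty value wins.
    """
    merged = {}
    # Walk sources from lowest to highest priority, so values from
    # higher-priority sources overwrite those from lower-priority ones.
    for source in reversed(SOURCE_PRIORITY):
        for field, value in records.get(source, {}).items():
            if value is not None:
                merged[field] = value
    return merged

records = {
    "derivatives_master": {"isin": "XS0000000001", "currency": "USD"},
    "equities_master": {"isin": "XS0000000001", "currency": None,
                        "issuer": "Acme Corp"},
    "central_master": {"currency": "EUR"},
}
print(golden_copy(records))
# {'isin': 'XS0000000001', 'currency': 'EUR', 'issuer': 'Acme Corp'}
```

In this sketch the individual master files keep operating independently, but the merge rule is defined once, centrally, which is the governance model the respondents describe.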
To get your complimentary copy of the full results of this survey, visit www.a-teamgroup.com/research. The next survey in the series will focus on counterparty data, while the third and final survey will focus on positions data. To take part and make your views heard, email firstname.lastname@example.org.