The leading knowledge platform for the financial technology industry

A-Team Insight Blogs

Daiwa Securities’ Next Focus for Data Quality Measurement is on Derivatives, Says Pampathi


Following the establishment of its data quality measurement framework last year, Daiwa Securities is now looking to expand its data management functions to the area of derivatives, explained Vinay Pampathi, executive director of the firm’s technology division, to delegates to last week’s Marcus Evans Reference Data conference. “Over the next six months we will be actively looking to move into derivatives and this will provide an interesting challenge for reference data management and IT,” he said.

Pampathi explained the thinking behind the concept of data quality and how Daiwa Securities used it to structure its data management framework. The firm's data quality focus is therefore on content, consistency, completeness and uniqueness, and these are the metrics by which it is measured, he said.
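Two of those metrics lend themselves to a simple illustration. The sketch below shows how completeness and uniqueness might be scored against instrument records; the field names and sample data are invented for illustration and are not Daiwa's actual schema.

```python
from collections import Counter

# Hypothetical set of fields a record must populate to count as complete.
REQUIRED_FIELDS = ["isin", "name", "currency", "maturity"]

def completeness(record):
    """Fraction of required fields that are populated."""
    filled = sum(1 for f in REQUIRED_FIELDS if record.get(f))
    return filled / len(REQUIRED_FIELDS)

def uniqueness(records, key="isin"):
    """Fraction of records whose identifier appears exactly once."""
    counts = Counter(r[key] for r in records)
    unique = sum(1 for r in records if counts[r[key]] == 1)
    return unique / len(records)

records = [
    {"isin": "XS0001", "name": "Bond A", "currency": "JPY", "maturity": "2030-01-01"},
    {"isin": "XS0001", "name": "Bond A", "currency": "JPY", "maturity": "2030-01-01"},
    {"isin": "XS0002", "name": "Bond B", "currency": "USD", "maturity": None},
]

print(completeness(records[2]))  # 0.75 — maturity is missing
print(uniqueness(records))       # the duplicated XS0001 drags the score down
```

Scores like these can be trended over time per feed or per asset class, which is what makes them usable as measurement rather than one-off checks.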

“Gold copy is the starting point but you need to measure how the data is being used downstream and feed that back into the data management system in order to provide real value,” he told delegates.

As the firm is relatively small but covers a wide range of products, Pampathi explained that it was a significant challenge to run a “lean operation” with regards to maintaining data quality. To achieve consistency and completeness, firms must first understand what the end users want from the data, he elaborated. “It took us several attempts to get it right and it was one hell of a task to agree on a common data set across different user groups.”

Rules and validation also need to be in place, using workflows to provide checks and balances for the data, he continued. To avoid duplication of data, unique identifiers are needed across a financial institution: “This is easier for externally provided instrument data but internal data is more difficult.”
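The deduplication step this implies can be sketched in a few lines. Keying on the ISIN below is an assumption for illustration; internally created instruments, as Pampathi notes, are harder and would need a house-assigned identifier instead.

```python
def deduplicate(records, key):
    """Keep the first record seen for each unique identifier, drop the rest."""
    seen = set()
    kept = []
    for rec in records:
        if rec[key] not in seen:
            seen.add(rec[key])
            kept.append(rec)
    return kept

feed = [
    {"id": "XS0001", "source": "vendor_a"},
    {"id": "XS0001", "source": "vendor_b"},  # same instrument from a second vendor
    {"id": "XS0002", "source": "vendor_a"},
]
print(len(deduplicate(feed, "id")))  # 2 — the vendor_b duplicate is dropped
```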

Vendor data must be cross checked during the loading process via the introduction of rules. “This should be achieved in partnership with your data suppliers and communication is important in these relationships,” cautioned Pampathi. “It has been difficult to achieve clean data before the golden copy is created and we often have to eliminate duplicative records after the event.”
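A load-time cross-check of the kind described might look like the sketch below. The specific rules, thresholds and field names are assumptions for illustration; in practice they would be agreed with the data suppliers, as Pampathi recommends.

```python
def validate_on_load(record):
    """Return a list of rule violations; an empty list means the record loads."""
    errors = []
    # ISINs are 12 characters (ISO 6166); anything else is rejected up front.
    if not record.get("isin") or len(record["isin"]) != 12:
        errors.append("isin must be 12 characters")
    # Illustrative whitelist of currencies the firm trades.
    if record.get("currency") not in {"JPY", "USD", "EUR", "GBP"}:
        errors.append("unrecognised currency")
    # Prices, where supplied, must be positive.
    if record.get("price") is not None and record["price"] <= 0:
        errors.append("price must be positive")
    return errors

good = {"isin": "JP1234567890", "currency": "JPY", "price": 101.5}
bad = {"isin": "JP123", "currency": "XXX", "price": -1}
print(validate_on_load(good))       # [] — loads cleanly
print(len(validate_on_load(bad)))   # 3 violations, held back for review
```

Records failing the rules would typically be quarantined in a workflow queue rather than silently dropped, so the breaks can be reported back to the vendor.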

On the other hand, it is easier to ensure data integrity by achieving consistency across all systems using the data. “You need to ensure your data is stored in one place and you have a good distribution mechanism that is suitable for end users’ purposes but does not allow them to alter the gold copy,” he explained.

In order to ensure data accuracy you need to have a system that can compare data by looking at how it is used. “Is the data fit for purpose? You can have all the fields filled in incorrectly,” he warned. “The business only cares if it can trade and settle. One way of measuring accuracy is examining vendors’ data provision and reviewing the breaks and failures and feeding back that data into the system.”
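The feedback loop Pampathi outlines — scoring each vendor feed by the breaks and failures it causes — can be sketched as below. The event-log format is hypothetical.

```python
from collections import defaultdict

def vendor_break_rates(events):
    """events: (vendor, outcome) pairs, where outcome is 'settled' or 'break'.
    Returns the break rate per vendor feed."""
    totals = defaultdict(int)
    breaks = defaultdict(int)
    for vendor, outcome in events:
        totals[vendor] += 1
        if outcome == "break":
            breaks[vendor] += 1
    return {v: breaks[v] / totals[v] for v in totals}

events = [
    ("vendor_a", "settled"), ("vendor_a", "settled"), ("vendor_a", "break"),
    ("vendor_b", "settled"), ("vendor_b", "settled"),
]
rates = vendor_break_rates(events)
print(rates["vendor_a"])  # ≈0.33 — a candidate for the supplier conversation
print(rates["vendor_b"])  # 0.0
```

Feeding these rates back into the data management system turns “can it trade and settle?” into a measurable accuracy score per source.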

Daiwa has two different golden copies, said Pampathi, one for reference data and one for financial and trading activity. The data is distributed in real time or on a scheduled basis, dependent on the data involved, and the integration message framework can be used to view STP rates and failures within the data itself.

Rather than centralising data management, Daiwa opted to stick with silos and built rules around workflows for the data. “There are reports available to users to see how accurate the data is within these workflows,” he said. “There are also dashboards by which to monitor the end to end trade handling process by STP rates and performance indicators.”

Pampathi added: “We also do not distribute all the data to the end users, it is filtered by job function.”
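Filtering distribution by job function, as quoted above, amounts to an entitlement mask over the gold copy. The role-to-field mapping below is invented purely for illustration.

```python
# Hypothetical entitlement map: which fields each job function may receive.
ENTITLEMENTS = {
    "trader": {"isin", "name", "price"},
    "settlements": {"isin", "name", "ssi", "settlement_date"},
}

def distribute(record, role):
    """Send a role only the fields it is entitled to see; the gold copy
    itself is never handed out, so downstream users cannot alter it."""
    allowed = ENTITLEMENTS[role]
    return {k: v for k, v in record.items() if k in allowed}

gold = {"isin": "XS0001", "name": "Bond A", "price": 101.5, "ssi": "ACC-1"}
print(distribute(gold, "trader"))  # price visible, settlement instructions withheld
```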

The majority of the problems that crop up at the moment stem from inaccuracies in vendor data, especially corporate actions and pricing data, he explained. The lack of liquidity in the market is causing issues around pricing data, while corporate actions data has become more important following a change in trading strategy and a focus on proprietary trading within Daiwa, said Pampathi.

“Our next immediate area of focus is on the last stage of the trading lifecycle – the settlement stage – and getting that into the data management workflow. This would give us the final nail in the framework for measuring data accuracy,” he concluded.
