The knowledge platform for the financial technology industry

A-Team Insight Blogs

Daiwa Securities’ Next Focus for Data Quality Measurement is on Derivatives, Says Pampathi


Following the establishment of its data quality measurement framework last year, Daiwa Securities is now looking to extend its data management functions to derivatives, Vinay Pampathi, executive director of the firm’s technology division, told delegates at last week’s Marcus Evans Reference Data conference. “Over the next six months we will be actively looking to move into derivatives and this will provide an interesting challenge for reference data management and IT,” he said.

Pampathi explained the thinking behind the concept of data quality and how Daiwa Securities used it to structure its data management framework. The firm’s focus for data quality is on content, consistency, completeness and uniqueness, and these are the metrics by which it is measured, he said.
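Two of those dimensions lend themselves to simple quantitative scoring. The sketch below is illustrative only — the article gives no implementation details, and the required fields and the `isin` identifier are hypothetical choices — but it shows how completeness and uniqueness might be measured over a batch of instrument records:

```python
# Illustrative only: scoring a batch of security reference records on two of
# the quality dimensions named above, completeness and uniqueness.
# The required fields and the 'isin' identifier are hypothetical choices.

REQUIRED_FIELDS = ["isin", "name", "currency", "maturity"]

def completeness(records):
    """Share of required fields that are populated across all records."""
    filled = sum(
        1 for r in records for f in REQUIRED_FIELDS if r.get(f) not in (None, "")
    )
    return filled / (len(records) * len(REQUIRED_FIELDS))

def uniqueness(records, key="isin"):
    """Share of identifier values that are distinct."""
    ids = [r[key] for r in records if r.get(key)]
    return len(set(ids)) / len(ids) if ids else 0.0

sample = [
    {"isin": "XS0001", "name": "Bond A", "currency": "JPY", "maturity": "2030-01-01"},
    {"isin": "XS0001", "name": "Bond A", "currency": "JPY", "maturity": "2030-01-01"},
    {"isin": "XS0002", "name": "Bond B", "currency": "USD", "maturity": ""},
]
# 11 of 12 required fields populated; 2 distinct ISINs across 3 records.
```

In practice such scores would be computed per feed and per user group, so that the “common data set” Pampathi describes can be monitored against agreed thresholds.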

“Gold copy is the starting point but you need to measure how the data is being used downstream and feed that back into the data management system in order to provide real value,” he told delegates.

As the firm is relatively small but covers a wide range of products, Pampathi explained that running a “lean operation” while maintaining data quality is a significant challenge. To achieve consistency and completeness, firms must first understand what end users want from the data, he elaborated. “It took us several attempts to get it right and it was one hell of a task to agree on a common data set across different user groups.”

Rules and validation also need to be in place, making use of workflows to provide checks and balances for the data, he continued. To avoid duplication of data, unique identifiers are needed across a financial institution: “This is easier for externally provided instrument data but internal data is more difficult.”
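One common way to implement such checks — sketched here with hypothetical rules and field names, since the article does not describe Daiwa’s actual workflow — is to run each incoming record through a list of small rule functions and collect the failures:

```python
# Illustrative rule-based validation step in a data-loading workflow.
# The rules and field names are assumptions, not the firm's actual checks.

def rule_has_identifier(record):
    """Every record needs a unique external identifier."""
    return bool(record.get("isin")) or "missing identifier"

def rule_positive_price(record):
    """Prices must be strictly positive."""
    return (record.get("price") or 0) > 0 or "non-positive price"

RULES = [rule_has_identifier, rule_positive_price]

def validate(record):
    """Return the list of failed-rule messages; an empty list means it passes."""
    failures = []
    for rule in RULES:
        outcome = rule(record)
        if outcome is not True:
            failures.append(outcome)
    return failures

# A record missing its identifier with a bad price fails both rules:
# validate({"isin": "", "price": -5})
#   -> ["missing identifier", "non-positive price"]
```

Keeping each rule as a separate function makes the “checks and balances” auditable: a workflow can log exactly which rule rejected which record.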

Vendor data must be cross-checked during the loading process via the introduction of rules. “This should be achieved in partnership with your data suppliers and communication is important in these relationships,” cautioned Pampathi. “It has been difficult to achieve clean data before the golden copy is created and we often have to eliminate duplicative records after the event.”
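A load-time cross-check of this kind might, for example, compare the same field across two vendor feeds and flag disagreements before the golden copy is built. This is a hypothetical sketch — the feed shapes, the `isin` key and the tolerance are all assumptions:

```python
# Sketch: flag records where two vendor feeds disagree on a field by more
# than a tolerance. Feed layout and identifiers are illustrative only.

def cross_check(feed_a, feed_b, field="price", tolerance=0.01):
    """Yield (identifier, value_a, value_b) for each disagreement."""
    b_by_id = {r["isin"]: r for r in feed_b}
    for r in feed_a:
        other = b_by_id.get(r["isin"])
        if other is None:
            continue  # identifier present in only one feed
        a, b = r.get(field), other.get(field)
        if a is not None and b is not None and abs(a - b) > tolerance:
            yield r["isin"], a, b
```

Disagreements surfaced this way are exactly the material for the supplier conversations Pampathi recommends, rather than silent corrections after the golden copy exists.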

On the other hand, it is easier to ensure data integrity by achieving consistency across all systems using the data. “You need to ensure your data is stored in one place and you have a good distribution mechanism that is suitable for end users’ purposes but does not allow them to alter the gold copy,” he explained.

In order to ensure data accuracy, you need a system that can compare data by looking at how it is used. “Is the data fit for purpose? You can have all the fields filled in incorrectly,” he warned. “The business only cares if it can trade and settle. One way of measuring accuracy is examining vendors’ data provision and reviewing the breaks and failures and feeding back that data into the system.”
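That feedback loop can be made concrete as a per-vendor success rate over downstream break and failure events — again a sketch under an assumed event shape, not the firm’s actual system:

```python
# Sketch: derive a per-vendor accuracy signal from downstream trade events.
# The event shape ('vendor', 'failed') is a hypothetical simplification.
from collections import Counter

def accuracy_by_vendor(events):
    """Map each vendor to its share of events that did not break or fail."""
    totals, fails = Counter(), Counter()
    for event in events:
        totals[event["vendor"]] += 1
        if event["failed"]:
            fails[event["vendor"]] += 1
    return {v: 1 - fails[v] / totals[v] for v in totals}
```

Trending these rates over time gives the “fit for purpose” measure Pampathi describes: accuracy judged by whether the business could trade and settle, not by field-level inspection alone.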

Daiwa has two different golden copies, said Pampathi, one for reference data and one for financial and trading activity. The data is distributed in real time or on a scheduled basis, dependent on the data involved, and the integration message framework can be used to view STP rates and failures within the data itself.

Rather than centralising data management, Daiwa opted to stick with silos and built rules around workflows for the data. “There are reports available to users to see how accurate the data is within these workflows,” he said. “There are also dashboards by which to monitor the end-to-end trade handling process by STP rates and performance indicators.”

Pampathi added: “We also do not distribute all the data to the end users, it is filtered by job function.”

The majority of the problems that crop up at the moment stem from inaccuracies in vendor data, particularly corporate actions and pricing data, he explained. The lack of liquidity in the market is causing issues around pricing data, while corporate actions data has become more important following a change in trading strategy and a focus on proprietary trading within Daiwa, said Pampathi.

“Our next immediate area of focus is on the last stage of the trading lifecycle – the settlement stage – and getting that into the data management workflow. This would give us the final nail in the framework for measuring data accuracy,” he concluded.
