Daiwa Securities’ Next Focus for Data Quality Measurement is on Derivatives, Says Pampathi

Following the establishment of its data quality measurement framework last year, Daiwa Securities is now looking to expand its data management functions to the area of derivatives, explained Vinay Pampathi, executive director of the firm’s technology division, to delegates at last week’s Marcus Evans Reference Data conference. “Over the next six months we will be actively looking to move into derivatives and this will provide an interesting challenge for reference data management and IT,” he said.

Pampathi explained the ideas behind the concept of data quality and how Daiwa Securities used these to structure its data management framework. The firm’s focus for data quality is therefore on content, consistency, completeness and uniqueness, and these are the metrics by which it is measured, he said.
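
For illustration, the sketch below shows one way two of those metrics might be scored over a tabular reference data set. The column names and the pandas approach are assumptions for the example, not Daiwa’s actual schema or tooling.

```python
import pandas as pd

# Score a reference data set on two of the metrics named above:
# completeness (share of populated values) and uniqueness (no duplicate keys).
def quality_metrics(df: pd.DataFrame, key: str, required: list[str]) -> dict:
    completeness = {col: 1.0 - df[col].isna().mean() for col in required}
    uniqueness = df[key].nunique() / len(df)  # 1.0 means every key is unique
    return {"completeness": completeness, "uniqueness": uniqueness}

# Hypothetical instrument records with a duplicated ISIN and missing fields.
instruments = pd.DataFrame({
    "isin": ["XS0000000001", "XS0000000002", "XS0000000002", "XS0000000003"],
    "currency": ["JPY", "USD", "USD", None],
    "maturity": ["2030-01-01", None, None, "2028-06-30"],
})
print(quality_metrics(instruments, key="isin", required=["currency", "maturity"]))
# {'completeness': {'currency': 0.75, 'maturity': 0.5}, 'uniqueness': 0.75}
```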

“Gold copy is the starting point but you need to measure how the data is being used downstream and feed that back into the data management system in order to provide real value,” he told delegates.

As the firm is relatively small but has a wide range of product coverage, Pampathi explained that it was a significant challenge to run a “lean operation” while maintaining data quality. To achieve consistency and completeness, firms first have to understand what the end users want from the data, he elaborated. “It took us several attempts to get it right and it was one hell of a task to agree on a common data set across different user groups.”

Rules and validation also need to be in place, making use of workflows to provide checks and balances for the data, he continued. To avoid duplication of data, there need to be unique identifiers across a financial institution: “This is easier for externally provided instrument data but internal data is more difficult.”
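
A minimal sketch of such a validation-and-deduplication step appears below; the rules, fields and identifier scheme are invented for the example rather than taken from Daiwa’s workflow.

```python
import pandas as pd

# Illustrative validation rules: each returns a boolean Series marking valid rows.
RULES = {
    "currency": lambda s: s.isin(["JPY", "USD", "EUR", "GBP"]),
    "isin": lambda s: s.str.len() == 12,  # shape check only, not a checksum
}

def validate(df: pd.DataFrame) -> pd.DataFrame:
    """Flag rule breaches so a workflow can route records for review."""
    breaches = pd.DataFrame({name: ~rule(df[name]) for name, rule in RULES.items()})
    out = df.copy()
    out["needs_review"] = breaches.any(axis=1)
    return out

def deduplicate(df: pd.DataFrame, key: str) -> pd.DataFrame:
    """Keep one record per unique identifier, preferring the latest update."""
    return df.sort_values("updated_at").drop_duplicates(subset=key, keep="last")
```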

Vendor data must be cross-checked during the loading process via the introduction of rules. “This should be achieved in partnership with your data suppliers and communication is important in these relationships,” cautioned Pampathi. “It has been difficult to achieve clean data before the golden copy is created and we often have to eliminate duplicative records after the event.”
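
The kind of load-time cross-check described might look something like the sketch below, which flags disagreements between two supplier feeds for follow-up; the field names here are hypothetical.

```python
import pandas as pd

def cross_check(feed_a: pd.DataFrame, feed_b: pd.DataFrame,
                key: str, fields: list[str]) -> pd.DataFrame:
    """Return records on which two vendor feeds disagree."""
    merged = feed_a.merge(feed_b, on=key, suffixes=("_a", "_b"))
    mismatch = pd.Series(False, index=merged.index)
    for f in fields:
        mismatch |= merged[f + "_a"] != merged[f + "_b"]
    # Disagreements go back to the suppliers rather than into the gold copy.
    return merged[mismatch]
```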

On the other hand, it is easier to ensure data integrity by achieving consistency across all systems that use the data. “You need to ensure your data is stored in one place and you have a good distribution mechanism that is suitable for end users’ purposes but does not allow them to alter the gold copy,” he explained.
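
In code, the principle reduces to something like the toy class below: one authoritative store, with consumers handed copies they cannot use to mutate it. A real implementation would rely on database permissions or a messaging layer; this is only a sketch of the idea.

```python
import pandas as pd

class GoldCopy:
    """Single authoritative store; downstream users get read-only views."""

    def __init__(self, data: pd.DataFrame):
        self._data = data

    def view(self, columns: list[str]) -> pd.DataFrame:
        # Hand out a copy, so edits made downstream never touch the master.
        return self._data[columns].copy()
```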

In order to ensure data accuracy, you need to have a system that can compare data by looking at how it is used. “Is the data fit for purpose? You can have all the fields filled in incorrectly,” he warned. “The business only cares if it can trade and settle. One way of measuring accuracy is examining vendors’ data provision and reviewing the breaks and failures and feeding back that data into the system.”
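
That feedback loop could be as simple as the hypothetical sketch below, which scores each vendor by the settlement breaks its data is associated with; the trade fields are assumptions for the example.

```python
import pandas as pd

def vendor_break_rates(trades: pd.DataFrame) -> pd.Series:
    """Share of failed trades per data vendor, as a rough accuracy proxy."""
    failed = trades["status"] == "FAILED"
    return failed.groupby(trades["data_vendor"]).mean().sort_values(ascending=False)
```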

Daiwa has two different golden copies, said Pampathi, one for reference data and one for financial and trading activity. The data is distributed in real time or on a scheduled basis, depending on the data involved, and the integration message framework can be used to view STP rates and failures within the data itself.
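
A sketch of that routing decision, under the assumption that trading activity goes out in real time while reference data is batched, might look like this:

```python
# Hypothetical data types distributed in real time; everything else is batched.
REALTIME_TYPES = {"trade", "position"}

def route(update: dict, publish, enqueue_for_batch) -> None:
    """Push trading activity immediately; queue reference data for the scheduled run."""
    if update["type"] in REALTIME_TYPES:
        publish(update)
    else:
        enqueue_for_batch(update)
```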

Rather than centralising data management, Daiwa opted to stick with silos and built rules around workflows for the data. “There are reports available to users to see how accurate the data is within these workflows,” he said. “There are also dashboards by which to monitor the end-to-end trade handling process by STP rates and performance indicators.”
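
The STP rate such a dashboard reports is, at its simplest, the fraction of trades that pass through untouched, as in the sketch below; the column names are illustrative.

```python
import pandas as pd

def stp_rate(trades: pd.DataFrame) -> float:
    """Fraction of trades processed with no manual intervention."""
    return float((trades["manual_touches"] == 0).mean())

def stp_by_workflow(trades: pd.DataFrame) -> pd.Series:
    """Per-workflow STP rate, the kind of figure a dashboard would chart."""
    return (trades["manual_touches"] == 0).groupby(trades["workflow"]).mean()
```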

Pampathi added: “We also do not distribute all the data to the end users; it is filtered by job function.”
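
Filtering by job function can be as simple as a role-to-fields map applied at distribution time; the mapping below is invented for illustration.

```python
# Hypothetical mapping of job functions to the fields each one receives.
ROLE_FIELDS = {
    "trader": {"isin", "price", "currency"},
    "settlements": {"isin", "settlement_date", "account"},
}

def distribute(record: dict, role: str) -> dict:
    """Send a user group only the fields its job function needs."""
    allowed = ROLE_FIELDS.get(role, set())
    return {k: v for k, v in record.items() if k in allowed}
```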

The majority of the problems that crop up at the moment relate to inaccuracies in vendor data, especially corporate actions and pricing data, he explained. The lack of liquidity in the market is causing issues around pricing data, and corporate actions data has become more important following a change in trading strategy and a focus on proprietary trading within Daiwa, said Pampathi.

“Our next immediate area of focus is on the last stage of the trading lifecycle – the settlement stage – and getting that into the data management workflow. This would give us the final nail in the framework for measuring data accuracy,” he concluded.
