The knowledge platform for the financial technology industry

A-Team Insight Blogs

Pampapathi Advocates UPS Courier Model for Data Metrics Projects to FIMA 2008 Delegation


Institutions should follow the model of courier firms such as UPS when designing their data management frameworks, Dr Vinay Pampapathi, executive director of the technology division at Daiwa Securities SMBC Europe, told delegates at this morning’s opening session of FIMA 2008.

Using this model, institutions can track various data quality metrics across the organisation and guarantee delivery or, at least, identify where a problem has occurred and resolve it, he explained. This is, in fact, the model that Daiwa Securities has been developing over the last few years, and it is largely based on elements of “common sense”, he said.

“There are two main types of metrics to measure the success of data management projects: soft metrics such as anecdotal evidence and hard metrics such as facts and figures,” Pampapathi elaborated. “Hard metrics are much more useful in getting buy-in from management and these need to be built into the system to see where things are going wrong.”
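The kind of “hard metric” Pampapathi describes can be as simple as the share of records failing validation checks. The sketch below is purely illustrative; the field names and checks are assumptions, not Daiwa’s actual rules.

```python
# A minimal sketch of a "hard metric": the fraction of records failing
# at least one validation check. All field names and checks here are
# illustrative assumptions.

def failure_rate(records, checks):
    """Return the fraction of records that fail at least one check."""
    if not records:
        return 0.0
    failed = sum(1 for r in records if any(not check(r) for check in checks))
    return failed / len(records)

# Example checks: a price must be positive, an ISIN must be 12 characters.
checks = [
    lambda r: r.get("price", 0) > 0,
    lambda r: len(r.get("isin", "")) == 12,
]

records = [
    {"isin": "GB0002634946", "price": 101.5},   # passes both checks
    {"isin": "XS123", "price": 99.0},           # fails the ISIN check
    {"isin": "US0378331005", "price": -1.0},    # fails the price check
]

print(f"failure rate: {failure_rate(records, checks):.1%}")  # 66.7%
```

A figure like this, tracked over time, is the sort of number that demonstrates to management where things are going wrong.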

The two key areas for data management are consistency and integrity of data, according to Pampapathi, but functionality for the end user is paramount. “We need to be able to guarantee to the end users that consistency is guaranteed,” he said.

Once accuracy has been measured, procedures need to be established to review and resolve data quality issues when they occur, he explained. These performance indicators then become the metrics, and Daiwa has tried to find the numbers most useful both to end users and internally.

“We put a lot of effort into drawing up our integrated messaging framework and this allows us to collect information on the metrics we want,” he said. “We have created our own XML messaging standards from the grass roots and this is what our framework is built upon.”
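The details of Daiwa’s in-house XML standard are not public, but the idea of a message format whose header carries the metadata that quality checks log against can be sketched as follows. Every element and field name here is invented for illustration.

```python
# A hypothetical sketch of an integrated XML messaging format: each
# message carries identifying metadata in a header so that downstream
# consistency checks can log metrics against it. The schema below is
# invented for illustration and is not Daiwa's actual standard.
import xml.etree.ElementTree as ET

def build_message(msg_id, source, payload):
    """Serialise a payload dict as an XML message with a tracking header."""
    root = ET.Element("dataMessage", id=msg_id)
    header = ET.SubElement(root, "header")
    ET.SubElement(header, "source").text = source
    body = ET.SubElement(root, "body")
    for field, value in payload.items():
        ET.SubElement(body, field).text = str(value)
    return ET.tostring(root, encoding="unicode")

msg = build_message("MSG-001", "reference-data",
                    {"isin": "GB0002634946", "price": "101.5"})
print(msg)
```

Because every message shares the same envelope, metrics collection can be bolted onto the framework itself rather than onto each application.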

Pampapathi reckons that message logging and process audit are the most important parts of data management, much like parcel tracking for couriers. However, this has to be standardised across an institution, so that developers follow the same way of working when putting in place consistency and quality control checks.

“We made sure our developers followed the same templates and guidelines to ensure the same checks were in place across our framework,” he said. “Now that these metrics can be measured, we can see where the process is being slowed down.”
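The parcel-tracking analogy suggests a simple mechanism: every processing stage logs a timestamped checkpoint per message, and the gaps between consecutive checkpoints show where the process is being slowed down. The stage names and timings below are illustrative assumptions.

```python
# A sketch of parcel-style message tracking: each stage logs a timestamped
# checkpoint, and stage-to-stage latencies reveal where the pipeline slows
# down. Stage names and timings are illustrative assumptions.
from collections import defaultdict

checkpoints = defaultdict(list)  # message id -> [(stage, timestamp), ...]

def log_checkpoint(msg_id, stage, timestamp):
    checkpoints[msg_id].append((stage, timestamp))

def stage_latencies(msg_id):
    """Seconds spent between consecutive checkpoints for one message."""
    points = checkpoints[msg_id]
    return {
        f"{a[0]} -> {b[0]}": b[1] - a[1]
        for a, b in zip(points, points[1:])
    }

# One message passing through a hypothetical pipeline (times in seconds).
log_checkpoint("MSG-001", "received", 0.0)
log_checkpoint("MSG-001", "validated", 0.4)
log_checkpoint("MSG-001", "enriched", 5.2)   # the slow stage stands out
log_checkpoint("MSG-001", "published", 5.5)

print(stage_latencies("MSG-001"))
```

Standardised templates matter here: the checkpoints are only comparable across systems if every developer logs them the same way.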

Institutions can respond more quickly to problems in the process in this way, contended Pampapathi. “We have also established a team dedicated to looking at these processes and fixing the problems via an iterative approach.”

As a result, there is no requirement for a large reconciliation system to be put in place and Daiwa has been able to refocus resources on resolving immediate issues, he told delegates. Typical failure rates for systems in the industry are around 6-15%, but Pampapathi said that Daiwa’s rates are under 6%, which he attributed to its data management approach. “This should improve further still, as we are still in the process of building our financial database, which should be completed in six months’ time,” he added.

The future will also see Daiwa tackling complex event processing and focusing on improving customer service by using data to tailor services more effectively, he concluded.
