About a-team Marketing Services
The knowledge platform for the financial technology industry

A-Team Insight Blogs

Pampapathi Advocates UPS Courier Model for Data Metrics Projects to FIMA 2008 Delegation


Institutions should follow the model of courier firms such as UPS when designing their data management frameworks, Dr Vinay Pampapathi, executive director of the technology division at Daiwa Securities SMBC Europe, told delegates at this morning’s opening session of FIMA 2008.

Using this model, an institution can track data quality metrics across the organisation, guarantee delivery or, at least, identify where a problem has occurred and resolve it, he explained. This is, in fact, the model Daiwa Securities has been developing over the last few years, and it is largely based on elements of “common sense”, he said.
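Pampapathi did not go into implementation detail, but the courier analogy can be sketched roughly in code. The hypothetical Python example below (not Daiwa’s actual design) shows a data record that is “scanned” at each stage it passes through, so a failed delivery can be traced back to the stage where it broke down:

```python
from dataclasses import dataclass, field
from datetime import datetime, timezone
from typing import List, Optional

@dataclass
class Checkpoint:
    """One 'scan' of a data item as it passes through a system."""
    stage: str          # e.g. "vendor-feed", "validation", "distribution"
    timestamp: datetime
    ok: bool
    detail: str = ""

@dataclass
class TrackedRecord:
    """A data item that carries its own delivery history, courier-style."""
    record_id: str
    checkpoints: List[Checkpoint] = field(default_factory=list)

    def scan(self, stage: str, ok: bool = True, detail: str = "") -> None:
        """Record that the item reached a stage, successfully or not."""
        self.checkpoints.append(
            Checkpoint(stage, datetime.now(timezone.utc), ok, detail))

    def first_failure(self) -> Optional[Checkpoint]:
        """Return the first stage at which delivery broke down, if any."""
        return next((c for c in self.checkpoints if not c.ok), None)

# A record that fails validation is immediately traceable to that stage.
rec = TrackedRecord("ISIN GB0002634946")
rec.scan("vendor-feed")
rec.scan("validation", ok=False, detail="missing maturity date")
print(rec.first_failure())
```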

“There are two main types of metrics to measure the success of data management projects: soft metrics such as anecdotal evidence and hard metrics such as facts and figures,” Pampapathi elaborated. “Hard metrics are much more useful in getting buy-in from management, and these need to be built into the system to see where things are going wrong.”

The two key areas for data management are consistency and integrity of data, according to Pampapathi, but functionality for the end user is paramount. “We need to be able to guarantee to the end users that consistency is guaranteed,” he said.

Once accuracy has been measured, procedures need to be established to review and resolve data quality issues when they occur, he explained. These performance indicators then become metrics, and Daiwa has tried to find the numbers that are most useful both to end users and internally.
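One way to turn such accuracy checks into the hard metrics Pampapathi describes is to aggregate pass/fail outcomes per feed or stage into simple rates; the Python sketch below is illustrative only, with hypothetical stage names:

```python
from collections import Counter
from typing import Dict, Iterable, Tuple

def pass_rates(results: Iterable[Tuple[str, bool]]) -> Dict[str, float]:
    """Aggregate (stage, passed) validation outcomes into a pass rate per stage."""
    passed: Counter = Counter()
    total: Counter = Counter()
    for stage, ok in results:
        total[stage] += 1
        if ok:
            passed[stage] += 1
    return {stage: passed[stage] / total[stage] for stage in total}

# Hypothetical outcomes; the rates are the 'hard' numbers put in front of management.
outcomes = [("pricing", True), ("pricing", False), ("reference", True), ("reference", True)]
print(pass_rates(outcomes))   # {'pricing': 0.5, 'reference': 1.0}
```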

“We put a lot of effort into drawing up our integrated messaging framework and this allows us to collect information on the metrics we want,” he said. “We have created our own XML messaging standards from the grass roots and this is what our framework is built upon.”
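Pampapathi gave no detail of Daiwa’s XML standard, but the idea of a message envelope that carries the audit fields from which metrics are collected might look something like the following illustrative fragment (element names are assumptions, not Daiwa’s schema):

```python
import xml.etree.ElementTree as ET

# Hypothetical message envelope; element names are illustrative, not Daiwa's standard.
message = """\
<dataMessage id="msg-42" source="vendor-feed" destination="risk-db">
  <audit>
    <sentAt>2008-11-12T09:15:00Z</sentAt>
    <receivedAt>2008-11-12T09:15:03Z</receivedAt>
    <qualityChecksPassed>true</qualityChecksPassed>
  </audit>
  <payload>
    <instrument isin="GB0002634946" price="101.25" currency="GBP"/>
  </payload>
</dataMessage>
"""

root = ET.fromstring(message)
audit = root.find("audit")
# The framework can pull whatever metrics it wants from the envelope fields.
print(root.get("source"), "->", root.get("destination"),
      "| checks passed:", audit.findtext("qualityChecksPassed"))
```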

Pampapathi reckons that message logging and process audit are the most important parts of data management, much like parcel tracking is for couriers. However, this has to be standardised across an institution, so that developers follow the same way of working when putting consistency and quality control checks in place.
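A common way of imposing that kind of uniformity, sketched here illustratively rather than from Daiwa’s actual templates, is to give developers a single wrapper that every processing step must use, so the same audit log entries and checks appear everywhere:

```python
import functools
import logging
import time

logging.basicConfig(level=logging.INFO, format="%(name)s: %(message)s")
log = logging.getLogger("process-audit")

def audited_step(stage: str):
    """Standard template: every step logs its stage, duration and outcome the same way."""
    def decorator(func):
        @functools.wraps(func)
        def wrapper(*args, **kwargs):
            start = time.perf_counter()
            try:
                result = func(*args, **kwargs)
                log.info("%s OK in %.3fs", stage, time.perf_counter() - start)
                return result
            except Exception:
                log.info("%s FAILED in %.3fs", stage, time.perf_counter() - start)
                raise
        return wrapper
    return decorator

@audited_step("price-validation")
def validate_price(price: float) -> float:
    """A consistency check written against the shared template."""
    if price <= 0:
        raise ValueError("non-positive price")
    return price

validate_price(101.25)   # logs: process-audit: price-validation OK in 0.000s
```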

“We made sure our developers followed the same templates and guidelines to ensure the same checks were in place across our framework,” he said. “Now that these metrics can be measured, we can see where the process is being slowed down.”
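With checkpoints logged in a standard form, seeing where the process is slowed down comes down to comparing the time between consecutive stages; a minimal sketch, again with hypothetical stage names and timestamps:

```python
from datetime import datetime
from typing import Dict, List, Tuple

def slowest_stage(checkpoints: List[Tuple[str, datetime]]) -> Tuple[str, float]:
    """Given ordered (stage, timestamp) checkpoints for one item, return the
    stage that took longest to reach from the previous one, in seconds."""
    gaps: Dict[str, float] = {}
    for (_, prev_time), (stage, this_time) in zip(checkpoints, checkpoints[1:]):
        gaps[stage] = (this_time - prev_time).total_seconds()
    return max(gaps.items(), key=lambda kv: kv[1])

# Hypothetical audit trail for one record.
trail = [
    ("vendor-feed",  datetime(2008, 11, 12, 9, 15, 0)),
    ("validation",   datetime(2008, 11, 12, 9, 15, 3)),
    ("distribution", datetime(2008, 11, 12, 9, 16, 41)),
]
print(slowest_stage(trail))   # ('distribution', 98.0)
```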

Institutions can be more reactive to problems in the process in this way, contended Pampapathi. “We have also established a team dedicated to looking at these processes and fixing the problems via an iterative approach.”

As a result, there is no requirement for a large reconciliation system to be put in place and Daiwa has been able to refocus resources on resolving immediate issues, he told delegates. Typical failure rates for systems in the industry are around 6-15%, but Pampapathi said that Daiwa’s rates are under 6%, which he attributed to its data management approach. “This should improve further still, as we are still in the process of building our financial database, which should be completed in six months’ time,” he added.

The future will also see Daiwa tackling complex event processing and focusing on improving customer service by using data to tailor services more effectively, he concluded.


Related content

WEBINAR

Recorded Webinar: Best practices for creating an effective data quality control framework

Data quality is critical to capital markets processes from identifying counterparties to building customer relationships, regulatory reporting, and ultimately improving the bottom line. It can also be extremely difficult to achieve. One solution is a data quality control framework that includes an automated and systematic process that monitors the state of data quality and ensures...

BLOG

Bloomberg Makes Data License Content Available Natively on Google Cloud

Bloomberg has extended its relationship with Google Cloud by making its Data License content available on the cloud, a move designed to enable customers to receive content natively and reduce time taken to integrate data and derive insights. Data License customers can receive Bloomberg OneData content, including reference, pricing, regulatory data, research, corporate actions, and...

EVENT

FinCrime Tech Briefing, New York

RegTech Insight (from A-Team Group) is proud to announce the launch of its FinCrime Tech Briefing taking place in both London and New York this summer and focusing on RegTech for AML and Financial Crime Compliance.

GUIDE

The Data Management Implications of Solvency II

This special report accompanies a webinar we held on the popular topic of The Data Management Implications of Solvency II, discussing the data implications for asset managers and their custodians and asset servicers. You can register here to get immediate access to the Special Report.