

Pampapathi Advocates UPS Courier Model for Data Metrics Projects to FIMA 2008 Delegation


Institutions should follow the model of courier firms such as UPS when designing their data management frameworks, Dr Vinay Pampapathi, executive director of the technology division at Daiwa Securities SMBC Europe, told delegates at this morning’s opening session of FIMA 2008.

Using this model, an institution can track data quality metrics across the organisation and either guarantee delivery or, at least, identify where a problem has occurred and resolve it, he explained. This is, in fact, the model that Daiwa Securities has been developing over the last few years, and it is largely based on elements of “common sense”, he said.
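
To make the parcel-tracking analogy concrete, the sketch below shows how a data record might carry a tracking identifier and log a checkpoint at each processing stage, so that delivery can be confirmed or the point of failure located. The names and structure are invented for illustration; Daiwa’s actual framework is not public.

```python
# Illustrative sketch only: a courier-style "tracking number" for a data record,
# with a checkpoint logged at each processing stage so a failure can be located.
# All names here are hypothetical, not Daiwa's actual system.
import uuid
from dataclasses import dataclass, field
from datetime import datetime, timezone

@dataclass
class Checkpoint:
    stage: str           # e.g. "vendor_feed", "normalisation", "distribution"
    status: str          # "ok" or "failed"
    timestamp: datetime
    detail: str = ""

@dataclass
class TrackedRecord:
    payload: dict
    tracking_id: str = field(default_factory=lambda: uuid.uuid4().hex)
    checkpoints: list = field(default_factory=list)

    def log(self, stage: str, status: str, detail: str = "") -> None:
        self.checkpoints.append(
            Checkpoint(stage, status, datetime.now(timezone.utc), detail)
        )

    def first_failure(self):
        """Return the checkpoint where the record first went wrong, if any."""
        return next((c for c in self.checkpoints if c.status == "failed"), None)

# Usage: each stage logs a checkpoint, so "delivery" is either confirmed
# or the point of failure is immediately identifiable.
record = TrackedRecord(payload={"isin": "XS0123456789", "price": 101.25})
record.log("vendor_feed", "ok")
record.log("normalisation", "failed", detail="currency code missing")
problem = record.first_failure()
if problem:
    print(f"{record.tracking_id}: stuck at {problem.stage} ({problem.detail})")
```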

“There are two main types of metrics to measure the success of data management projects: soft metrics such as anecdotal evidence and hard metrics such as facts and figures,” Pampapathi elaborated. “Hard metrics are much more useful in getting buy-in from management, and these need to be built into the system to see where things are going wrong.”

The two key areas for data management are consistency and integrity of data, according to Pampapathi, but functionality for the end user is paramount. “We need to be able to guarantee to the end users that consistency is guaranteed,” he said.

Once accuracy has been measured, procedures need to be established to review and resolve data quality issues when they occur, he explained. Performance indicators then become metrics, and Daiwa has tried to find the numbers that are most useful both for end users and for internal teams.

“We put a lot of effort into drawing up our integrated messaging framework and this allows us to collect information on the metrics we want,” he said. “We have created our own XML messaging standards from the grass roots and this is what our framework is built upon.”
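
A minimal sketch of that idea, assuming an invented envelope schema rather than Daiwa’s own XML standards, might wrap each payload in a header carrying the source, stage and timestamp from which metrics can later be collected:

```python
# Sketch of a home-grown XML message envelope that carries tracking metadata
# alongside the payload, so metrics can be collected as messages flow through
# the framework. Element names are invented; Daiwa's actual schema is not public.
import xml.etree.ElementTree as ET
from datetime import datetime, timezone

def wrap_message(payload_xml: ET.Element, source: str, stage: str) -> ET.Element:
    envelope = ET.Element("dataMessage")
    header = ET.SubElement(envelope, "header")
    ET.SubElement(header, "source").text = source
    ET.SubElement(header, "stage").text = stage
    ET.SubElement(header, "sentAt").text = datetime.now(timezone.utc).isoformat()
    body = ET.SubElement(envelope, "body")
    body.append(payload_xml)
    return envelope

# Usage: wrap an instrument record and print the serialised message.
instrument = ET.Element("instrument")
ET.SubElement(instrument, "isin").text = "XS0123456789"
message = wrap_message(instrument, source="vendor_feed", stage="normalisation")
print(ET.tostring(message, encoding="unicode"))
```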

Pampapathi reckons that message logging and process audit are the most important parts of data management, much like parcel tracking for couriers. However, these have to be standardised across the institution, so that developers follow the same way of working when putting consistency and quality control checks in place.

“We made sure our developers followed the same templates and guidelines to ensure the same checks were in place across our framework,” he said. “Now that these metrics can be measured, we can see where the process is being slowed down.”
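
One way such a shared template could look in practice, purely as an illustration with assumed names rather than Daiwa’s actual code, is a decorator that times each processing step and records whether its quality check passed, so that slow or failing stages show up in the metrics:

```python
# Hypothetical illustration of a shared "template" for processing steps: a
# decorator every developer applies, so the same quality check and timing
# metric is recorded for each stage. Names and structure are assumptions.
import time
from functools import wraps

METRICS = []  # in practice this would feed a metrics store, not a list

def tracked_stage(stage_name, quality_check):
    """Wrap a processing step so it is timed and its output is checked."""
    def decorator(func):
        @wraps(func)
        def wrapper(record):
            start = time.perf_counter()
            result = func(record)
            elapsed = time.perf_counter() - start
            passed = quality_check(result)
            METRICS.append({"stage": stage_name, "seconds": elapsed, "ok": passed})
            return result
        return wrapper
    return decorator

# Usage: a normalisation step that must always produce a currency code.
@tracked_stage("normalisation", quality_check=lambda r: bool(r.get("currency")))
def normalise(record):
    record["currency"] = record.get("currency", "").upper() or "GBP"
    return record

normalise({"isin": "XS0123456789", "currency": "gbp"})
print(METRICS)  # per-stage timing and pass/fail, highlighting slow stages
```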

This approach allows institutions to respond more quickly to problems in the process, contended Pampapathi. “We have also established a team dedicated to looking at these processes and fixing the problems via an iterative approach.”

As a result, there is no requirement for a large reconciliation system to be put in place and Daiwa has been able to refocus resources on resolving immediate issues, he told delegates. Typical failure rates for systems in the industry are around 6-15%, but Pampapathi said that Daiwa’s rates are under 6%, which he attributed to its data management approach. “This should improve further still, as we are still in the process of building our financial database, which should be completed in six months’ time,” he added.

The future will also see Daiwa tackling complex event processing and focusing on improved customer service by using data to tailor services more effectively, he concluded.

