
Pampapathi Advocates UPS Courier Model for Data Metrics Projects to FIMA 2008 Delegation

Institutions should follow the model of courier firms such as UPS when designing their data management frameworks, Dr Vinay Pampapathi, executive director of the technology division at Daiwa Securities SMBC Europe, told delegates at this morning’s opening session of FIMA 2008.

Using this model, an institution can track data quality metrics across the organisation, guarantee delivery or, at the least, identify where a problem has occurred and resolve it, he explained. This is, in fact, the model Daiwa Securities has been developing over the last few years, and it is largely based on elements of “common sense”, he said.
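The parallel with parcel tracking can be made concrete. The sketch below, in Python, shows the kind of record such a framework might keep for each data item as it moves between systems, with each checkpoint acting like a courier’s scan. The stage names and fields are illustrative assumptions for this article, not Daiwa’s actual schema.

```python
from dataclasses import dataclass, field
from datetime import datetime, timezone
from typing import List, Optional

@dataclass
class TrackingEvent:
    """One 'scan' of a data item as it passes a checkpoint, like a parcel scan."""
    stage: str                    # e.g. "vendor_feed", "validation", "distribution"
    timestamp: datetime
    ok: bool
    detail: Optional[str] = None  # reason recorded when a check fails

@dataclass
class TrackedItem:
    """A data item (e.g. a security master record) with its delivery history."""
    item_id: str
    events: List[TrackingEvent] = field(default_factory=list)

    def record(self, stage: str, ok: bool, detail: Optional[str] = None) -> None:
        self.events.append(TrackingEvent(stage, datetime.now(timezone.utc), ok, detail))

    def delivered(self) -> bool:
        """Delivery can only be 'guaranteed' if every checkpoint reported success."""
        return bool(self.events) and all(e.ok for e in self.events)

    def first_failure(self) -> Optional[TrackingEvent]:
        """Identify where a problem occurred, so it can be resolved at that stage."""
        return next((e for e in self.events if not e.ok), None)
```

Aggregating these per-item records is what turns simple delivery tracking into the hard metrics Pampapathi describes below.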

“There are two main types of metrics to measure the success of data management projects: soft metrics such as anecdotal evidence and hard metrics such as facts and figures,” Pampapathi elaborated. “Hard metrics are much more useful in getting buy-in from management and these need to be built into the system to see where things are going wrong.”

The two key areas for data management are consistency and integrity of data, according to Pampapathi, but functionality for the end user is paramount. “We need to be able to guarantee to the end users that consistency is guaranteed,” he said.

Once accuracy has been measured, procedures need to be established to review and resolve data quality issues when they occur, he explained. Performance indicators then become metrics, and Daiwa has tried to identify the numbers that are most useful both to end users and internally.

“We put a lot of effort into drawing up our integrated messaging framework and this allows us to collect information on the metrics we want,” he said. “We have created our own XML messaging standards from the grass roots and this is what our framework is built upon.”
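As a rough illustration of how such a messaging framework can carry the information needed for metrics, the Python sketch below wraps a business payload in an envelope with audit fields that each system appends to as the message passes through. The element names and structure are assumptions made for this example only, not Daiwa’s actual XML standard.

```python
import xml.etree.ElementTree as ET
from datetime import datetime, timezone

def build_message(payload: ET.Element, source: str, message_id: str) -> ET.Element:
    """Wrap a business payload in an envelope carrying audit/metric fields."""
    envelope = ET.Element("message", attrib={"id": message_id})
    audit = ET.SubElement(envelope, "audit")
    ET.SubElement(audit, "source").text = source
    ET.SubElement(audit, "createdAt").text = datetime.now(timezone.utc).isoformat()
    envelope.append(payload)
    return envelope

def add_checkpoint(envelope: ET.Element, system: str, status: str) -> None:
    """Each system the message passes through appends its own checkpoint entry."""
    audit = envelope.find("audit")
    checkpoint = ET.SubElement(audit, "checkpoint",
                               attrib={"system": system, "status": status})
    checkpoint.set("at", datetime.now(timezone.utc).isoformat())

# A downstream collector can then read every <checkpoint> element to compute
# throughput, latency and failure counts per system.
```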

Pampapathi reckons that message logging and process audit are the most important parts of data management, much like parcel tracking is for couriers. However, this has to be standardised across an institution, so that developers follow the same way of working when putting consistency and quality control checks in place.

“We made sure our developers followed the same templates and guidelines to ensure the same checks were in place across our framework,” he said. “Now that these metrics can be measured, we can see where the process is being slowed down.”
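To give a sense of how a shared template can make such checks uniform and the slow points visible, here is a hedged Python sketch of a decorator that every processing stage could apply; the logging destination, field names and the example check are illustrative assumptions rather than Daiwa’s actual guidelines.

```python
import functools
import logging
import time

logging.basicConfig(level=logging.INFO, format="%(asctime)s %(message)s")
log = logging.getLogger("process_audit")

def audited_stage(stage_name: str):
    """Shared template: every stage logs its outcome and elapsed time in the
    same format, so the same metrics can be collected across the framework."""
    def decorator(func):
        @functools.wraps(func)
        def wrapper(*args, **kwargs):
            start = time.perf_counter()
            try:
                result = func(*args, **kwargs)
                log.info("stage=%s status=ok elapsed_ms=%.1f",
                         stage_name, (time.perf_counter() - start) * 1000)
                return result
            except Exception:
                log.info("stage=%s status=failed elapsed_ms=%.1f",
                         stage_name, (time.perf_counter() - start) * 1000)
                raise
        return wrapper
    return decorator

@audited_stage("price_validation")
def validate_prices(prices):
    # A consistency/quality check written against the same template as other stages.
    return [p for p in prices if p > 0]
```

Because every stage emits the same fields, the elapsed times can be compared across the whole chain to show exactly where the process is being slowed down.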

In this way, institutions can react more quickly to problems in the process, contended Pampapathi. “We have also established a team dedicated to looking at these processes and fixing the problems via an iterative approach.”

As a result, there is no requirement for a large reconciliation system to be put in place and Daiwa has been able to refocus resources on resolving immediate issues, he told delegates. Typical failure rates for systems in the industry are around 6-15%, but Pampapathi said that Daiwa’s rates are under 6%, which he attributed to its data management approach. “This should improve further still, as we are still in the process of building our financial database, which should be completed in six months’ time,” he added.

The future will also see Daiwa tackling complex event processing and focusing on improving customer service by using data to tailor services more effectively, he concluded.
