
FIMA 08: Make Reference Data Relevant To Front Office With Clear Metrics


Given the current financial crisis, it was unsurprising that this year’s FIMA conference – held in London from 11 to 13 November – focused on surviving the downturn by employing better data management strategies. Although attendance was down and several familiar faces, such as Citi’s John Bottega, were absent, the mood remained cautiously optimistic. Data management, it seems, has been thrust into the spotlight as a result of increased regulatory scrutiny and the spectre of reputational risk.

However, delegates were warned that for data management to get the C-level attention it deserves, they must define clear metrics against which to measure the success of these projects. Moreover, downstream impacts must be fully considered if those projects are to be relevant to the front office and senior management.

To this end, if you’re going to take a federated approach to data management in order to solve issues such as regulatory compliance, Peter Giordano of Oppenheimer & Co advises that you make your business case tangible. This can be done by including such things as “neat analytics” or dashboards for the front office, in order to help management understand the tangible results that can be gained, and thus secure funding, he told delegates.

Giordano, who is executive director of institutional equities at Oppenheimer, illustrated the point with specific business cases at his firm. For example, Oppenheimer leveraged two years’ worth of compliance efforts, which had yielded key data elements such as legal entity data and hierarchical structures, to generate client profitability analytics. “The costs of execution by client, or looking at settlement costs by client, may be disparate areas but when rolled up can become a very powerful tool,” he told delegates.

Another example he cited was the work done to support the Order Audit Trail System (OATS), which yielded valuable data that enabled the firm to view and adapt trader behaviour, monitor smart order routing techniques, use the information to negotiate with vendors, and improve profitability. For example, when looking at its order routing, the firm realised that three exchanges that were expensive to maintain connections to carried less than 1% of its order flow. A decision was made to ‘kill’ the connectivity, resulting in significant savings for the firm.
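That kind of routing analysis lends itself to simple tooling. The sketch below is purely illustrative – the venue names, connection costs and the 1% threshold are invented, and real OATS data is far richer than this – but it shows how order-flow share per venue might be rolled up and expensive, low-volume connections flagged for review.

```python
# Illustrative sketch only: roll up order flow by venue from OATS-style
# records and flag expensive connections that carry very little flow.
# Venue names, costs and the 1% threshold are hypothetical.
from collections import Counter

def flag_low_value_venues(orders, annual_connection_cost, flow_threshold=0.01):
    """Return venues whose share of routed orders falls below the threshold."""
    routed = Counter(order["venue"] for order in orders)
    total = sum(routed.values())
    flagged = []
    for venue, cost in annual_connection_cost.items():
        share = routed.get(venue, 0) / total if total else 0.0
        if share < flow_threshold:
            flagged.append({"venue": venue, "flow_share": share, "annual_cost": cost})
    return flagged

# Hypothetical sample: EXCH_C carries 0.5% of flow but still costs 95,000 a year.
orders = [{"venue": "EXCH_A"}] * 980 + [{"venue": "EXCH_B"}] * 15 + [{"venue": "EXCH_C"}] * 5
costs = {"EXCH_A": 120_000, "EXCH_B": 80_000, "EXCH_C": 95_000}
for candidate in flag_low_value_venues(orders, costs):
    print(f"{candidate['venue']}: {candidate['flow_share']:.2%} of flow, "
          f"{candidate['annual_cost']:,} per year - review connectivity")
```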

He said: “You need salesmanship to get funding for enterprise data management projects.” He suggested that institutions should look not just at the specific problem they’re solving but the long term goal and spell out the tangible results along the way. “Sometimes, however, you have to leave the complex or great ideas that develop through the project on the cutting room floor in order to deliver on time and on budget for senior management,” he added.

Once you’ve delivered your project, be sure to go back to the business case you made and ask if you’ve delivered on all of it. “It’s important to look at what was promised,” he said; otherwise it’s unlikely you’ll get support for future projects. It is also useful to open up the service to other business groups, even if they didn’t participate, as you can provide them with usable tools and widen buy-in for the project.

This theme was reprised by Michael Eldridge, head of European operations control at WestLB, who suggested that in order to receive management buy-in, data management teams need to treat the front office as a client rather than as an enemy. “We need to dispel the myth that centralised data management is a luxury, build a better business case for investment by measuring the real cost of bad data and treat the front office as a client and better meet their needs,” he told FIMA 2008 delegates.

There needs to be a cultural shift in data management in order to more effectively approach senior management when attempting to get funding. Data management has traditionally been overlooked and understated, explained Eldridge, and this will continue if no action is taken to deal with underlying issues. “We unknowingly contribute to the myth that data centralisation and automation is a luxury by being too focused on the technical aspects of data. We need to instead focus on how data contributes to a firm’s P&L,” he suggested.

The real cost of bad data to institutions is measurable, he argued, citing examples such as the downstream impact of bad data in futures and options rules, which could result in missed options, and inaccurate data in currency and settlement accounts, which could result in late payments. “We need to look at what we provide and its measurable impact on the front office,” he added.

“The real areas of interest to senior management and the front office are levels of costs of exception management and of errors and claims,” he said. “We have not traditionally looked at our error rates in the same manner as other parts of the business, but this needs to be done.”

Eldridge recommended the use of a simple formula to achieve management buy-in: produce measures of technical structures (such as data volumes) plus operational risks (such as error rates), which equals data’s value to an organisation. “Another key part of this is treating the front office as you would an external client,” he said. “In order to better service them, we need to understand them.”
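Read literally, Eldridge’s formula combines a volume measure with an operational-risk measure to express what the data function is worth. A minimal sketch of that combination follows; the record counts, exception numbers and per-exception cost are invented for illustration, and any real weighting would need to be calibrated against a firm’s own P&L impact of bad data.

```python
# A minimal sketch of the "volumes plus error rates" measure Eldridge describes.
# All figures below are invented; a real version would be calibrated against
# the firm's own cost of exceptions, errors and claims.
def data_value_report(records_processed, exceptions, avg_cost_per_exception):
    """Combine a volume measure with an operational-risk measure."""
    error_rate = exceptions / records_processed if records_processed else 0.0
    return {
        "records_processed": records_processed,   # technical structure: volume
        "error_rate": error_rate,                  # operational risk
        "estimated_exception_cost": exceptions * avg_cost_per_exception,
    }

print(data_value_report(records_processed=2_500_000, exceptions=4_300,
                        avg_cost_per_exception=35.0))
```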

Rather than using technical jargon and confusing front office staff, data management teams should act as the vendors to their front office clients. Eldridge acknowledged that this would represent a significant attitude shift but warned that delegates must take heed in order to push data management projects up the corporate agenda.

Metrics are key in this equation, and this was the theme of FIMA 2008’s focus day. Institutions should follow the model of courier firms such as UPS when designing their data management frameworks, Dr Vinay Pampapathi, executive director of the technology division at Daiwa Securities SMBC Europe, suggested to attendees of the focus day. Using this model, institutions are able to track various data quality metrics across an institution, guarantee delivery or, at least, identify where a problem has occurred and resolve it, he explained. This is, in fact, the model that Daiwa Securities has been developing over the last few years and it is largely based on elements of “common sense”, he said.

“There are two main types of metrics to measure the success of data management projects: soft metrics such as anecdotal evidence and hard metrics such as facts and figures,” Pampapathi elaborated. “Hard metrics are much more useful in getting buy-in from management and these need to be built in to the system to see where things are going wrong.”

The two key areas for data management are consistency and integrity of data, according to Pampapathi, but functionality for the end user is paramount. “We need to be able to guarantee to the end users that consistency is guaranteed,” he said.

Once accuracy has been measured, procedures need to be established to review and resolve issues with data quality when they occur, he explained. Performance indicators then become metrics, and Daiwa has tried to identify the numbers that are most useful both to end users and internally.

“We put a lot of effort into drawing up our integrated messaging framework and this allows us to collect information on the metrics we want,” he said. “We have created our own XML messaging standards from the grass roots and this is what our framework is built upon.”

Pampapathi reckons that message logging and process audit are the most important parts of data management, much like parcel tracking for couriers. However, this has to be standardised across an institution, so that developers follow the same way of working when putting in place consistency and quality control checks.

“We made sure our developers followed the same templates and guidelines to ensure the same checks were in place across our framework,” he said. “Now that these metrics can be measured, we can see where the process is being slowed down.”
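The “parcel tracking” analogy translates naturally into a shared audit template that every processing stage writes to, so that per-stage timings and failure points can be measured afterwards. The sketch below is a loose illustration of that idea; the field names and stages are invented, and Daiwa’s actual framework is built on its own XML messaging standards rather than anything shown here.

```python
# Illustrative sketch: every pipeline stage writes an audit record using the
# same template, so metrics (where messages slow down or fail) fall out of the
# log afterwards. Stage and field names are hypothetical.
import time
import uuid

AUDIT_LOG = []

def audit(message_id, stage, status, detail=""):
    """Standard audit template used by every stage of the pipeline."""
    AUDIT_LOG.append({
        "message_id": message_id,
        "stage": stage,
        "status": status,        # e.g. "ok" or "failed"
        "detail": detail,
        "timestamp": time.time(),
    })

message_id = str(uuid.uuid4())
audit(message_id, "received", "ok")
audit(message_id, "consistency_check", "ok")
audit(message_id, "enrichment", "failed", detail="missing settlement currency")

# Metrics derived from the log: how many stages failed, and where?
failures = [rec for rec in AUDIT_LOG if rec["status"] == "failed"]
print(f"{len(failures)} failed stage(s); first failure at '{failures[0]['stage']}'")
```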

Institutions can respond more quickly to problems in the process in this way, contended Pampapathi. “We have also established a team dedicated to looking at these processes and fixing the problems via an iterative approach.”

As a result, there is no requirement for a large reconciliation system to be put in place and Daiwa has been able to refocus resources on resolving immediate issues, he told delegates. Typical failure rates for systems in the industry are around 6-15%, but Pampapathi said that Daiwa’s rates are under 6%, which he attributed to its data management approach. “This should improve further still, as we are still in the process of building our financial database, which should be completed in six months’ time,” he added.

The future will also see Daiwa tackling complex event processing and focusing on improving customer service by using data to tailor services more effectively, he concluded.

But it is not just internal performance that must be tracked, FIMA delegates were told; financial institutions also need to be proactive in the management of their vendor relationships, said Peter Largier, global head of reference data analysis and projects at Credit Suisse. “You need to professionally evaluate the service you are getting from your data vendors,” he warned.

Largier focused his session on how to get the most out of vendor relationships via the introduction of formal procedures and discussions. “There is a need for a formal RFP and you should ask for test data to check their product and market coverage,” he continued. “A range of considerations need to be taken into account when deciding on a data vendor, including areas such as the number of individual licences needed, whether outsourcing providers and third parties have access to the data, and contract expiry arrangements – will you still have access to the data you need once it has expired?”

However, the process does not stop there: once institutions have chosen a data provider, they must then carefully monitor the data produced. “Once you have signed up, you also need to regularly check up on service levels via meetings once a month,” Largier continued. He propounded the benefits of best practice in vendor communication via a single global contact and a “partnership” approach.

“There should be a consolidated list of issues that are outstanding with regards to data across the institution and these should be fed back to vendors via a formal procedure on a monthly basis,” he said. “Actions must be set to improve the quality of data and the service in minuted meetings and these documents must be distributed to senior management to keep them abreast of developments.”

Dependencies on individual vendors must be limited via standardisation of interfaces and making sure that proprietary formats are kept to a minimum, he warned. “This means that if you have to change vendors, it is a much less complex procedure.”

Institutions also need to invest in automated processes and good quality staff in order to adequately assess their risk exposure within the data management process, said BNY Mellon’s Matthew Cox, head of securities data management, EMEA. They need to understand the internal risks posed by bad quality data and examine what vendors can and can’t provide in terms of data accuracy, he explained to the FIMA 2008 delegation.

“Identifying the source of inaccurate data is key to mitigating risk in the current environment,” he explained. “BNY Mellon does this by dual sourcing data and producing golden copy.”
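Cox did not detail the mechanics, but dual sourcing typically means comparing the same record from two vendor feeds field by field, promoting agreed values into the golden copy and routing disagreements to an exceptions queue for a judgement call. The sketch below illustrates that general pattern only; the field names and matching rule are hypothetical, not BNY Mellon’s actual process.

```python
# Hedged sketch of dual sourcing: fields where the two vendors agree go into
# the golden copy; mismatches become exceptions for a data analyst to resolve.
# Field names and the equality-based matching rule are illustrative only.
def build_golden_copy(vendor_a, vendor_b, fields):
    golden, exceptions = {}, []
    for field in fields:
        a, b = vendor_a.get(field), vendor_b.get(field)
        if a == b and a is not None:
            golden[field] = a
        else:
            exceptions.append({"field": field, "vendor_a": a, "vendor_b": b})
    return golden, exceptions

vendor_a = {"isin": "XS0123456789", "coupon": 4.25, "maturity": "2015-06-30"}
vendor_b = {"isin": "XS0123456789", "coupon": 4.25, "maturity": "2015-07-01"}
golden, exceptions = build_golden_copy(vendor_a, vendor_b, ["isin", "coupon", "maturity"])
print(golden)       # agreed fields go straight to golden copy
print(exceptions)   # maturity mismatch needs a judgement call
```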

All models have their issues, but “common sense” and stepping back from the problem rather than merely reacting to issues as they arise is the best approach, he said. “We need to understand what the end client wants to use the data for and whether they pass that data on to another party,” Cox explained, reiterating the points made by many other speakers. “This is in order to more accurately measure client and financial risk, which must both be taken into account at the end of the day.”

When servicing multiple clients, this will have a significant impact on reputational risk, he continued. In the custody business, for example, client service is of critical importance and controls must be in-built to account for data discrepancies and increased risk exposure.

There is obviously more risk involved in manual processes, and automation and STP are therefore one way of dealing with these risks, said Cox. “There is also a need for good quality people to be involved in the data management process, but these people should not be put in a position where they often have to make judgement calls about data,” he explained.

He warned delegates that there is “no hiding place” from problems caused by data, as institutions cannot simply blame vendors: the institution must take responsibility for its services. To make sure the data is correct, tolerance levels must be set and data must be regularly checked, he explained. “Checks and controls must be built around the areas where the most problems occur and where the risks are greatest. A finger on the pulse rather than single snapshots allows institutions to react in a more timely manner.”
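Tolerance levels of the kind Cox mentions are usually expressed as a maximum acceptable change per field, applied to every incoming update rather than to a periodic snapshot. The following fragment is an invented illustration of that idea; the fields and thresholds are not taken from any firm’s actual controls.

```python
# Illustrative tolerance check: each incoming update is compared with the
# previous value and held for review if the move exceeds a per-field tolerance.
# Thresholds are invented for the example.
TOLERANCES = {"price": 0.05, "fx_rate": 0.02}   # max relative change allowed

def check_update(field, previous, incoming):
    if previous == 0:
        return True
    change = abs(incoming - previous) / abs(previous)
    return change <= TOLERANCES.get(field, 0.0)

assert check_update("price", 101.2, 101.9)        # within 5% tolerance
assert not check_update("price", 101.2, 120.0)    # breach: hold for review
```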

It is also unrealistic to expect 100% integrity of data from vendors, he contended, as inaccuracies can be down to issues with the underlying source data.

BNY Mellon uses a data vendor scorecard with red, amber and green scores to measure the metrics being met (or not) by its vendors. “The facts speak for themselves in this way and we have control over our vendor relationships – we can prove that they need to improve in certain areas with hard evidence,” Cox explained.
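A red/amber/green scorecard of the sort Cox describes can be as simple as banding each vendor metric against its target. The version below is a hypothetical reconstruction – the metrics, targets and banding thresholds are made up – but it shows how the “facts speak for themselves” once actuals are recorded against agreed service levels.

```python
# Hypothetical RAG scorecard: green if the target is met, amber if within 5%
# of it, red otherwise. Metrics and targets are invented for illustration.
def rag(actual, target, amber_margin=0.05):
    if actual >= target:
        return "green"
    if actual >= target * (1 - amber_margin):
        return "amber"
    return "red"

vendor_metrics = {
    "timeliness":   {"actual": 0.992, "target": 0.99},
    "completeness": {"actual": 0.930, "target": 0.99},
    "accuracy":     {"actual": 0.975, "target": 0.98},
}
scorecard = {name: rag(m["actual"], m["target"]) for name, m in vendor_metrics.items()}
print(scorecard)   # {'timeliness': 'green', 'completeness': 'red', 'accuracy': 'amber'}
```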

Reprising Largier’s earlier point, Cox also discussed the benefits of working in partnership with the vendor community and producing detailed service level agreements to adequately measure performance at a basic level.

