The leading knowledge platform for the financial technology industry

A-Team Insight Blogs

FIMA Event Delivers on Promise of Broad View of Reference Data

The Financial Information Management conference (FIMA 2003), held in London at the beginning of this month, promised much for executives anxious to learn about the intricacies of reference data and the benefits of its application. It didn’t disappoint.

Speakers from ABN Amro, Barclays Capital, Robeco Asset Management, Merrill Lynch, the U.K. Financial Services Authority (FSA), Axa Investments and Reuters featured in a particularly strong programme. Topics ranged from examples and case studies to the use of accurate reference data to enable straight-through-processing (STP) and reduce trade exceptions.

Creating a Workflow System

Among the client-side firms contributing to the event, Martin Kruit, director and head of European reference data at ABN Amro, presented on ‘Creating a Workflow System to Maintain the Integrity of Your Reference Data and Dramatically Reduce the Rate of Trade Failures.’ Kruit argued that a workflow system can help maintain reference data and distribute data around the enterprise.

He outlined the benefits of an effective workflow system, using a recent project at ABN Amro as an example. The bank’s incumbent reference data tool had been built with Lotus Notes and lacked three required capabilities: “workflow control”, “mandatory authorisation levels” for compliance and risk, and “automatic MIS data”.

Uniform identifier

To start the project, ABN Amro “established a uniform identifier for all legal entities throughout the bank (GID), which we are also implementing at sub-account level”, said Kruit.

In addition to meeting the bank’s three main requirements, the workflow project aimed to achieve a number of other goals. Those included assigning the GID at the beginning of the process and removing the data entry bottleneck from reference data entry, said Kruit.

“STP (automated dealflow through (to) back-office systems) was not a goal itself, but we expect it will benefit,” he added. “Also, reduced trade failure is an expected – but at this time not significantly proven – benefit. As long as data entry remains manual (as in our case) I do not expect much improvement in trade failure rates, other than that we know that the entry requests are much better verified and controlled.”
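The three capabilities Kruit cites can be pictured as a simple state machine: an entry request carries its GID from the start, goes live only once every mandatory authorisation level has signed off, and logs each action as it happens. The following is a minimal illustrative sketch, not ABN Amro’s actual system; all class and field names are assumptions.

```python
import datetime

class EntryRequest:
    """A reference data entry request with mandatory authorisation levels."""
    REQUIRED_APPROVALS = {"compliance", "risk"}  # mandatory authorisation levels

    def __init__(self, gid, payload):
        self.gid = gid            # uniform identifier (GID) assigned up front
        self.payload = payload
        self.approvals = set()
        self.state = "pending"
        self.audit = []           # automatic MIS data: every action is logged
        self._log("created")

    def _log(self, action):
        self.audit.append((datetime.datetime.now(), action, self.state))

    def approve(self, role):
        if role not in self.REQUIRED_APPROVALS:
            raise ValueError(f"unknown approval role: {role}")
        self.approvals.add(role)
        self._log(f"approved by {role}")
        if self.approvals == self.REQUIRED_APPROVALS:
            self.state = "active"   # workflow control: only fully
            self._log("activated")  # authorised entries go live

req = EntryRequest("GID-12345", {"name": "Example Counterparty BV"})
req.approve("compliance")
assert req.state == "pending"     # one sign-off is not enough
req.approve("risk")
assert req.state == "active"      # both mandatory levels approved
```

The point of the sketch is that verification and control are built into the request itself, which matches Kruit’s observation that entry requests become “much better verified and controlled” even while data entry stays manual.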

The first phase of the ABN Amro workflow project was expected to be completed this month.

Minimizing Impact

Michael Jameson, global BCP manager at AXA Investment Managers, presented a talk entitled: ‘Minimizing the Impact of Business Exceptions on the Performance of Reference Data and Continuity.’ Jameson specializes in business recovery and heads information resources management at AXA.

Jameson said that the development of reference data recovery plans was essential to maintaining consistent, clean and accessible reference data. He outlined processes and plans to minimize the adverse impact of business exceptions; identified critical vs. non-critical data for facilitating optimal business recovery; assessed the importance of front- to back-office systems (re)synchronization; analyzed the role of external providers in an effective business continuity model; and presented plans to ensure data integrity and data distribution in business recovery situations.

Integrating Data Hubs

Roger Tan, head of reference data management at Robeco Asset Management, presented a case study from Robeco: ‘Effectively Integrating Two Separate Reference Data Hubs to Ensure You Manage and Distribute Consistent Data: How to Reduce Trade Failures and Downstream Application Management Costs.’

Tan’s case study featured a project at Robeco’s European head office in Rotterdam to build a new architecture that centralized data management activities into a single team. The set-up used standardized methodologies and moved towards a simpler information architecture for data management, Tan said. This environment can “achieve the benefits of consistent data in a multi-data-systems company,” he said.

Since joining Robeco, Tan has implemented a model for retrieving, cleansing and distributing benchmark data from a 10,000-security, daily-updated database.

Maintaining Reference Data

Cecilia Holden, vice president, enterprise data standards at Merrill Lynch, talked about ‘Developing the Business and Technology Principles to Efficiently Maintain Accurate and Consistent Reference Data.’

The presentation illustrated the use of technology to ensure the best principles of ownership and maintenance of reference data. Holden discussed the need not only for a centralized data architecture, but also for a centralized data management team to take responsibility and control over reference data.

This group would be responsible for accurate, factual and timely creation of reference data, as well as for proactive maintenance. The group would be independent of all business units, accountable to all business groups equally, measured on accuracy rather than STP, and responsible for policing the consistency of data across applications, according to Holden.

Such centralized data management requires centralized technology, said Holden. However, “centralization does not mean one huge database,” she added.

Rather, it’s a process by which a single golden source is defined for each data item, and technology and operations work as partners, said Holden. Data ownership is centralized through DMG operations and data maintenance is centralized through DMG technology.

“Centralization can only work if standards are set,” she said. Consumers view and access data through a series of standard services, and may not know where the data is actually stored, added Holden.
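Holden’s golden-source model can be sketched as a registry that maps each data domain to exactly one authoritative store, with consumers going through a standard lookup service rather than hitting stores directly. This is an illustrative sketch under those assumptions, not Merrill Lynch’s implementation; all names are hypothetical.

```python
class GoldenSourceRegistry:
    """Maps each reference data domain to its single authoritative store."""
    def __init__(self):
        self._sources = {}

    def register(self, domain, store):
        # "A single golden source is defined for each data item":
        # a second registration for the same domain is an error.
        if domain in self._sources:
            raise ValueError(f"golden source for {domain!r} already defined")
        self._sources[domain] = store

    def lookup(self, domain, key):
        # Consumers call this standard service; where the data actually
        # lives is hidden behind the registry.
        return self._sources[domain].get(key)

# Example: two physical stores, one standard service in front of them.
registry = GoldenSourceRegistry()
registry.register("counterparty", {"GID-001": {"name": "ABN Amro"}})
registry.register("instrument", {"NL0000009165": {"issuer": "Heineken"}})

print(registry.lookup("counterparty", "GID-001")["name"])  # ABN Amro
```

This mirrors Holden’s point that centralization is not “one huge database”: the stores stay separate, but ownership and access are funnelled through one standard, policed interface.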

Data compliance

Kevin Bradshaw, global head of Enterprise Information Products at Reuters, gave a talk entitled: ‘Data Compliance – How to Survive.’ Firms that are in breach of their data supply contracts carry unnecessary operational risk, he said, offering advice to managers on the identification of data compliance pitfalls and strategies for effective management of the problem.

“The reality is that unless contracts are well managed and there is a vigorously maintained link between contracts and operations, breakdowns are likely to occur,” said Bradshaw.

According to Reuters, over 80% of customers were not operating within their contractual boundaries and over 85% were not aware that they had an exposure, said Bradshaw. Data compliance is also more of an issue in Europe than in the U.S., where tighter controls, less fragmentation and fewer boundaries make it easier to manage, he added.

Common areas of contractual breach include: usage of data beyond functional boundaries, usage beyond geographic boundaries and redistribution without appropriate rights.
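The three breach categories Bradshaw lists lend themselves to a mechanical check of recorded usage against contract terms. The sketch below is purely illustrative; the field names and contract structure are assumptions, not any vendor’s actual API.

```python
def find_breaches(contract, usage_events):
    """Return (event, reason) pairs for usage outside contractual boundaries."""
    breaches = []
    for event in usage_events:
        if event["function"] not in contract["permitted_functions"]:
            breaches.append((event, "usage beyond functional boundaries"))
        elif event["region"] not in contract["permitted_regions"]:
            breaches.append((event, "usage beyond geographic boundaries"))
        elif event["redistributed"] and not contract["redistribution_allowed"]:
            breaches.append((event, "redistribution without appropriate rights"))
    return breaches

# A contract permitting front-office use in the U.K. only, no redistribution.
contract = {
    "permitted_functions": {"front-office"},
    "permitted_regions": {"UK"},
    "redistribution_allowed": False,
}
events = [
    {"function": "front-office", "region": "UK", "redistributed": False},  # fine
    {"function": "risk", "region": "UK", "redistributed": False},          # breach
    {"function": "front-office", "region": "US", "redistributed": False},  # breach
]
for event, reason in find_breaches(contract, events):
    print(reason)
```

A check like this only works if there is a maintained link between the contract terms and operational usage records, which is exactly the breakdown Bradshaw warns about.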

Managing compliance

According to Bradshaw, the way to manage compliance issues is to check your contracts, audit your infrastructure and be proactive in resolving any issues with the data vendors.

Organized by WB Research, the FIMA 2003 event was sponsored by Financial Technologies International, Fame, Reuters, Cicada, Soliton, Iverson, Standard & Poor’s, Asset Control, Factset, and the London Stock Exchange. It was endorsed by RDUG, FISD, ISITC and IPUG.
