The leading knowledge platform for the financial technology industry

A-Team Insight Blogs

Regulatory Pressure is Compelling Investment in Data Quality, Says Bank of America Merrill Lynch’s Dalglish


Regulatory pressure is compelling firms to plough investment into their data infrastructures in order to ensure the consistency of the basic reference data underlying their businesses, according to Tom Dalglish, director and chief information architect at Bank of America Merrill Lynch. Developments such as the Office of Financial Research also represent the potential for collaboration between regulators and other parties from within the industry, he says.

“The pressing requirement on the regulatory front is the ability to provide consistent data across the firm, track data flows and usage, leverage consistent identifiers and provide an auditable chain of custody for data as it traverses the enterprise,” says Dalglish. “Firms need to focus on what they are doing across lines of business to guarantee that all users are looking at consistent data and at the same time reduce duplicate storage, improve the process of data entitlement and authentication and increase the auditability of data. There are anticipated regulatory requirements for providing evidence of a data governance policy and traceability of data.” Quite a list of requirements given the siloed nature of most firms’ data infrastructures; no wonder investment is being earmarked by so many of these financial institutions.

Regulation is proving to be both a blessing and a curse to the data management function, according to Dalglish, who last year pointed to the intense scrutiny of the function by regulators as a challenge as well as an opportunity during a panel discussion. The regulatory spotlight has the ability to highlight any underlying inaccuracies that could potentially cause reputational damage to those caught out, which all adds up to more pressure on the data management function to perform. With investment comes great responsibility, after all.

Dalglish’s own firm is investing in its data fabric to meet the requirements of its downstream users and various regulatory reports, and he also recommends opening up lines of communication with the regulatory community. He reckons that a healthy industry dialogue on these subjects will help matters: “Although we need to be able to second-guess the regulators we also need to be able to engage with them directly. It is important to maintain good relationships with both the regulators and data vendors.”

As for the most pressing data management items on the priority list, he indicates that governance remains a challenge within the industry, as reference data teams in many firms are still fighting silos and compensation models across lines of business. “Senior management of individual business lines may be less inclined to sign off spending on group-wide projects; they don’t necessarily want to pay for data quality for other lines of business,” he says. “Mainly, though, we have seen a broad convergence of purpose in the wake of the market turmoil of the past few years.”

Dalglish reckons the trick is to increase the efficiency and simplicity of reference data management and bring it under a good governance umbrella. “It has taken a financial crisis to underscore the importance of this endeavour, however,” he continues. “The cost benefits of consolidation of vendor data, for example, are more important in the current market than they used to be, as is improving data stewardship, which allows groups to be more business effective. Moreover, the capability to rapidly onboard new data sources centrally and distribute them ubiquitously is paramount for any enterprise capability.”

So, priorities are clearer as a result of regulatory pressure, but what of the standards discussions going on at the legislative table? The industry has until the end of January to respond to the Office of Financial Research’s legal entity proposals, which were published in the Federal Register at the end of November, for example.

Dalglish reckons the various ongoing symbology initiatives – including the various legal entity proposals by Swift, ISO and Avox, as well as Bloomberg’s Open Symbology discussions – are a step in the right direction, but he feels that the passing of the Dodd-Frank Act and the aggressive timelines specified therein for compliance is a more interesting development. “It is likely that we will have to adopt a hybrid (public and private sector) solution between the regulatory and vendor communities,” he contends.

“The plans for a regulatory-backed reference data utility are likely to be thwarted by commercial concerns, intellectual property issues and warring over standards,” elaborates Dalglish. “If, for example, the ISIN is selected as an issue level identifier this may pose a challenge for the industry, as it is not specific enough to track all the related financial instruments required by the business for electronic trading. On the other hand, a proprietary identifier might confer unfair advantage to a particular vendor.” As frequently noted by Reference Data Review, when it comes to standards, the choice is not always clear cut.

Dalglish reckons instrument data standardisation will likely be easier than the challenge of establishing standards for client and counterparty data, mainly because there are so many vendors from which to buy financial securities (issue) data. “For legal entities there needs to be a data hierarchy that is able to show all the parent/child linkages and internal groupings and which must have immutable identifiers that never get re-used. We can also foresee a need to have flexible hierarchical containers that can be extended by institutions to support internal identifiers as well,” he adds.
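The hierarchy Dalglish describes – immutable, never-reused identifiers, parent/child linkage, and flexible containers for institutions’ internal identifiers – can be sketched in a few lines of code. This is purely an illustrative model; the class and identifier names (`EntityId`, `LegalEntity`, `internal_ids`) are hypothetical and not taken from any firm’s or standard body’s actual schema.

```python
from dataclasses import dataclass
from typing import Optional

@dataclass(frozen=True)
class EntityId:
    """Immutable identifier; once issued, it is never re-used or reassigned."""
    value: str

class LegalEntity:
    """A node in a legal entity hierarchy with parent/child linkage and
    an extensible container for institution-specific internal identifiers."""

    def __init__(self, entity_id: EntityId, name: str,
                 parent: Optional["LegalEntity"] = None):
        self.entity_id = entity_id                 # immutable identity
        self.name = name
        self.parent = parent                       # link to parent entity, if any
        self.children: list["LegalEntity"] = []
        self.internal_ids: dict[str, str] = {}     # extensible per-institution IDs
        if parent is not None:
            parent.children.append(self)

    def ultimate_parent(self) -> "LegalEntity":
        """Walk the parent/child linkage to the top of the hierarchy."""
        node = self
        while node.parent is not None:
            node = node.parent
        return node

# Usage: a holding company with a trading subsidiary (names are invented).
group = LegalEntity(EntityId("LEI-GROUP-001"), "Example Holdings")
desk = LegalEntity(EntityId("LEI-SUB-002"), "Example Trading Ltd", parent=group)
desk.internal_ids["internal_book"] = "BK-1234"     # firm-specific extension
print(desk.ultimate_parent().name)                 # prints "Example Holdings"
```

The frozen dataclass captures the “immutable identifier” requirement, while the plain dictionary of internal identifiers stands in for the “flexible hierarchical containers” that institutions could extend.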

The first half of 2011 will likely see tremendous activity in the regulatory space as the implications of the various mandates become clearer, according to Dalglish. “There has been a confluence of pivotal events for us in that regulators are demanding transparency, vendor capabilities are increasing and advances in software technologies have presented firms with a wide variety of choice. We may not be entirely sure where we are going, but we are on our way,” he concludes.
