Regulatory Pressure is Compelling Investment in Data Quality, Says Bank of America Merrill Lynch’s Dalglish

Regulatory pressure is compelling firms to plough investment into their data infrastructures to ensure the consistency of the basic reference data underlying their businesses, according to Tom Dalglish, director and chief information architect at Bank of America Merrill Lynch. Developments such as the Office of Financial Research also offer the potential for collaboration between regulators and other parties within the industry, he says.

“The pressing requirement on the regulatory front is the ability to provide consistent data across the firm, track data flows and usage, leverage consistent identifiers and provide an auditable chain of custody for data as it traverses the enterprise,” says Dalglish. “Firms need to focus on what they are doing across lines of business to guarantee that all users are looking at consistent data and, at the same time, reduce duplicate storage, improve the process of data entitlement and authentication, and increase the auditability of data. There are anticipated regulatory requirements for providing evidence of a data governance policy and traceability of data.” That is quite a list of requirements given the siloed nature of most firms’ data infrastructures; no wonder so many financial institutions are earmarking investment.
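
To make the chain-of-custody requirement concrete, the sketch below shows one minimal way such an audit trail could be modelled. It is purely illustrative: the names (LineageEvent, AuditTrail) and fields are assumptions made for the example, not any firm’s actual schema.

```python
from dataclasses import dataclass, field
from datetime import datetime, timezone
import uuid

@dataclass(frozen=True)
class LineageEvent:
    """One hop in a record's chain of custody as it crosses systems."""
    record_id: str      # consistent identifier for the data item
    source_system: str  # where the record came from
    target_system: str  # where it was delivered
    actor: str          # user or service entitled to move the data
    event_id: str = field(default_factory=lambda: str(uuid.uuid4()))
    timestamp: datetime = field(default_factory=lambda: datetime.now(timezone.utc))

class AuditTrail:
    """Append-only log from which a record's path can be reconstructed."""

    def __init__(self) -> None:
        self._events: list[LineageEvent] = []

    def record(self, event: LineageEvent) -> None:
        self._events.append(event)

    def chain_of_custody(self, record_id: str) -> list[LineageEvent]:
        """Every hop for one record, oldest first, ready for audit review."""
        return sorted(
            (e for e in self._events if e.record_id == record_id),
            key=lambda e: e.timestamp,
        )

# Hypothetical example: one record moving from golden copy to a risk system.
trail = AuditTrail()
trail.record(LineageEvent("ISIN:XX0000000001", "golden_copy", "risk_engine", "svc-distribution"))
print(trail.chain_of_custody("ISIN:XX0000000001"))
```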

Regulation is proving both a blessing and a curse to the data management function, according to Dalglish, who last year pointed to regulators’ intense scrutiny of the function as both a challenge and an opportunity during a panel discussion. The regulatory spotlight can expose underlying inaccuracies that could cause reputational damage to those caught out, all of which adds up to more pressure on the data management function to perform. With investment comes great responsibility, after all.

Dalglish’s own firm is investing in its data fabric to meet the requirements of its downstream users and various regulatory reports, and he also recommends opening up lines of communication with the regulatory community. He reckons a healthy industry dialogue on these subjects will help matters: “Although we need to be able to second-guess the regulators, we also need to be able to engage with them directly. It is important to maintain good relationships with both the regulators and data vendors.”

As for the most pressing data management items on the priority list, he indicates that governance remains a challenge within the industry, as reference data teams in many firms are still fighting silos and compensation models across lines of business. “Senior management of individual business lines may be less inclined to sign off spending on group-wide projects; they don’t necessarily want to pay for data quality for other lines of business,” he says. “Mainly, though, we have seen a broad convergence of purpose in the wake of the market turmoil of the past few years.”

Dalglish reckons the trick is to increase the efficiency and simplicity of reference data management and bring it under a good governance umbrella. “It has taken a financial crisis to underscore the importance of this endeavour, however,” he continues. “The cost benefits of consolidating vendor data, for example, are more important in the current market than they used to be, as is improving data stewardship, which allows groups to be more business effective. Moreover, the capability to rapidly onboard new data sources centrally and distribute them ubiquitously is paramount for any enterprise capability.”

So, priorities are clearer as a result of regulatory pressure, but what of the standards discussions going on at the legislative table? The industry has until the end of January, for example, to respond to the Office of Financial Research’s legal entity proposals, which were published in the Federal Register at the end of November.

Dalglish reckons the various ongoing symbology initiatives – including the legal entity proposals by Swift, ISO and Avox, as well as Bloomberg’s Open Symbology discussions – are a step in the right direction, but he feels the passing of the Dodd-Frank Act, and the aggressive compliance timelines specified therein, is a more interesting development. “It is likely that we will have to adopt a hybrid (public and private sector) solution between the regulatory and vendor communities,” he contends.

“The plans for a regulator-backed reference data utility are likely to be thwarted by commercial concerns, intellectual property issues and warring over standards,” elaborates Dalglish. “If, for example, the ISIN is selected as an issue-level identifier, this may pose a challenge for the industry as it is not specific enough to track all the related financial instruments required by the business for electronic trading. On the other hand, a proprietary identifier might confer an unfair advantage on a particular vendor.” As frequently noted by Reference Data Review, when it comes to standards, the choice is not always clear cut.
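
To illustrate the granularity point: an ISIN identifies an issue, but the same issue can trade as several distinct instruments on different venues. The toy sketch below pairs the ISIN with a market identifier code to reach instrument level; the identifier values are invented, and the pairing is just one possible convention, not a statement of how any firm or standard resolves this.

```python
from typing import NamedTuple

class Listing(NamedTuple):
    isin: str      # issue-level identifier (ISO 6166)
    mic: str       # market identifier code (ISO 10383), identifies the venue
    currency: str

# One issue, three tradable lines; the ISIN alone cannot tell them apart.
# Identifier values below are made up for the example.
listings = [
    Listing("XS0000000001", "XLON", "GBP"),
    Listing("XS0000000001", "XETR", "EUR"),
    Listing("XS0000000001", "XSWX", "CHF"),
]

def instrument_key(listing: Listing) -> tuple[str, str]:
    """Instrument-level key: issue identifier plus trading venue."""
    return (listing.isin, listing.mic)

assert len({l.isin for l in listings}) == 1              # one issue...
assert len({instrument_key(l) for l in listings}) == 3   # ...three instruments
```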

Dalglish reckons instrument data standardisation is likely to prove easier than establishing standards for client and counterparty data, mainly because there are so many vendors from which to buy financial securities (issue) data. “For legal entities there needs to be a data hierarchy that shows all the parent/child linkages and internal groupings, with immutable identifiers that never get re-used. We can also foresee a need for flexible hierarchical containers that can be extended by institutions to support internal identifiers as well,” he adds.
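
A minimal sketch of the hierarchy Dalglish describes might look like the following. The class names and the retired-identifier guard are assumptions made for illustration, not a reference implementation.

```python
from dataclasses import dataclass, field
from typing import Optional

@dataclass
class LegalEntity:
    """Node in a legal entity hierarchy keyed by an immutable identifier."""
    entity_id: str                   # immutable; never re-used, even after wind-down
    name: str
    parent_id: Optional[str] = None  # parent/child linkage
    # Flexible container: institutions can attach their own internal
    # identifiers (risk systems, books, CRM keys) alongside the public one.
    internal_ids: dict[str, str] = field(default_factory=dict)

class EntityHierarchy:
    def __init__(self) -> None:
        self._entities: dict[str, LegalEntity] = {}
        self._retired_ids: set[str] = set()  # keeps old identifiers out of circulation

    def add(self, entity: LegalEntity) -> None:
        if entity.entity_id in self._entities or entity.entity_id in self._retired_ids:
            raise ValueError(f"identifier {entity.entity_id} has already been used")
        self._entities[entity.entity_id] = entity

    def retire(self, entity_id: str) -> None:
        """Remove an entity while permanently reserving its identifier."""
        del self._entities[entity_id]
        self._retired_ids.add(entity_id)

    def ancestors(self, entity_id: str) -> list[LegalEntity]:
        """Walk parent linkages from an entity up to its ultimate parent."""
        chain = []
        parent_id = self._entities[entity_id].parent_id
        while parent_id is not None:
            parent = self._entities[parent_id]
            chain.append(parent)
            parent_id = parent.parent_id
        return chain
```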

The first half of 2011 will likely see tremendous activity in the regulatory space as the implications of the various mandates become clearer, according to Dalglish. “There has been a confluence of pivotal events for us in that regulators are demanding transparency, vendor capabilities are increasing and advances in software technology have presented firms with a wide variety of choices. We may not be entirely sure where we are going, but we are on our way,” he concludes.
