The knowledge platform for the financial technology industry

A-Team Insight Blogs

Regulatory Pressure is Compelling Investment in Data Quality, Says Bank of America Merrill Lynch’s Dalglish


Regulatory pressure is compelling firms to plough investment into their data infrastructures in order to ensure the consistency of the basic reference data underlying their businesses, according to Tom Dalglish, director and chief information architect at Bank of America Merrill Lynch. Developments such as the establishment of the Office of Financial Research also offer the potential for collaboration between regulators and other parties within the industry, he says.

“The pressing requirement on the regulatory front is the ability to provide consistent data across the firm, track data flows and usage, leverage consistent identifiers and provide an auditable chain of custody for data as it traverses the enterprise,” says Dalglish. “Firms need to focus on what they are doing across lines of business to guarantee that all users are looking at consistent data and at the same time reduce duplicate storage, improve the process of data entitlement and authentication and increase the auditability of data. There are anticipated regulatory requirements for providing evidence of a data governance policy and traceability of data.” Quite a list of requirements given the siloed nature of most firms’ data infrastructures; no wonder investment is being earmarked by so many of these financial institutions.
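As a hypothetical sketch of the kind of auditable chain of custody Dalglish describes (the system names, identifier and field names below are illustrative assumptions, not anything the article specifies), each hop a data item takes through the enterprise could be recorded against a single consistent identifier:

```python
from dataclasses import dataclass, field
from datetime import datetime, timezone
from typing import List

@dataclass
class LineageEvent:
    """One hop in a data item's chain of custody."""
    system: str       # system that touched the data (illustrative names)
    action: str       # e.g. "sourced", "validated", "distributed"
    timestamp: str    # when the hop occurred, in UTC

@dataclass
class DataItem:
    identifier: str   # one consistent firm-wide identifier for the item
    chain: List[LineageEvent] = field(default_factory=list)

    def record(self, system: str, action: str) -> None:
        """Append an auditable entry each time the data is touched."""
        self.chain.append(LineageEvent(
            system, action, datetime.now(timezone.utc).isoformat()))

    def audit_trail(self) -> List[str]:
        """Render the chain of custody for an auditor or regulator."""
        return [f"{e.timestamp} {e.system}: {e.action}" for e in self.chain]

# Hypothetical flow of one instrument record through the enterprise
bond = DataItem("US0378331005")
bond.record("vendor-feed", "sourced")
bond.record("golden-copy", "validated")
bond.record("risk-engine", "distributed")
```

The point of the sketch is only that traceability falls out almost for free once every consumer keys off the same identifier and every touch point appends to the same chain, rather than each silo keeping its own copy.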

Regulation is proving to be both a blessing and a curse to the data management function, according to Dalglish, who, during a panel discussion last year, pointed to the intense regulatory scrutiny of the function as both a challenge and an opportunity. The regulatory spotlight has the ability to highlight any underlying inaccuracies that could potentially cause reputational damage to those caught out, which all adds up to more pressure on the data management function to perform. With investment comes great responsibility, after all.

Dalglish’s own firm is investing in its data fabric to meet the requirements of its downstream users and various regulatory reports, and he also recommends opening up lines of communication with the regulatory community. He reckons that a healthy industry dialogue on these subjects will help matters: “Although we need to be able to second-guess the regulators, we also need to be able to engage with them directly. It is important to maintain good relationships with both the regulators and data vendors.”

As for the most pressing data management items on the priority list, he indicates that governance remains a challenge within the industry, as reference data teams in many firms are still fighting silos and compensation models across lines of business. “Senior management of individual business lines may be less inclined to sign off spending on group-wide projects; they don’t necessarily want to pay for data quality for other lines of business,” he says. “Mainly, though, we have seen a broad convergence of purpose in the wake of the market turmoil of the past few years.”

Dalglish reckons the trick is to increase the efficiency and simplicity of reference data management and bring it under a good governance umbrella. “It has taken a financial crisis to underscore the importance of this endeavour, however,” he continues. “The cost benefits of consolidation of vendor data, for example, are more important in the current market than they used to be, as is improving data stewardship, which allows groups to be more business effective. Moreover, the rapid capability of onboarding new data sources centrally and distributing them ubiquitously is paramount for any enterprise capability.”

So, priorities are clearer as a result of regulatory pressure, but what of the standards discussions going on at the legislative table? The industry has until the end of January to respond to the Office of Financial Research’s legal entity proposals, which were published in the Federal Register at the end of November, for example.

Dalglish reckons the various ongoing symbology initiatives – including the various legal entity proposals by Swift, ISO and Avox, as well as Bloomberg’s Open Symbology discussions – are a step in the right direction, but he feels that the passing of the Dodd-Frank Act, with the aggressive compliance timelines specified therein, is a more interesting development. “It is likely that we will have to adopt a hybrid (public and private sector) solution between the regulatory and vendor communities,” he contends.

“The plans for a regulatory-backed reference data utility are likely to be thwarted by commercial concerns, intellectual property issues and warring over standards,” elaborates Dalglish. “If, for example, the ISIN is selected as an issue level identifier, this may pose a challenge for the industry as it is not specific enough to track all the related financial instruments required by the business for electronic trading. On the other hand, a proprietary identifier might confer unfair advantage to a particular vendor.” As frequently noted by Reference Data Review, when it comes to standards, the choice is not always clear cut.
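The granularity point can be made concrete with a minimal sketch (the ISIN, venue codes and attributes below are illustrative assumptions): one issue-level ISIN can correspond to several tradable listings, so an electronic trading system typically needs a composite, listing-level key such as ISIN plus market identifier code.

```python
# One ISIN identifies an issue, but the same issue can trade on several
# venues with different currencies and tick sizes; the ISIN alone cannot
# distinguish these rows.
isin = "GB0002374006"  # illustrative issue-level identifier

# Listing-level records keyed by (ISIN, venue code) -- a composite key
# provides the specificity that the issue-level identifier lacks.
listings = {
    (isin, "XLON"): {"currency": "GBP", "tick_size": 0.5},
    (isin, "XETR"): {"currency": "EUR", "tick_size": 0.005},
}

# Issue-level lookup still works by scanning the composite keys
issue_listings = [venue for (i, venue) in listings if i == isin]
```

This is exactly the trade-off Dalglish describes: an open issue-level standard is not specific enough on its own, while a fully specific proprietary key ties the industry to one vendor.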

Dalglish reckons instrument data standardisation will likely be easier than the challenge of establishing standards for client and counterparty data, mainly because there are so many vendors from which to buy financial securities (issue) data. “For legal entities there needs to be a data hierarchy that is able to show all the parent/child linkages and internal groupings and which must have immutable identifiers that never get re-used. We can also foresee a need to have flexible hierarchical containers that can be extended by institutions to support internal identifiers as well,” he adds.
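A minimal sketch of the kind of entity hierarchy Dalglish outlines (class, field and identifier names are assumptions for illustration, not anything he specifies): immutable identifiers that are never re-used, parent/child linkages, and an extensible container for an institution's own internal identifiers.

```python
from dataclasses import dataclass, field
from typing import Dict, List, Optional

@dataclass
class LegalEntity:
    entity_id: str                  # immutable, never re-used
    name: str
    parent_id: Optional[str] = None # parent/child linkage
    # extensible container for an institution's internal identifiers
    internal_ids: Dict[str, str] = field(default_factory=dict)

class EntityHierarchy:
    def __init__(self) -> None:
        self._entities: Dict[str, LegalEntity] = {}
        self._retired: set = set()  # identifiers may be retired, never re-used

    def add(self, entity: LegalEntity) -> None:
        """Register an entity, enforcing identifier immutability."""
        if entity.entity_id in self._entities or entity.entity_id in self._retired:
            raise ValueError("entity identifiers are immutable and never re-used")
        self._entities[entity.entity_id] = entity

    def children(self, entity_id: str) -> List[LegalEntity]:
        """Direct subsidiaries of the given entity."""
        return [e for e in self._entities.values() if e.parent_id == entity_id]

    def ultimate_parent(self, entity_id: str) -> LegalEntity:
        """Walk the parent linkages to the top of the group."""
        entity = self._entities[entity_id]
        while entity.parent_id is not None:
            entity = self._entities[entity.parent_id]
        return entity

# Hypothetical group structure
h = EntityHierarchy()
h.add(LegalEntity("LE-001", "Example Group plc"))
h.add(LegalEntity("LE-002", "Example Securities Ltd", parent_id="LE-001",
                  internal_ids={"crm": "C-4471"}))
```

The `internal_ids` map is the "flexible hierarchical container" of the quote: each institution can hang its own identifiers off the shared, immutable entity key without touching the common hierarchy.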

The first half of 2011 will likely see tremendous activity in the regulatory space as the implications of the various mandates become clearer, according to Dalglish. “There has been a confluence of pivotal events for us in that regulators are demanding transparency, vendor capabilities are increasing and advances in software technologies have presented firms with a wide variety of choice. We may not be entirely sure where we are going, but we are on our way,” he concludes.

