A-Team Insight Blogs

Regulatory Pressure is Compelling Investment in Data Quality, Says Bank of America Merrill Lynch’s Dalglish

Regulatory pressure is compelling firms to plough investment into their data infrastructures in order to ensure the consistency of the basic reference data underlying their businesses, according to Tom Dalglish, director and chief information architect at Bank of America Merrill Lynch. Developments such as the Office of Financial Research also represent the potential for collaboration between regulators and other parties from within the industry, he says.

“The pressing requirement on the regulatory front is the ability to provide consistent data across the firm, track data flows and usage, leverage consistent identifiers and provide an auditable chain of custody for data as it traverses the enterprise,” says Dalglish. “Firms need to focus on what they are doing across lines of business to guarantee that all users are looking at consistent data and at the same time reduce duplicate storage, improve the process of data entitlement and authentication and increase the auditability of data. There are anticipated regulatory requirements for providing evidence of a data governance policy and traceability of data.” Quite a list of requirements given the siloed nature of most firms’ data infrastructures; no wonder investment is being earmarked by so many of these financial institutions.
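The requirements Dalglish lists — consistent identifiers, tracked data flows and an auditable chain of custody — can be made concrete with a small sketch. The following is purely illustrative and not any firm's actual implementation: it chains custody events for a reference data record by hashing each hop, so that any tampering with the trail is detectable downstream. All names (`CustodyEvent`, `append_event`, system labels) are hypothetical.

```python
import hashlib
from dataclasses import dataclass
from datetime import datetime, timezone

@dataclass(frozen=True)
class CustodyEvent:
    """One hop in a reference data record's journey across the enterprise."""
    record_id: str   # consistent, firm-wide identifier for the data record
    system: str      # system that touched the record
    action: str      # e.g. "sourced", "enriched", "distributed"
    timestamp: str
    prev_hash: str   # digest of the previous event, chaining the audit trail

    def digest(self) -> str:
        payload = f"{self.record_id}|{self.system}|{self.action}|{self.timestamp}|{self.prev_hash}"
        return hashlib.sha256(payload.encode()).hexdigest()

def append_event(chain: list, record_id: str, system: str, action: str) -> list:
    """Record the next hop, linking it to the digest of the last event."""
    prev = chain[-1].digest() if chain else "genesis"
    chain.append(CustodyEvent(record_id, system, action,
                              datetime.now(timezone.utc).isoformat(), prev))
    return chain

def verify_chain(chain: list) -> bool:
    """Each event must reference the digest of its predecessor."""
    return all(curr.prev_hash == prev.digest()
               for prev, curr in zip(chain, chain[1:]))
```

In this sketch, proving an "auditable chain of custody" reduces to replaying the hash links; a break anywhere in the sequence flags the record as inconsistent.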

Regulation is proving to be both a blessing and a curse to the data management function, according to Dalglish, who last year pointed to the intense scrutiny of the function by regulators as a challenge as well as an opportunity during a panel discussion. The regulatory spotlight has the ability to highlight any underlying inaccuracies that could potentially cause reputational damage to those caught out, which all adds up to more pressure on the data management function to perform. With investment comes great responsibility, after all.

Dalglish’s own firm is investing in its data fabric to be able to meet the requirements of its downstream users and various regulatory reports, and he also recommends opening up lines of communication with the regulatory community. He reckons that a healthy industry dialogue on these subjects will help matters: “Although we need to be able to second-guess the regulators, we also need to be able to engage with them directly. It is important to maintain good relationships with both the regulators and data vendors.”

As for the most pressing data management items on the priority list, he indicates that governance remains a challenge within the industry, as reference data teams in many firms are still fighting silos and compensation models across lines of business. “Senior management of individual business lines may be less inclined to sign off spending on group-wide projects; they don’t necessarily want to pay for data quality for other lines of business,” he says. “Mainly, though, we have seen a broad convergence of purpose in the wake of the market turmoil of the past few years.”

Dalglish reckons the trick is to increase the efficiency and simplicity of reference data management and bring it under a good governance umbrella. “It has taken a financial crisis to underscore the importance of this endeavour, however,” he continues. “The cost benefits of consolidation of vendor data, for example, are more important in the current market than they used to be, as is improving data stewardship, which allows groups to be more business effective. Moreover, the ability to onboard new data sources centrally and distribute them ubiquitously, and to do so rapidly, is paramount for any enterprise capability.”

So, priorities are clearer as a result of regulatory pressure, but what of the standards discussions going on at the legislative table? The industry has until the end of January to respond to the Office of Financial Research’s legal entity proposals, which were published in the Federal Register at the end of November, for example.

Dalglish reckons the various ongoing symbology initiatives – including the various legal entity proposals by Swift, ISO and Avox, as well as Bloomberg’s Open Symbology discussions – are a step in the right direction, but he feels that the passing of the Dodd-Frank Act and the aggressive timelines specified therein for compliance is a more interesting development. “It is likely that we will have to adopt a hybrid (public and private sector) solution between the regulatory and vendor communities,” he contends.

“The plans for a regulatory backed reference data utility are likely to be thwarted by commercial concerns, intellectual property issues and warring over standards,” elaborates Dalglish. “If, for example, the ISIN is selected as an issue level identifier, this may pose a challenge for the industry as it is not specific enough to track all the related financial instruments required by the business for electronic trading. On the other hand, a proprietary identifier might confer unfair advantage to a particular vendor.” As frequently noted by Reference Data Review, when it comes to standards, the choice is not always clear cut.

Dalglish reckons instrument data standardisation will likely be easier than the challenge of establishing standards for client and counterparty data, mainly because there are so many vendors from which to buy financial securities (issue) data. “For legal entities there needs to be a data hierarchy that is able to show all the parent/child linkages and internal groupings and which must have immutable identifiers that never get re-used. We can also foresee a need to have flexible hierarchical containers that can be extended by institutions to support internal identifiers as well,” he adds.
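The hierarchy Dalglish describes — parent/child linkages, immutable identifiers that are never re-used, and extensible containers for internal identifiers — can be sketched as a simple data structure. This is an illustrative model only, with hypothetical names; it is not drawn from any firm's or standard body's actual design.

```python
from dataclasses import dataclass, field
from typing import Optional

@dataclass
class LegalEntity:
    """A node in a legal entity hierarchy."""
    entity_id: str                  # immutable identifier, never re-used
    name: str
    parent_id: Optional[str] = None
    # Extensible container so institutions can attach internal identifiers
    internal_ids: dict = field(default_factory=dict)

class EntityHierarchy:
    def __init__(self):
        self._entities = {}
        self._retired = set()  # identifiers that may never be re-used

    def add(self, entity: LegalEntity) -> None:
        if entity.entity_id in self._entities or entity.entity_id in self._retired:
            raise ValueError(f"identifier {entity.entity_id} already used")
        if entity.parent_id is not None and entity.parent_id not in self._entities:
            raise ValueError(f"unknown parent {entity.parent_id}")
        self._entities[entity.entity_id] = entity

    def retire(self, entity_id: str) -> None:
        """Remove an entity but permanently reserve its identifier."""
        self._entities.pop(entity_id)
        self._retired.add(entity_id)

    def ancestors(self, entity_id: str) -> list:
        """Walk the parent/child linkage up to the ultimate parent."""
        chain, current = [], self._entities[entity_id]
        while current.parent_id is not None:
            current = self._entities[current.parent_id]
            chain.append(current.entity_id)
        return chain
```

The retired-identifier set is the key design choice here: it enforces the "never re-used" property Dalglish calls for, so downstream systems can safely cache historical linkages.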

The first half of 2011 will likely see tremendous activity in the regulatory space as the implications of the various mandates become clearer, according to Dalglish. “There has been a confluence of pivotal events for us in that regulators are demanding transparency, vendor capabilities are increasing and advances in software technologies have presented firms with a wide variety of choice. We may not be entirely sure where we are going, but we are on our way,” he concludes.
