The leading knowledge platform for the financial technology industry

A-Team Insight Blogs

Bank of America Merrill Lynch Focused on Building a Data Fabric to Support “Multiple Truths”, Says Dalglish

Bank of America Merrill Lynch has moved with the times in recalibrating the goal of its reference data project: the focus is no longer merely on establishing a single gold copy, but rather on building a robust data fabric that supports all of its downstream users’ requirements, according to Tom Dalglish, director and chief information architect at the firm. The project has also achieved the three Ms of a successful reference data project: management, money and mandate, he explains to Reference Data Review.

“In some sense the single golden copy era has passed and now there is more of a focus on building a data fabric that is able to cope with the business requirement for global distribution and multiple different output formats for downstream systems,” explains Dalglish. “Individual business lines need different data sets and firms must be able to deal with these client-driven multiple truths and manage the raw data accordingly.”

Reference Data Review readers should be no strangers to this development, given the comments made by Nick Murphy, data specialist for pricing and evaluations at ING Investment Management, at an industry conference last year on the death of “golden copy” as a term within his own institution. The increased focus on meeting downstream users’ requirements, whether internal or external, is paramount in the reference data industry at the moment and this has affected the approach to data management projects significantly. 

However, this development does not mean the death of golden copy as a concept (even if the term has fallen out of favour in some circles). Dalglish explains: “Golden copy engines surely still have their uses but individual data models may be somewhat less important: firms need a semantics-based repository and a discoverable metadata repository that allows users to navigate their reference data effectively.”

This focus still requires strong extract, transform and load (ETL) tools, a powerful database and rules engine, and some notion of enterprise data management (EDM), he explains, although the real key to success is still the adoption of enterprise data by internal clients. “It is not sufficient to have a good data model if nobody wants to use it. The need for all reference data to be ubiquitous simply translates into providing easier access to entitled data,” he elaborates.
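The “multiple truths” idea Dalglish describes can be illustrated with a minimal sketch: a single canonical security record is transformed on demand into different downstream views, with an entitlement check before distribution. All names and structures here are hypothetical illustrations, not a description of Bank of America Merrill Lynch’s actual systems.

```python
from dataclasses import dataclass

@dataclass
class SecurityRecord:
    """A canonical ('raw') reference data record held in the fabric."""
    isin: str
    issuer: str
    coupon_pct: float

# Per-consumer transforms: each business line derives its own "truth"
# from the same underlying record, in the output format it requires.
TRANSFORMS = {
    "risk": lambda r: {"id": r.isin, "issuer": r.issuer},
    "pricing": lambda r: {"isin": r.isin, "coupon_pct": r.coupon_pct},
}

# Entitlements: which user groups may receive which downstream feed.
ENTITLEMENTS = {"risk": {"risk_team"}, "pricing": {"pricing_team"}}

def distribute(record: SecurityRecord, consumer: str, user_group: str) -> dict:
    """Return the consumer-specific view, enforcing entitlements first."""
    if user_group not in ENTITLEMENTS.get(consumer, set()):
        raise PermissionError(f"{user_group} not entitled to {consumer} feed")
    return TRANSFORMS[consumer](record)

rec = SecurityRecord(isin="US0000000001", issuer="Acme Corp", coupon_pct=5.0)
print(distribute(rec, "pricing", "pricing_team"))
```

The design point is that the canonical record is never duplicated per business line; each downstream “truth” is a derived view, so the raw data remains managed in one place while entitled users get the format they need.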

To this end, Bank of America Merrill Lynch’s data management focused team has received the funding and support it needs in order to be able to start down this road in earnest. “There is a real focus from senior management on the importance of getting the data architecture right for the business. Funding has been strong and it is widely recognised that reference and market data management needs to be a true horizontal across all business lines, much like a utility that is run within the firm,” says Dalglish. And he reckons his own institution is not alone in this: “Finally, many firms have been handed the mandate to fix reference data in advance of anticipated regulations.”

Dalglish notes that spending on data management was pulled back after the 2008 crisis, but that 2010 brought a strong focus on the space because of the requirement to keep closer track of counterparty risk exposure and meet new regulatory requirements. The ongoing ripples from the fall of Lehman also likely played their part in raising the profile of counterparty risk, along with regulatory developments around Basel III and the like.

Dalglish reckons that the prevalent attitude is to treat data as a first-class construct and address problems with legacy systems. “There are also a lot of new vendors getting on the data bandwagon and increasing the range of options out there; 2011 is likely to see this trend continue,” he adds.

On the flip side, the maturing of the EDM practice has meant that a number of the traditional EDM solution vendors are struggling for position, especially in the face of new competitors in the market. “It is likely that we will see further consolidation in this space,” contends Dalglish. “On the plus side, we have a much wider choice of vendors than ever before, though it has become difficult to sort through so many products all claiming to be in the EDM space. There are some fascinating new vendors in the market, including some which are providing corporate hierarchies along with strong data visualisation tools.”

It will certainly be interesting to see who emerges as the winners in the race to gain greater market share in the data management sector, as vendors up their game in light of the increasing number of regulatory-driven projects getting the green light this year. Buzzwords such as data caching, near-time and the ability to be standards agnostic have been prevalent over recent months and are likely to gain more interest over 2011.
