The leading knowledge platform for the financial technology industry

A-Team Insight Blogs

Bank of America Merrill Lynch Focused on Building a Data Fabric to Support “Multiple Truths”, Says Dalglish

Bank of America Merrill Lynch has moved with the times in recalibrating the goal of its reference data project: the focus is no longer merely on establishing a single gold copy, but on building a robust data fabric to support all of its downstream users’ requirements, according to Tom Dalglish, director and chief information architect at the firm. The project has also achieved the three Ms of a successful reference data project: management, money and mandate, he explains to Reference Data Review.

“In some sense the single golden copy era has passed and now there is more of a focus on building a data fabric that is able to cope with the business requirement for global distribution and multiple different output formats for downstream systems,” explains Dalglish. “Individual business lines need different data sets and firms must be able to deal with these client-driven multiple truths and manage the raw data accordingly.”
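The “client-driven multiple truths” idea Dalglish describes can be sketched in a few lines: a single raw reference data record is kept once, while each business line registers its own transformation into the output shape it needs. Everything below (field names, consumers, transformation rules) is a hypothetical illustration, not Bank of America Merrill Lynch’s actual design.

```python
# One raw record, many downstream "truths": each consumer gets the
# shape it asked for, derived from the same underlying data.

RAW_RECORD = {
    "isin": "US0378331005",
    "name": "APPLE INC",
    "currency": "USD",
    "country": "US",
    "sector": "Technology",
}

# Each business line registers its own view of the raw record.
VIEWS = {
    "equities_desk": lambda r: {"id": r["isin"], "name": r["name"].title()},
    "risk": lambda r: {"isin": r["isin"], "country": r["country"], "sector": r["sector"]},
    "client_reporting": lambda r: {"security": r["name"].title(), "ccy": r["currency"]},
}

def distribute(record, views):
    """Apply every registered view to the raw record."""
    return {consumer: view(record) for consumer, view in views.items()}

outputs = distribute(RAW_RECORD, VIEWS)
for consumer, payload in outputs.items():
    print(consumer, payload)
```

The point of the pattern is that the raw data is managed once, centrally, while the per-consumer formatting lives at the edge, so adding a new downstream requirement means adding a view, not another copy of the data.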

Reference Data Review readers should be no strangers to this development, given the comments made by Nick Murphy, data specialist for pricing and evaluations at ING Investment Management, at an industry conference last year on the death of “golden copy” as a term within his own institution. The increased focus on meeting downstream users’ requirements, whether internal or external, is paramount in the reference data industry at the moment and this has affected the approach to data management projects significantly. 

However, this development does not mean the death of golden copy as a concept (even if the term has fallen out of favour in some circles). Dalglish explains: “Golden copy engines surely still have their uses but individual data models may be somewhat less important: firms need a semantics-based repository and a discoverable metadata repository that allows users to navigate their reference data effectively.”
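One way to read the “discoverable metadata repository” requirement is as a catalogue keyed by semantic tags rather than by physical schema, so users can find reference data sets without knowing each underlying data model. The sketch below is a hypothetical illustration; the dataset names, owners and tags are invented.

```python
# A toy metadata catalogue: datasets are registered with semantic tags,
# and users discover them by tag instead of by physical table name.

from dataclasses import dataclass, field

@dataclass
class DatasetMetadata:
    name: str
    owner: str
    tags: set = field(default_factory=set)

class MetadataRepository:
    def __init__(self):
        self._catalogue = {}

    def register(self, meta: DatasetMetadata):
        self._catalogue[meta.name] = meta

    def discover(self, tag: str):
        """Return the names of all datasets carrying a semantic tag."""
        return sorted(m.name for m in self._catalogue.values() if tag in m.tags)

repo = MetadataRepository()
repo.register(DatasetMetadata("counterparty_master", "ref-data", {"counterparty", "legal-entity"}))
repo.register(DatasetMetadata("security_master", "ref-data", {"instrument", "pricing"}))
print(repo.discover("counterparty"))  # ['counterparty_master']
```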

This focus still requires strong extract, transform and load (ETL) tools, a powerful database and rules engine, and some notion of enterprise data management (EDM), he explains, although the real key to success is still in the adoption of enterprise data by internal clients. “It is not sufficient to have a good data model if nobody wants to use it. The need for all reference data to be ubiquitous simply translates into providing easier access to entitled data,” he elaborates.
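Dalglish’s point that ubiquity “simply translates into providing easier access to entitled data” suggests a thin entitlement check sitting in front of the enterprise data store. A minimal sketch, with invented users, entitlements and datasets:

```python
# A toy entitlement layer: data is served only if the requesting user
# holds an entitlement for that dataset. All names here are invented.

ENTITLEMENTS = {
    "alice": {"security_master"},
    "bob": {"security_master", "counterparty_master"},
}

DATASETS = {
    "security_master": [{"isin": "US0378331005"}],
    "counterparty_master": [{"lei": "LEI-PLACEHOLDER"}],
}

def fetch(user, dataset):
    """Serve a dataset only to an entitled user."""
    if dataset not in ENTITLEMENTS.get(user, set()):
        raise PermissionError(f"{user} is not entitled to {dataset}")
    return DATASETS[dataset]

print(fetch("bob", "counterparty_master"))
```

The design choice is that entitlement is enforced at the access layer rather than by copying subsets of data out to each team, which is what keeps the data “ubiquitous” without making it uncontrolled.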

To this end, Bank of America Merrill Lynch’s data management team has received the funding and support it needs in order to be able to start down this road in earnest. “There is a real focus from senior management on the importance of getting the data architecture right for the business. Funding has been strong and it is widely recognised that reference and market data management needs to be a true horizontal across all business lines, much like a utility that is run within the firm,” says Dalglish. And he reckons his own institution is not alone in this: “Finally, many firms have been handed the mandate to fix reference data in advance of anticipated regulations.”

Dalglish notes that spending on data management was pulled back after the crisis in 2008, but that 2010 brought a strong focus on the space because of the requirement to keep closer track of counterparty risk exposure and to meet new regulatory requirements. The ongoing ripples from the fall of Lehman likely also played their part in raising the profile of counterparty risk, along with regulatory developments around Basel III and the like.

Dalglish reckons that the prevalent attitude is to treat data as a first-class construct and address problems with legacy systems. “There are also a lot of new vendors getting on the data bandwagon and increasing the range of options out there; 2011 is likely to see this trend continue,” he adds.

On the flip side, the maturing of the EDM practice has meant that a number of the traditional EDM solution vendors are struggling for position, especially in the face of new competitors in the market. “It is likely that we will see further consolidation in this space,” contends Dalglish. “On the plus side, we have a much wider choice of vendors than ever before, though it has become difficult to sort through so many products all claiming to be in the EDM space. There are some fascinating new vendors in the market, including some which are providing corporate hierarchies along with strong data visualisation tools.”

It will certainly be interesting to see who emerges as the winners in the race to gain greater market share in the data management sector, as vendors up their game in light of the increasing number of regulatory-driven projects getting the green light this year. Buzzwords such as data caching, near-time and the ability to be standards-agnostic have been prevalent over recent months and are likely to gain more interest over 2011.
