About a-team Marketing Services
The knowledge platform for the financial technology industry

A-Team Insight Blogs

Bank of America Merrill Lynch Focused on Building a Data Fabric to Support “Multiple Truths”, Says Dalglish


Bank of America Merrill Lynch has moved with the times in recalibrating the goal of its reference data project: the focus is no longer merely on establishing a single gold copy, but on building a robust data fabric that supports all of its downstream users’ requirements, according to Tom Dalglish, director and chief information architect at the firm. The project has also achieved the three Ms of a successful reference data project: management, money and mandate, he explains to Reference Data Review.

“In some sense the single golden copy era has passed and now there is more of a focus on building a data fabric that is able to cope with the business requirement for global distribution and multiple different output formats for downstream systems,” explains Dalglish. “Individual business lines need different data sets and firms must be able to deal with these client-driven multiple truths and manage the raw data accordingly.”
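The “multiple truths” idea Dalglish describes can be illustrated with a minimal Python sketch: the fabric keeps the raw vendor data intact and lets each business line register its own view over it. All names here (record fields, view rules, feed names) are hypothetical, for illustration only, and do not reflect Bank of America Merrill Lynch’s actual systems.

```python
from dataclasses import dataclass

# Hypothetical raw security record as a data fabric might hold it:
# every vendor input is retained rather than collapsed into one gold value.
@dataclass
class RawSecurity:
    isin: str
    issuer: str
    prices: dict  # source name -> price; all raw inputs kept

# Each business line registers its own "truth" over the same raw data.
VIEWS = {
    "risk":    lambda s: {"id": s.isin, "price": max(s.prices.values())},   # conservative
    "trading": lambda s: {"id": s.isin, "price": s.prices["primary_feed"]}, # feed of record
}

def distribute(security: RawSecurity, business_line: str) -> dict:
    """Return the requesting business line's view of the same underlying data."""
    return VIEWS[business_line](security)
```

Two consumers querying the same record legitimately receive different answers, which is precisely the behaviour a single gold copy cannot accommodate.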

Reference Data Review readers should be no strangers to this development, given the comments made by Nick Murphy, data specialist for pricing and evaluations at ING Investment Management, at an industry conference last year on the death of “golden copy” as a term within his own institution. The increased focus on meeting downstream users’ requirements, whether internal or external, is paramount in the reference data industry at the moment, and it has significantly changed the approach to data management projects.

However, this development does not mean the death of golden copy as a concept (even if the term has fallen out of favour in some circles). Dalglish explains: “Golden copy engines surely still have their uses but individual data models may be somewhat less important: firms need a semantics-based repository and a discoverable metadata repository that allows users to navigate their reference data effectively.”
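A discoverable metadata repository of the kind Dalglish describes can be sketched very simply: datasets are registered with semantic tags, and users navigate by meaning rather than by knowing table names in advance. The catalogue entries below are hypothetical examples, not an actual schema.

```python
# Minimal sketch of a discoverable metadata repository: each dataset is
# registered with semantic tags so users can locate reference data by meaning.
CATALOG = {
    "instrument_master": {"tags": {"security", "isin", "listing"}, "owner": "ref-data"},
    "counterparty_book": {"tags": {"legal-entity", "hierarchy"},   "owner": "ref-data"},
}

def discover(*tags: str) -> list:
    """Return the names of datasets whose metadata carries all requested tags."""
    wanted = set(tags)
    return [name for name, meta in CATALOG.items() if wanted <= meta["tags"]]
```

In a real deployment the tags would come from a shared semantic model, which is what makes the repository navigable across business lines rather than a per-team convention.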

This focus still requires strong extract, transform and load (ETL) tools, a powerful database and rules engine, and some notion of enterprise data management (EDM), he explains, although the real key to success is still the adoption of enterprise data by internal clients. “It is not sufficient to have a good data model if nobody wants to use it. The need for all reference data to be ubiquitous simply translates into providing easier access to entitled data,” he elaborates.
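“Easier access to entitled data” implies a thin gate between consumers and the fabric: access is simple, but checked against entitlements on every fetch. The sketch below is a hypothetical illustration of that pattern; the group and dataset names are invented.

```python
# Hypothetical entitlement map: which user groups may read which datasets.
ENTITLEMENTS = {
    "trading": {"instrument_master", "market_prices"},
    "hr":      set(),  # no reference data entitlements
}

def fetch(dataset: str, user_group: str, store: dict):
    """Serve a dataset only to entitled groups: ubiquitous access, but gated."""
    if dataset not in ENTITLEMENTS.get(user_group, set()):
        raise PermissionError(f"{user_group} is not entitled to {dataset}")
    return store[dataset]
```

The point of the pattern is that adoption and control are not in tension: consumers get one uniform access path, and the entitlement check rides along with it.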

To this end, Bank of America Merrill Lynch’s data management-focused team has received the funding and support it needs in order to be able to start down this road in earnest. “There is a real focus from senior management on the importance of getting the data architecture right for the business. Funding has been strong and it is widely recognised that reference and market data management needs to be a true horizontal across all business lines, much like a utility that is run within the firm,” says Dalglish. And he reckons his own institution is not alone in this: “Finally, many firms have been handed the mandate to fix reference data in advance of anticipated regulations.”

Dalglish notes that spending on data management pulled back after the crisis in 2008, but 2010 brought a strong focus on this space because of the requirement to keep closer track of counterparty risk exposure and to meet new regulatory requirements. The ongoing ripples from the fall of Lehman also likely played their part in raising the profile of counterparty risk, along with regulatory developments such as Basel III.

Dalglish reckons that the prevalent attitude is to treat data as a first-class construct and address problems with legacy systems. “There are also a lot of new vendors getting on the data bandwagon and increasing the range of options out there; 2011 is likely to see this trend continue,” he adds.

On the flip side, the maturing of the EDM practice has meant that a number of the traditional EDM solution vendors are struggling for position, especially in the face of new competitors in the market. “It is likely that we will see further consolidation in this space,” contends Dalglish. “On the plus side, we have a much wider choice of vendors than ever before, though it has become difficult to sort through so many products all claiming to be in the EDM space. There are some fascinating new vendors in the market, including some which are providing corporate hierarchies along with strong data visualisation tools.”

It will certainly be interesting to see who emerges as the winners in the race to gain greater market share in the data management sector, as vendors up their game in light of the increasing number of regulatory-driven projects getting the green light this year. Buzzwords such as data caching, near-time and the ability to be standards-agnostic have been prevalent over recent months and are likely to gain more interest over 2011.
