
Bank of America Merrill Lynch Focused on Building a Data Fabric to Support “Multiple Truths”, Says Dalglish

Bank of America Merrill Lynch has moved with the times and recalibrated the goal of its reference data project: the focus is no longer on merely establishing a single golden copy, but rather on building a robust data fabric to support all of its downstream users’ requirements, according to Tom Dalglish, director and chief information architect at the firm. The project has also achieved the three Ms of a successful reference data project: management, money and mandate, he explains to Reference Data Review.

“In some sense the single golden copy era has passed and now there is more of a focus on building a data fabric that is able to cope with the business requirement for global distribution and multiple different output formats for downstream systems,” explains Dalglish. “Individual business lines need different data sets and firms must be able to deal with these client-driven multiple truths and manage the raw data accordingly.”
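By way of illustration, the following is a minimal Python sketch of how a data fabric might serve “multiple truths” from the same managed raw data, with each business line registering its own view. Every name, field and value in it is hypothetical, not a description of the firm’s actual systems.

    # One raw record, many business-line views: a toy "multiple truths" fabric.
    # All identifiers, fields and prices below are invented for illustration.
    RAW_RECORD = {
        "isin": "US0378331005",
        "name": "APPLE INC",
        "prices": {"vendor_a": 189.95, "vendor_b": 189.97},
        "settlement_days": 2,
    }

    def equities_view(rec):
        # The front office wants a display name and its preferred vendor's price.
        return {"id": rec["isin"], "name": rec["name"].title(),
                "price": rec["prices"]["vendor_a"]}

    def risk_view(rec):
        # Risk wants the more conservative (higher) price plus settlement detail.
        return {"id": rec["isin"], "price": max(rec["prices"].values()),
                "settlement_days": rec["settlement_days"]}

    VIEWS = {"equities": equities_view, "risk": risk_view}

    def distribute(rec):
        # Publish one tailored output format per registered downstream consumer.
        return {line: view(rec) for line, view in VIEWS.items()}

    for line, payload in distribute(RAW_RECORD).items():
        print(line, payload)

The point of the sketch is that the raw data is managed once, while each consumer’s “truth” is a derived, client-driven view rather than a competing copy.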

Reference Data Review readers should be no strangers to this development, given the comments made last year by Nick Murphy, data specialist for pricing and evaluations at ING Investment Management, at an industry conference on the death of “golden copy” as a term within his own institution. The increased focus on meeting downstream users’ requirements, whether internal or external, is currently paramount in the reference data industry, and it has significantly changed the approach to data management projects.

However, this development does not mean the death of golden copy as a concept (even if the term has fallen out of favour in some circles). Dalglish explains: “Golden copy engines surely still have their uses but individual data models may be somewhat less important: firms need a semantics-based repository and a discoverable metadata repository that allows users to navigate their reference data effectively.”
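One way to picture such a discoverable metadata repository is as a searchable catalogue that tells users what an attribute means and where it lives before they ever touch the data itself. The minimal Python sketch below assumes invented attributes and source systems purely for illustration.

    # A toy discoverable metadata repository: users search the catalogue
    # to navigate reference data. All entries are hypothetical examples.
    CATALOGUE = [
        {"attribute": "isin", "definition": "ISO 6166 security identifier",
         "domain": "instrument", "source": "security_master"},
        {"attribute": "lei", "definition": "legal entity identifier",
         "domain": "counterparty", "source": "entity_master"},
        {"attribute": "close_price", "definition": "end-of-day evaluated price",
         "domain": "pricing", "source": "pricing_store"},
    ]

    def discover(term):
        # Case-insensitive search across attribute names, definitions and domains.
        term = term.lower()
        return [entry for entry in CATALOGUE
                if term in entry["attribute"]
                or term in entry["definition"].lower()
                or term in entry["domain"]]

    print(discover("entity"))  # finds the LEI entry via its definition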

This focus still requires strong extract, transform and load (ETL) tools, a powerful database and rules engine, and some notion of enterprise data management (EDM), he explains, although the real key to success is still the adoption of enterprise data by internal clients. “It is not sufficient to have a good data model if nobody wants to use it. The need for all reference data to be ubiquitous simply translates into providing easier access to entitled data,” he elaborates.
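As a minimal sketch of what “easier access to entitled data” could look like in practice, the snippet below assumes a simple role-based entitlement model; the roles, fields and values are hypothetical, not the firm’s actual controls.

    # One access point that filters a record down to what the caller may see.
    # Roles, fields and values are invented for illustration.
    ENTITLEMENTS = {
        "trader":  {"isin", "name", "price"},
        "auditor": {"isin", "name", "price", "source", "last_updated"},
    }

    RECORD = {"isin": "US0378331005", "name": "Apple Inc", "price": 189.95,
              "source": "vendor_a", "last_updated": "2011-01-14"}

    def fetch(record, role):
        # Unknown roles get an empty view rather than an error, so every
        # consumer goes through the same, predictable access path.
        allowed = ENTITLEMENTS.get(role, set())
        return {k: v for k, v in record.items() if k in allowed}

    print(fetch(RECORD, "trader"))   # price view without lineage fields
    print(fetch(RECORD, "auditor"))  # full view including source metadata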

To this end, Bank of America Merrill Lynch’s data management-focused team has received the funding and support it needs to start down this road in earnest. “There is a real focus from senior management on the importance of getting the data architecture right for the business. Funding has been strong and it is widely recognised that reference and market data management needs to be a true horizontal across all business lines, much like a utility that is run within the firm,” says Dalglish. And he reckons his own institution is not alone in this: “Finally, many firms have been handed the mandate to fix reference data in advance of anticipated regulations.”

Dalglish notes that spending on data management was pulled back after the 2008 crisis, but that 2010 brought a renewed focus on the space, driven by the requirement to keep closer track of counterparty risk exposure and to meet new regulatory requirements. The ongoing ripples from the collapse of Lehman Brothers likely also played their part in raising the profile of counterparty risk, along with regulatory developments such as Basel III.

Dalglish reckons that the prevalent attitude is to treat data as a first-class construct and to address problems with legacy systems. “There are also a lot of new vendors getting on the data bandwagon and increasing the range of options out there; 2011 is likely to see this trend continue,” he adds.

On the flip side, the maturing of the EDM practice has meant that a number of the traditional EDM solution vendors are struggling for position, especially in the face of new competitors in the market. “It is likely that we will see further consolidation in this space,” contends Dalglish. “On the plus side, we have a much wider choice of vendors than ever before, though it has become difficult to sort through so many products all claiming to be in the EDM space. There are some fascinating new vendors in the market, including some which are providing corporate hierarchies along with strong data visualisation tools.”

It will certainly be interesting to see which vendors emerge as winners in the race for greater market share in the data management sector, as they up their game in light of the increasing number of regulatory-driven projects getting the green light this year. Buzzwords such as data caching, near-time and the ability to be standards-agnostic have been prevalent over recent months and are likely to gain more interest over 2011.
