The leading knowledge platform for the financial technology industry

A-Team Insight Blogs

Distributed Data Management: The Subtle Necessity in the Wake of the Crisis?

By Don DeLoach, CEO of Aleri

Nobody would dispute that these are interesting times. As we sit in the wake of the largest global financial crisis most of us have ever witnessed, we face market conditions like never before. A global recession looms, and the imperatives that come with it are becoming obvious. Banks, and companies in general, are taking action to cut the expense line wherever possible, knowing that they will have to do more with less. And yet the move to electronic trading continues to accelerate, and the corresponding speed and complexity of the markets are on an irreversible trend.

The underlying infrastructure, which is already straining under the load, will certainly have to adapt to the increase in volume and complexity. And in the wake of this crisis, there will surely be a clarion call, both in North America and Europe, for increased risk management applied to operations and for greater regulatory oversight and reporting. It would seem that keeping pace would require investment in more systems and people, hardly in keeping with the overwhelming pressure to conserve cash and reduce expenses. In that light, distributed data management (DDM) offers a very valuable approach for achieving the right outcome.

DDM can be seen as a hybrid of the two prevailing approaches to the problem of consistent, high quality data across a firm. The centralised data management approach, sometimes known as the ‘golden copy’, is one method. It assumes you build one clean master copy of all of the key data in one place and service the silos throughout the firm. This is somewhat of a mothership approach: the goals are indeed worthy, but execution is difficult. The enterprise data management (EDM) approach, sometimes known as the ‘golden model’, instead defines a framework to which all participating silos comply, while the location and management of specific pieces of that data remain distributed throughout the firm. This too can be difficult, but it is more closely aligned with the concepts of DDM, which may emerge as a workable variation of EDM.

DDM is fundamentally an approach to deploying systems that seeks to synchronise, where appropriate, the information held in various standalone silo systems without forcing a firm-wide definition of all data. That said, it can allow a firm-wide definition of some data, so that the consistency needed across the firm can still be addressed.
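As a rough illustration of that last point (the class names, field names and wiring below are invented for this sketch, not drawn from any vendor product), a distributed approach might declare only the handful of fields that carry a firm-wide definition and propagate just those between silos, leaving everything else silo-local:

```python
# Hypothetical DDM sketch: each silo owns its own records; only a
# small, firm-wide-defined set of shared fields is synchronised.

SHARED_FIELDS = {"isin", "counterparty_id"}  # firm-wide definitions

class Silo:
    def __init__(self, name):
        self.name = name
        self.records = {}      # record_id -> dict of fields
        self.subscribers = []  # other silos interested in shared updates

    def update(self, record_id, **fields):
        rec = self.records.setdefault(record_id, {})
        rec.update(fields)
        # Propagate only the shared fields; local fields stay local.
        shared = {k: v for k, v in fields.items() if k in SHARED_FIELDS}
        if shared:
            for silo in self.subscribers:
                silo.receive(record_id, shared)

    def receive(self, record_id, shared_fields):
        self.records.setdefault(record_id, {}).update(shared_fields)

# Usage: trading updates one shared and one local field; the risk
# silo sees the shared field, while the trader's note never leaves
# the trading silo.
trading = Silo("trading")
risk = Silo("risk")
trading.subscribers.append(risk)

trading.update("T1", isin="US0378331005", trader_note="hold")
print(risk.records["T1"])  # {'isin': 'US0378331005'}
```

The point of the sketch is the asymmetry: consistency is enforced only where the firm has agreed a definition, and each silo otherwise remains free to model its own data its own way.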
Think of it as allowing each independent operation to accomplish what it needs to accomplish in the way it needs to, while linking together the information that needs to be linked.

In banks and buy side firms, the idea of centralised data management using a golden copy of key data has been well embraced in concept, but large scale reference data initiatives have proved difficult in execution: cost and schedule overruns and internal political and turf battles prevent, or at least significantly diminish, a firm’s ability to meet these projects’ objectives. Yet the objectives themselves, such as a consistent view of the key data needed to run the firm, remain intact. Accurate, consistent, high quality data plays a critical role in the quality of many of the participating silos, from pre-trade analysis to trade execution, order routing, confirmation, matching, P&L integration and risk assessment. The need doesn’t go away, and distributed data management can play a key, and perhaps more pragmatic, role in addressing it.

And the need to address these issues has never been greater, nor has the need for pragmatic and cost effective solutions. Data volumes in the markets, including trade, quote and IOI volumes, are increasing at an ever greater rate. The sheer amount of information is crushing. The electronification of the markets is a reality that is here to stay. The complexity of these markets, the lack of geographic borders, and the blazing speed at which they move are defining the competitive landscape. Yet one of the main implications of this phenomenon is an increase in the magnitude and velocity of the associated risk. We have seen the volcanic explosion of the financial markets and must now live and respond in the aftermath. But our response must be made in the context of these very high volume, high speed, borderless markets that have been building all along.
This means that a firm’s infrastructure must not only handle these market conditions but also accommodate the increased need to measure and manage risk, as well as respond to the impending call for greater regulatory oversight. So to suggest firms will need to do much more with much less is, in fact, a gross understatement.

DDM will not answer all pain points; there is no magic answer. But complex event processing (CEP) has emerged as a new modality in computing that can enable a response from many of the component systems across the trade lifecycle, along with the ability to scale to current, and perhaps future, market conditions. CEP, and especially CEP solutions architected to maintain ‘state’ so that they can link into and understand the status of individual silo systems, can be a perfect enabling technology for DDM. A well thought out CEP based distributed data management strategy can link various systems together to give a firm the ability to scale and to manage the inter-relationships between silos cost effectively. Moreover, as the regulatory mandates begin to surface, a CEP based DDM architecture should allow a firm to respond without having to throw more money and people at the requirements. In other words, this approach should allow firms to do more with less, which seems a reasonable path to pursue at this point in time.
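What ‘maintaining state’ buys can be caricatured in a few lines (the event shape and the exposure limit below are made up for illustration): the engine keeps running totals across events arriving from different silos, so a condition that no single silo can see, such as combined counterparty exposure, becomes detectable the moment it occurs.

```python
# Minimal caricature of a stateful CEP engine: it accumulates
# per-counterparty exposure from trade events across silos and
# records an alert when a (hypothetical) firm-wide limit is breached.

from collections import defaultdict

class StatefulCEP:
    def __init__(self, exposure_limit):
        self.exposure = defaultdict(float)  # state persists across events
        self.limit = exposure_limit
        self.alerts = []

    def on_trade(self, silo, counterparty, notional):
        self.exposure[counterparty] += notional
        if self.exposure[counterparty] > self.limit:
            # The breach is only visible because state spans silos.
            self.alerts.append((counterparty, self.exposure[counterparty], silo))

engine = StatefulCEP(exposure_limit=1_000_000)
engine.on_trade("equities", "CP-A", 600_000)
engine.on_trade("fx", "CP-A", 500_000)  # combined exposure crosses the limit
print(engine.alerts)  # [('CP-A', 1100000.0, 'fx')]
```

Neither the equities nor the FX silo individually exceeded the limit; only the stateful view across both did, which is precisely the cross-silo linkage a CEP based DDM strategy is meant to provide.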
