
A-Team Insight Blogs

Distributed Data Management: The Subtle Necessity in the Wake of the Crisis?


By Don DeLoach, CEO of Aleri

Nobody would argue that these are interesting times. As we sit in the wake of the largest global financial crisis most of us have ever witnessed, we face market conditions like never before. A looming global recession seems clear, and the imperatives that come with it are becoming obvious. Banks, and companies in general, are taking action to cut the expense line wherever possible, knowing that they will have to do more with less. And yet, the move to electronic trading continues to accelerate, and the corresponding speed and complexity of the markets are on an irreversible trend.

The underlying infrastructure, which is already breaking under the load, will certainly have to adapt to the increase in volume and complexity. And in the wake of this crisis, there will surely be a clarion call, both in North America and Europe, for increased risk management applied to operations and increased requirements for regulatory oversight and reporting. It would seem that the investment required to keep pace would be an increase in systems and people, hardly in keeping with the overwhelming pressure to conserve cash and reduce expenses. In that light, distributed data management (DDM) offers a very valuable approach for achieving the right outcome.

DDM can be thought of as a hybrid of the two prevailing approaches to tackling the problem of consistent, high quality data across a firm. The centralised data management approach, sometimes known as the ‘golden copy’, is one method. This assumes you get one clean master copy of all of the key data in one place and service the silos throughout the firm. This is somewhat of a mothership approach, where the goals are indeed worthy but execution is difficult. The enterprise data management (EDM) approach, sometimes known as a ‘golden model’, suggests a framework to which all participating silos comply, while the location and management of specific pieces of that data remain distributed throughout the firm. This too can be difficult, but it is perhaps more closely aligned to the concepts of DDM, which may emerge as a workable variation of EDM.

DDM is fundamentally an approach to deploying systems that seeks to synchronise, where appropriate, the information that exists in various standalone silo systems without forcing a firm-wide definition of all data. That said, it can allow a firm-wide definition of some data, so that the consistency needed across the firm can be addressed. Think of it as allowing each independent operation to accomplish what it needs to accomplish in the way it needs to act, yet linking together the information that needs to be linked.

In banks and buy side firms, the idea of centralised data management utilising a golden copy of key data has been well received in concept, but large scale reference data initiatives have been difficult to execute: cost and schedule overruns and internal political and turf battles prevent, or at least significantly diminish, the firm’s ability to meet the objectives of these projects. But the objectives of such projects, such as a consistent view of the key data needed to run the firm, remain intact. Accurate, consistent, high quality data plays a critical role in the quality of many of the participating silos, from pre-trade analysis to trade execution to order routing to confirmation, matching, P&L integration, and risk assessment. The need doesn’t go away, but distributed data management can play a key role, and perhaps a more pragmatic one, in addressing those needs.

And the need to address these issues has never been greater, nor has the need for pragmatic and cost effective solutions. Market data volumes, including trade, quote, and indication of interest (IOI) volumes, are increasing at an ever greater rate. The sheer amount of information is crushing. The electronification of the markets is a reality that is here to stay. The complexity of these markets, the lack of geographic borders, and the blazing speed at which they move are defining the competitive landscape.
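To make the DDM idea described above a little more concrete, the sketch below shows one way such synchronisation could work: each silo keeps its own local record format, and only the fields the firm agrees to keep consistent are propagated between silos, rather than every system being forced onto one central golden copy. The class and field names (Silo, DDMSyncLayer, counterparty_rating and so on) are illustrative assumptions for this example, not a reference to any real product or API.

```python
from dataclasses import dataclass, field
from typing import Dict, List


@dataclass
class Silo:
    """A standalone system (e.g. order routing, P&L, risk) with its own local data."""
    name: str
    records: Dict[str, dict] = field(default_factory=dict)

    def apply_shared_update(self, key: str, shared_fields: dict) -> None:
        # Merge only the agreed firm-wide fields; local-only fields stay untouched.
        self.records.setdefault(key, {}).update(shared_fields)


class DDMSyncLayer:
    """Links silos together by fanning out updates to the agreed shared fields."""

    def __init__(self, shared_fields: List[str]):
        self.shared_fields = set(shared_fields)
        self.silos: List[Silo] = []

    def register(self, silo: Silo) -> None:
        self.silos.append(silo)

    def publish(self, source: Silo, key: str, update: dict) -> None:
        # The source silo keeps its full local view of the record...
        source.records.setdefault(key, {}).update(update)
        # ...while only the agreed shared fields are synchronised to the others.
        shared = {k: v for k, v in update.items() if k in self.shared_fields}
        for silo in self.silos:
            if silo is not source:
                silo.apply_shared_update(key, shared)


# Usage: the trading silo learns of a counterparty rating change; the risk and
# P&L silos receive the shared fields without adopting the trading silo's schema.
sync = DDMSyncLayer(shared_fields=["legal_entity_id", "counterparty_rating"])
trading, risk, pnl = Silo("trading"), Silo("risk"), Silo("pnl")
for s in (trading, risk, pnl):
    sync.register(s)

sync.publish(trading, "CPTY-001", {
    "legal_entity_id": "549300EXAMPLE",
    "counterparty_rating": "BBB",
    "desk_limit": 5_000_000,   # local-only field, never leaves the trading silo
})
print(risk.records["CPTY-001"])
# -> {'legal_entity_id': '549300EXAMPLE', 'counterparty_rating': 'BBB'}
```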
And yet, one of the main implications of this phenomenon is an increase in the magnitude and velocity of the associated risk. We have seen the volcanic explosion of the financial markets and will now live and respond in the aftermath. But our response must be made in the context of these very high volume, high speed, borderless markets that have been building all along. This means that a firm’s infrastructure must not only handle these market conditions, but must also accommodate the increased need to measure and manage risk as well as respond to the impending call for increased regulatory oversight. So to suggest that firms will need to do much more with much less is, in fact, a gross understatement.

DDM will not answer all pain points. There is no magic answer. But we have seen complex event processing (CEP) emerge as a new modality in computing that can enable a response from many of the component systems across the trade lifecycle, along with the ability to scale to current and perhaps future market conditions. CEP, and especially CEP solutions architected to maintain ‘state’, such that they can link into and understand the status of individual silo systems, can be a perfect enabling technology for DDM. A well thought out, CEP-based distributed data management strategy can link together various systems to give a firm the ability to scale and to manage the inter-relationships between silos in a cost effective manner. Moreover, as the regulatory mandates begin to surface, a CEP-based DDM architecture should allow a firm to respond without having to throw more money and people at the requirements. In other words, this approach should allow firms to do more with less, which seems a reasonable path to pursue at this point in time.
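As a rough illustration of what maintaining ‘state’ means in this context, the sketch below assumes a home-grown event loop rather than any particular CEP product, with all names and thresholds invented for the example. It shows a processor that consumes events from two silo systems, keeps a running per-counterparty exposure, tracks unconfirmed trades, and raises an alert when a limit is crossed.

```python
from collections import defaultdict
from typing import Dict, Iterable, List, NamedTuple


class Event(NamedTuple):
    source: str          # silo that produced the event, e.g. "execution" or "confirmation"
    counterparty: str
    notional: float      # signed: positive increases exposure, negative reduces it


class StatefulCEP:
    """Keeps running state across events so it can reflect the status of each silo."""

    def __init__(self, exposure_limit: float):
        self.exposure_limit = exposure_limit
        self.exposure: Dict[str, float] = defaultdict(float)
        self.unconfirmed: Dict[str, int] = defaultdict(int)  # executions awaiting confirmation

    def process(self, events: Iterable[Event]) -> List[str]:
        alerts: List[str] = []
        for ev in events:
            if ev.source == "execution":
                self.exposure[ev.counterparty] += ev.notional
                self.unconfirmed[ev.counterparty] += 1
            elif ev.source == "confirmation":
                self.unconfirmed[ev.counterparty] -= 1
            if self.exposure[ev.counterparty] > self.exposure_limit:
                alerts.append(
                    f"limit breach: {ev.counterparty} exposure="
                    f"{self.exposure[ev.counterparty]:,.0f}, "
                    f"unconfirmed trades={self.unconfirmed[ev.counterparty]}"
                )
        return alerts


# Usage: events from two different silos are correlated into one running view.
cep = StatefulCEP(exposure_limit=10_000_000)
print(cep.process([
    Event("execution", "CPTY-001", 6_000_000),
    Event("execution", "CPTY-001", 5_000_000),   # pushes exposure past the limit
    Event("confirmation", "CPTY-001", 0),
]))
```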

