
A-Team Insight Blogs

Distributed Data Management: The Subtle Necessity in the Wake of the Crisis?


By Don DeLoach, CEO of Aleri

Nobody would argue that these are interesting times. As we sit in the wake of the largest global financial crisis most of us have ever witnessed, we face market conditions like never before. A looming global recession seems clear, and the imperatives that come with that are becoming obvious. Banks, and companies in general, are taking action to cut the expense line wherever possible, knowing that they will have to do more with less. And yet, the move to electronic trading continues to accelerate, and the corresponding speed and complexity of the markets are on an irreversible trend.

The underlying infrastructure, which is already breaking under the load, will certainly have to adapt to the increase in volume and complexity. And in the wake of this crisis, there will surely be a clarion call, both in North America and Europe, for increased risk management applied to operations and increased requirements for regulatory oversight and reporting. It would seem that keeping pace would require more systems and more people, hardly in keeping with the overwhelming pressure to conserve cash and reduce expenses. In that light, distributed data management (DDM) offers a very valuable approach for achieving the right outcome.

DDM can be seen as a hybrid of the two prevailing approaches to tackling the problem of consistent, high quality data across a firm. The centralised data management approach, sometimes known as the 'golden copy', is one method. It assumes you get one clean master copy of all of the key data in one place and service the silos throughout the firm. This is something of a mothership approach: the goals are worthy, but execution is difficult. The enterprise data approach, sometimes known as a 'golden model', suggests a framework to which all participating silos comply, while the location and management of specific pieces of that data remain distributed throughout the firm. This too can be difficult, but it is more closely aligned with the concepts of DDM, which may emerge as a workable variation of enterprise data management (EDM).

DDM is fundamentally an approach to deploying systems that seeks to synchronise, where appropriate, the information that exists in various standalone silo systems without forcing a firm-wide definition of all data. That said, it can allow a firm-wide definition of some data, so that the consistency needed across the firm can be addressed. Think of it as allowing each independent operation to accomplish what it needs to accomplish in the way it needs to act, while linking together the information that needs to be linked.

In banks and buy side firms, the idea of centralised data management utilising a golden copy of key data has been well embraced in concept, but large scale reference data initiatives have proved difficult in execution, where cost and schedule overruns and internal political and turf battles prevent, or at least significantly diminish, the firm's ability to meet the objectives of these projects. But the objectives of such projects, such as a consistent view of the key data needed to run the firm, remain intact. Accurate, consistent, high quality data plays a critical role in the quality of many of the participating silos, from pre-trade analysis to trade execution to order routing to confirmation, matching, P&L integration, and risk assessment. The need doesn't go away, but distributed data management can play a key role, and perhaps a more pragmatic one, in addressing it.

And the need to address these issues has never been greater, nor has the need for pragmatic and cost effective solutions. Data volumes in the markets, including trade, quote, and IOI volumes, are increasing at an ever greater rate. The sheer amount of information is crushing. The electronification of the markets is a reality that is here to stay. The complexity of these markets, the lack of geographic borders, and the blazing speed at which they move are defining the competitive landscape.
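The article does not prescribe an implementation, but the core idea of DDM described above, linking silo records on shared identifiers and keeping only agreed fields consistent while leaving local schemas untouched, can be sketched in a few lines. The sketch below is purely illustrative: the Silo and DistributedDataManager classes, the shared_fields set, and the entity keys are hypothetical names invented here, not part of any product mentioned in this piece.

```python
# Minimal, hypothetical sketch of the DDM idea: each silo keeps its own local
# record format; a thin synchronisation layer links records via shared entity
# keys and propagates only the fields the firm has agreed to keep consistent.

from dataclasses import dataclass, field


@dataclass
class Silo:
    """A standalone system with its own local record format and local ids."""
    name: str
    records: dict = field(default_factory=dict)  # keyed by local id

    def upsert(self, local_id: str, record: dict) -> None:
        self.records.setdefault(local_id, {}).update(record)


class DistributedDataManager:
    """Synchronises only the agreed shared fields across registered silos,
    leaving every other field under local control."""

    def __init__(self, shared_fields: set):
        self.shared_fields = shared_fields
        # entity key -> list of (silo, local_id) pairs referring to the same entity
        self.links: dict = {}

    def link(self, entity_key: str, silo: Silo, local_id: str) -> None:
        self.links.setdefault(entity_key, []).append((silo, local_id))

    def publish(self, entity_key: str, update: dict) -> None:
        """Push an update for a shared entity to every linked silo,
        restricted to the firm-wide shared fields."""
        shared_update = {k: v for k, v in update.items() if k in self.shared_fields}
        for silo, local_id in self.links.get(entity_key, []):
            silo.upsert(local_id, shared_update)


if __name__ == "__main__":
    # Two silos hold the same counterparty under different local identifiers.
    trading = Silo("trading")
    risk = Silo("risk")

    ddm = DistributedDataManager(shared_fields={"legal_name", "rating"})
    ddm.link("CPTY-001", trading, "T-9931")
    ddm.link("CPTY-001", risk, "RK-77")

    # The trading-limit field stays local to whoever owns it; only the shared
    # fields are synchronised across the linked silos.
    ddm.publish("CPTY-001", {"legal_name": "Acme Bank plc", "rating": "A-",
                             "trading_limit": 5_000_000})
    print(trading.records["T-9931"])  # {'legal_name': 'Acme Bank plc', 'rating': 'A-'}
    print(risk.records["RK-77"])
```

The point of the sketch is the division of labour: each silo decides what it stores and how, while the synchronisation layer owns only the linkage and the small set of firm-wide fields.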
And yet, one of the main implications of this phenomenon is an increase in the magnitude and velocity of the associated risk. We have seen the volcanic explosion of the financial markets and will now live and respond in the aftermath. But our response must be in the context of these very high volume, high speed, borderless markets that have been building all along. This means that the infrastructure of a firm must not only handle these market conditions, but must now also accommodate the increased need to measure and manage risk and respond to the impending call for increased regulatory oversight. So to suggest that firms will need to do much more with much less is, in fact, a gross understatement.

DDM will not answer all pain points; there is no magic answer. But we have seen complex event processing (CEP) emerge as a new modality in computing that can enable a response from many of the component systems across the trade lifecycle, along with the ability to scale to current and perhaps future market conditions. CEP, and especially CEP solutions architected to maintain 'state', so that they can link into and understand the status of individual silo systems, can be a perfect enabling technology for DDM. A well thought out, CEP-based distributed data management strategy can link together various systems to give a firm the ability to scale and to manage the inter-relationships between silos in a cost effective manner. Moreover, as the regulatory mandates begin to surface, a CEP-based DDM architecture should allow a firm to respond without having to throw more money and people at the requirements. In other words, this approach should allow firms to do more with less, which seems to be a reasonable path to pursue at this point in time.
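As a purely illustrative companion to the point about stateful CEP, the toy processor below shows one way such a component might behave: it keeps per-entity state across update events arriving from silo systems, emits a synchronisation event when a shared field changes, and raises an alert when silos disagree, the kind of signal that could feed risk or regulatory reporting. This is not Aleri's product or any specific CEP engine; the class, event shapes, and field names are assumptions made for the example.

```python
# Hypothetical, stateful event-processing sketch in the spirit of CEP-based DDM:
# track the last value each silo reported for a shared field, propagate changes,
# and flag divergence between silos.

from collections import defaultdict
from typing import Callable


class StatefulSyncProcessor:
    def __init__(self, shared_fields: set, emit: Callable[[dict], None]):
        self.shared_fields = shared_fields
        self.emit = emit
        # state[entity][field] -> {silo: last seen value}
        self.state = defaultdict(lambda: defaultdict(dict))

    def on_event(self, event: dict) -> None:
        """Handle one update event shaped as
        {'silo': ..., 'entity': ..., 'field': ..., 'value': ...}."""
        if event["field"] not in self.shared_fields:
            return  # locally owned field: not the DDM layer's concern
        seen = self.state[event["entity"]][event["field"]]
        seen[event["silo"]] = event["value"]

        # Propagate the new value to the other linked silos.
        self.emit({"type": "sync", "entity": event["entity"],
                   "field": event["field"], "value": event["value"],
                   "targets": [s for s in seen if s != event["silo"]]})

        # Flag divergence between silos so it can be investigated or reported.
        if len(set(seen.values())) > 1:
            self.emit({"type": "divergence_alert", "entity": event["entity"],
                       "field": event["field"], "values": dict(seen)})


if __name__ == "__main__":
    proc = StatefulSyncProcessor(shared_fields={"rating"}, emit=print)
    proc.on_event({"silo": "trading", "entity": "CPTY-001",
                   "field": "rating", "value": "A-"})
    proc.on_event({"silo": "risk", "entity": "CPTY-001",
                   "field": "rating", "value": "BBB+"})  # emits a divergence alert
```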
