A-Team Insight Blogs

Distributed Data Management: The Subtle Necessity in the Wake of the Crisis?

By Don DeLoach, CEO of Aleri

Nobody would dispute that these are interesting times. As we sit in the wake of the largest global financial crisis most of us have ever witnessed, we face market conditions like never before. A looming global recession seems clear, and the imperatives that come with it are becoming obvious. Banks, and companies in general, are taking action to cut the expense line wherever possible, knowing that they will have to do more with less. And yet the move to electronic trading continues to accelerate, and the corresponding speed and complexity of the markets are on an irreversible upward trend.

The underlying infrastructure, which is already breaking under the load, will certainly have to adapt to the increase in volume and complexity. And in the wake of this crisis, there will surely be a clarion call, both in North America and Europe, for increased risk management applied to operations and increased requirements for regulatory oversight and reporting. It would seem that keeping pace would require an investment in more systems and more people, hardly in keeping with the overwhelming pressure to conserve cash and reduce expenses. In that light, distributed data management (DDM) offers a very valuable approach for achieving the right outcome.

DDM can be thought of as a hybrid of the two prevailing approaches to tackling the problem of consistent, high quality data across a firm. The centralised data management approach, sometimes known as the ‘golden copy’, is one method. It assumes you build one clean master copy of all of the key data in one place and service the silos throughout the firm. This is something of a mothership approach: the goals are worthy, but execution is difficult. The enterprise data management (EDM) approach, sometimes known as the ‘golden model’, suggests a framework with which all participating silos comply, while the location and management of specific pieces of that data remain distributed throughout the firm. This too can be difficult, but it is more closely aligned with the concepts of DDM, which may emerge as a workable variation of EDM. DDM is fundamentally an approach to deploying systems that seeks to synchronise, where appropriate, the information that exists in various standalone silo systems without forcing a firm-wide definition of all data. That said, it can allow a firm-wide definition of some data, so that the consistency needed across the firm can still be achieved. Think of it as allowing each independent operation to accomplish what it needs to accomplish in the way it needs to act, yet linking together the information that needs to be linked.

In banks and buy side firms, the idea of centralised data management using a golden copy of key data has been embraced in concept, but large scale reference data initiatives have proven difficult to execute: cost and schedule overruns, and internal political and turf battles, prevent or at least significantly diminish the firm’s ability to meet the objectives of these projects. Yet the objectives of such projects, such as a consistent view of the key data needed to run the firm, remain intact. Accurate, consistent, high quality data plays a critical role in the quality of many of the participating silos, from pre-trade analysis to trade execution to order routing to confirmation, matching, P&L integration, and risk assessment. The need does not go away, and distributed data management can play a key, and perhaps more pragmatic, role in addressing it.

Nor has the need to address these issues, or the need for pragmatic and cost effective solutions, ever been greater. Data volumes in the markets, including trade, quote, and indication of interest (IOI) volumes, are increasing at an ever greater rate. The sheer amount of information is crushing. The electronification of the markets is a reality that is here to stay. The complexity of these markets, the lack of geographic borders, and the blazing speed at which they move are defining the competitive landscape.
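To make the DDM idea concrete, the sketch below is purely illustrative: the names (Silo, DistributedDataManager, SHARED_FIELDS) are hypothetical rather than any real product API. It shows one way a small set of firm-wide fields could be synchronised between silos while each silo keeps its own local fields untouched.

```python
# Illustrative sketch only: a handful of agreed firm-wide fields are
# propagated between silos; everything else stays local to each silo.

SHARED_FIELDS = {"isin", "issuer", "rating"}  # hypothetical firm-wide subset

class Silo:
    """A standalone system that owns its own records and local fields."""
    def __init__(self, name):
        self.name = name
        self.records = {}  # keyed by security identifier

    def upsert(self, key, **fields):
        self.records.setdefault(key, {}).update(fields)

class DistributedDataManager:
    """Propagates only the agreed shared fields between registered silos."""
    def __init__(self, silos):
        self.silos = silos

    def publish(self, source, key):
        record = source.records[key]
        shared = {f: v for f, v in record.items() if f in SHARED_FIELDS}
        for silo in self.silos:
            if silo is not source:
                silo.upsert(key, **shared)  # local-only fields are untouched

# Usage: the risk silo receives the new rating but keeps its own risk fields.
front_office = Silo("front_office")
risk = Silo("risk")
ddm = DistributedDataManager([front_office, risk])

risk.upsert("XS0123456789", var_limit=1000000)
front_office.upsert("XS0123456789", isin="XS0123456789", rating="AA", trader_note="watch")
ddm.publish(front_office, "XS0123456789")
print(risk.records["XS0123456789"])  # shared fields arrive; var_limit survives
```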
And yet, one of the main implications of this electronification is an increase in the magnitude and velocity of the associated risk. We have seen the volcanic explosion of the financial markets and will now live and respond in the aftermath. But our response must be made in the context of these very high volume, high speed, borderless markets that have been building all along. This means that the infrastructure of a firm must not only handle these market conditions, but must now also accommodate the increased need to measure and manage risk as well as respond to the impending call for increased regulatory oversight. So to suggest firms will need to do much more with much less is, in fact, a gross understatement.

DDM will not answer every pain point; there is no magic answer. But we have seen complex event processing (CEP) emerge as a new modality in computing that can enable a response from many of the component systems across the trade lifecycle, along with the ability to scale to current and perhaps future market conditions. CEP, and especially CEP solutions architected to maintain ‘state’, such that they can link into and understand the status of individual silo systems, can be a perfect enabling technology for DDM. A well thought out, CEP-based distributed data management strategy can link together various systems to give a firm the ability to scale and to manage the inter-relationships between silos in a cost effective manner. Moreover, as the regulatory mandates begin to surface, a CEP-based DDM architecture should allow a firm to respond without having to throw more money and people at the requirements. In other words, this approach should allow firms to do more with less, which seems a reasonable path to pursue at this point in time.
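As a rough illustration of what ‘maintaining state’ means in this context, the following sketch uses hypothetical names (OrderStateEngine, invented event fields) rather than any vendor’s CEP API. It keeps a running picture of each order so that events arriving from separate silo systems can be correlated as they stream in.

```python
# Illustrative sketch of a stateful, CEP-style processor: the engine, not any
# single silo, holds the correlated state across the order's lifecycle.

from collections import defaultdict

class OrderStateEngine:
    def __init__(self, on_filled):
        # running state per order_id, built up from events across silos
        self.state = defaultdict(lambda: {"qty": 0, "filled": 0})
        self.on_filled = on_filled  # callback fired for the derived event

    def process(self, event):
        order = self.state[event["order_id"]]
        if event["type"] == "new_order":        # e.g. from the OMS silo
            order["qty"] = event["qty"]
        elif event["type"] == "execution":      # e.g. from the execution silo
            order["filled"] += event["qty"]
            if order["qty"] and order["filled"] >= order["qty"]:
                # derived event: only the cross-silo state reveals this
                self.on_filled(event["order_id"], order)

engine = OrderStateEngine(lambda oid, o: print(oid, "fully filled:", o))
engine.process({"type": "new_order", "order_id": "A1", "qty": 100})
engine.process({"type": "execution", "order_id": "A1", "qty": 60})
engine.process({"type": "execution", "order_id": "A1", "qty": 40})
```

A production CEP engine would typically express this correlation as declarative queries over streaming windows; the point of the sketch is only that the stateful engine sits across the silos and derives events none of them could produce alone.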
