The leading knowledge platform for the financial technology industry

A-Team Insight Blogs

Distributed Data Management: The Subtle Necessity in the Wake of the Crisis?

By Don DeLoach, CEO of Aleri

Nobody would dispute that these are interesting times. As we sit in the wake of the largest global financial crisis most of us have ever witnessed, we face market conditions like never before. A looming global recession seems clear, and the imperatives that come with it are becoming obvious. Banks, and companies in general, are taking action to cut the expense line wherever possible, knowing that they will have to do more with less. And yet the move to electronic trading continues to accelerate, and the corresponding speed and complexity of the markets are on an irreversible trend.

The underlying infrastructure, which is already breaking under the load, will certainly have to adapt to the increase in volume and complexity. And in the wake of this crisis, there will surely be a clarion call, in both North America and Europe, for increased risk management applied to operations and for greater regulatory oversight and reporting. It would seem that keeping pace would require investment in more systems and people, hardly in keeping with the overwhelming pressure to conserve cash and reduce expenses. In that light, distributed data management (DDM) offers a very valuable approach for achieving the right outcome.

DDM can be seen as a hybrid of the two prevailing approaches to the problem of consistent, high quality data across a firm. The centralised data management approach, sometimes known as the ‘golden copy’, is one method. It assumes you maintain one clean master copy of all of the key data in one place and service the silos throughout the firm. This is somewhat of a mothership approach: the goals are indeed worthy, but execution is difficult. The enterprise data management (EDM) approach, sometimes known as the ‘golden model’, instead defines a framework to which all participating silos comply, while the specific pieces of data remain located and managed throughout the firm. This too can be difficult, but it is more closely aligned with the concepts of DDM, which may emerge as a workable variation of EDM.

DDM is fundamentally an approach to deploying systems that seeks to synchronise, where appropriate, the information held in various standalone silo systems without forcing a firm-wide definition of all data. That said, it can allow a firm-wide definition of some data, so that the consistency needed across the firm is still addressed.
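The idea of synchronising only an agreed subset of data, while each silo keeps its own local model, can be illustrated with a minimal sketch. All names here (the shared fields, the silos, the bus) are hypothetical and chosen purely for illustration; they do not describe any particular vendor's product.

```python
# Illustrative DDM sketch: only an agreed set of shared fields is
# synchronised across silos; everything else stays local to each silo.

SHARED_FIELDS = {"lei", "rating"}  # firm-wide definitions apply only here


class Silo:
    def __init__(self, name):
        self.name = name
        self.records = {}  # local schema, owned and managed locally

    def apply_update(self, entity_id, field, value):
        # accept synchronised updates only for agreed shared fields
        if field in SHARED_FIELDS:
            self.records.setdefault(entity_id, {})[field] = value


class SyncBus:
    """Propagates shared-field changes to every silo except the source."""
    def __init__(self):
        self.subscribers = []

    def subscribe(self, silo):
        self.subscribers.append(silo)

    def publish(self, source, entity_id, field, value):
        if field not in SHARED_FIELDS:
            return  # non-shared data never leaves the owning silo
        for silo in self.subscribers:
            if silo is not source:
                silo.apply_update(entity_id, field, value)


bus = SyncBus()
risk, settlement = Silo("risk"), Silo("settlement")
bus.subscribe(risk)
bus.subscribe(settlement)

# the risk silo corrects a counterparty identifier; the change propagates
bus.publish(risk, "CPTY-42", "lei", "5493001KQW6DM7KEDE52")
print(settlement.records["CPTY-42"]["lei"])
```

The design point is the `SHARED_FIELDS` boundary: it is the ‘firm-wide definition of some data’ described above, while each silo remains free to structure the rest of its records however it needs.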
Think of it as allowing each independent operation to accomplish what it needs to accomplish in the way it needs to act, while linking together the information that needs to be linked.

In banks and buy side firms, the idea of centralised data management using a golden copy of key data has been well embraced in concept, but large scale reference data initiatives have proved difficult in execution: cost and schedule overruns and internal political and turf battles prevent, or at least significantly diminish, a firm’s ability to meet the objectives of these projects. Yet the objectives themselves, such as a consistent view of the key data needed to run the firm, remain intact. Accurate, consistent, high quality data plays a critical role in the quality of many of the participating silos, from pre-trade analysis to trade execution, order routing, confirmation, matching, P&L integration and risk assessment. The need doesn’t go away, and distributed data management can play a key, and perhaps more pragmatic, role in addressing it.

And the need to address these issues has never been greater, nor has the need for pragmatic and cost effective solutions. Data volumes in the markets, trade volumes, quote volumes and IOI volumes alike, are increasing at an ever greater rate. The sheer amount of information is crushing. The electronification of the markets is a reality that is here to stay. The complexity of these markets, the lack of geographic borders and the blazing speed at which they move are defining the competitive landscape. Yet one of the main implications of this phenomenon is an increase in the magnitude and velocity of the associated risk. We have seen the volcanic explosion of the financial markets and will now live and respond in the aftermath. But our response must be in the context of these very high volume, high speed, borderless markets that have been building all along.
This means that the infrastructure of a firm must not only handle these market conditions, but must now also accommodate the increased need to measure and manage risk, as well as respond to the impending call for increased regulatory oversight. So to suggest firms will need to do much more with much less is, in fact, a gross understatement.

DDM will not answer all pain points; there is no magic answer. But we have seen complex event processing (CEP) emerge as a new modality in computing that can enable a response from many of the component systems across the trade lifecycle, along with the ability to scale to current and perhaps future market conditions. CEP, and especially CEP solutions architected to maintain ‘state’, so that they can link into and understand the status of individual silo systems, can be a perfect enabling technology for DDM. A well thought out, CEP-based distributed data management strategy can link together various systems to give a firm the ability to scale and to manage the inter-relationships between silos in a cost effective manner. Moreover, as the regulatory mandates begin to surface, a CEP-based DDM architecture should allow a firm to respond without having to throw more money and people at the requirements. In other words, this approach should allow firms to do more with less, which seems a reasonable path to pursue at this point in time.
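What ‘maintaining state’ means in practice can be sketched in a few lines. The example below is a hypothetical, stripped-down illustration, not any vendor’s CEP engine: a correlator consumes events from two independent silo systems (trade execution and confirmation), keeps just enough state to link them, and flags trades that remain unconfirmed or mismatched.

```python
# Hypothetical stateful CEP-style correlator: it links events from two
# independent silos (execution and confirmation) and flags breaks.

class TradeLifecycleMonitor:
    def __init__(self):
        self.pending = {}   # trade_id -> execution event awaiting confirmation
        self.breaks = []    # trade_ids with missing or mismatched confirmations

    def on_event(self, event):
        kind, trade_id = event["type"], event["trade_id"]
        if kind == "execution":
            # remember the execution until its confirmation arrives
            self.pending[trade_id] = event
        elif kind == "confirmation":
            execution = self.pending.pop(trade_id, None)
            if execution is None or execution["qty"] != event["qty"]:
                self.breaks.append(trade_id)  # unmatched or mismatched

    def unconfirmed(self):
        return list(self.pending)


monitor = TradeLifecycleMonitor()
monitor.on_event({"type": "execution", "trade_id": "T1", "qty": 100})
monitor.on_event({"type": "execution", "trade_id": "T2", "qty": 250})
monitor.on_event({"type": "confirmation", "trade_id": "T1", "qty": 100})
print(monitor.unconfirmed())  # T2 is still awaiting confirmation
```

The state held in `pending` is the point: because the correlator remembers what each silo last reported, it can surface risk and reporting exceptions continuously, rather than waiting for an end-of-day reconciliation batch.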
