About a-team Marketing Services
The knowledge platform for the financial technology industry

A-Team Insight Blogs

Opinion: From Lehman to Amazon – Rethinking Financial Data Management


By Richard Petti, CEO, Asset Control

The tide of regulation is rising inexorably, swamping the financial services sector with ever more prescriptive disclosure requirements. From Dodd-Frank to Basel III and Solvency II, the regulatory response to the enduring financial crisis continues to evolve, but the direction of change is constant.

It’s widely acknowledged that the sector’s pre-crisis data architecture failed to support the management of financial risks. Banks’ inability to report risk exposures and identify concentrations quickly and accurately undermined the stability of the system and left financial institutions vulnerable. Institutions and regulators have since looked to strengthen IT infrastructure to help mitigate risk.

But the belief that the sector’s problems can be solved by yet more data misses the point. It’s time to rethink financial data management.

The solution lies not in mighty infrastructure and huge repositories of data, but in treating the management of market and risk information as a matter of logistics. By shifting the focus away from accumulation and onto delivery, the development of a data supply chain model can help ensure that financial services organizations receive accurate risk information, reliably and on time – all the time.

Organizations need to develop a framework that begins with the ends, not the means. Progress requires the ‘Amazon-ization’ of financial data management, where activity is focused on ensuring the right package of data is delivered to the customer’s inbox at the right time – with everything working backwards from that primary objective.

Need for speed

The Basel Committee’s Principles for Effective Risk Data Aggregation and Risk Reporting (BCBS 239), part of the post-crisis Basel III reform agenda, require banks to impose strong data governance to assure the secure organization, assembly and production of risk information. The principles, similar in spirit to Dodd-Frank in the US, begin with traditional notions of soundness: risk reporting should be transparent, and the sourcing, validation, cleansing and delivery of data should be tightly controlled and auditable. But the new regulatory model also makes timeliness and adaptability fundamental requirements. This is a significant change from Basel II, which addressed the formulation of risk models in detail but, in retrospect, failed to identify the need for speed.

The data supply chain approach is a challenge to incumbent models that largely focus on the aggregation and organization of huge volumes of data. In a dynamic environment complicated by the seemingly boundless diversity of financial instruments, multiple data sources, dozens of downstream systems and myriad reporting requirements, institutions have to look beyond ‘big data’ to the dynamics of their organization and information needs.

They want relevant, consistent and accurate data that can provide them with reliable positions, based on the timely and appropriate delivery of reference, pricing and volatility profiles, under consistently defined risk scenarios. End-of-day reporting is the minimum standard – in a global marketplace stretched across multiple time zones, several daily snapshots are needed. The challenge is monumental – but fighting size with size is impractical.

The Solution

The old approach of building a vast bucket of ‘golden data’ is a static concept that’s no longer fit for purpose. On its own, the golden data set is worthless. The value lies not in the volume, but in how it is put to use. To derive maximum value, financial data management systems need reorienting: dynamic concepts must replace static frameworks.

The core components of data management – capture, validation and delivery – remain the same. But to regard data aggregation and cleansing as the primary objective and justification of the system is to start at the wrong place.

The process should begin from the end-user’s perspective, with Chief Data Officers considering two key questions: who am I delivering this data to, and under what Service-Level Agreement (SLA)? By adopting an SLA-led approach and focusing on the end-game of delivery, it becomes much easier to work backwards and align performance (and costs) with business needs. With the overarching SLA as the start-point, data management becomes a logistics exercise whose primary objective is to get the right data to the right people in time to meet their local SLAs – in effect, a data supply chain.
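In code, the SLA-led view can be sketched very simply: each downstream consumer has a deadline, and the platform measures itself against those deadlines rather than against data volume. The following is a minimal illustration only – the class names, datasets and cut-off times are hypothetical, not drawn from any specific product.

```python
from dataclasses import dataclass
from datetime import time

@dataclass
class DataSLA:
    consumer: str           # downstream system or team, e.g. a risk engine
    dataset: str            # the package of data they are owed
    deadline: time          # local cut-off by which it must arrive
    snapshots_per_day: int  # global firms may need several daily snapshots

def breaches(slas, delivered_at):
    """Return the SLAs that were missed, given actual delivery times.

    A dataset that never arrived counts as a breach (time.max > any deadline).
    """
    return [s for s in slas
            if delivered_at.get((s.consumer, s.dataset), time.max) > s.deadline]

slas = [
    DataSLA("risk_engine", "eod_prices_emea", time(18, 30), 3),
    DataSLA("regulatory_reporting", "positions_us", time(22, 0), 1),
]
# One delivery was late, the other never happened: both are breaches.
delivered = {("risk_engine", "eod_prices_emea"): time(19, 5)}
late = breaches(slas, delivered)
print([s.consumer for s in late])  # ['risk_engine', 'regulatory_reporting']
```

The point of the sketch is the direction of measurement: success is defined at the consumer’s inbox, and everything upstream is judged by whether it served that deadline.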

The Amazon model of delivery does not start at the warehouse – it begins, as it ends, with the customer. The entire supply chain is optimized to deliver the best possible customer experience. Financial data management must adopt the same model.

The second, critical shift in perspective is to recognize that change is a constant. In a vibrant market, products, processes and organizations are always subject to innovation and evolution. Financial data models need to be dynamic, adjusting quickly to capture new products created to solve client needs in new ways. The patterns of information distribution need to evolve too, as organizations adapt to changing market opportunities across asset classes and geography.

Increasingly, proactive organizations are deploying strategies that do indeed regard data management as a dynamic logistics activity. The most effective have placed a data management platform at the center of the complex multi-source, multi-system distribution process – taking inputs from vendor feeds and departmental sources, testing them for quality and routing them through the platform to downstream systems and users. As data flows through the system, the platform provides the framework for auditing activity and monitoring performance against critical SLAs.
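The hub pattern described above – validate once at the center, fan out to subscribers, and record every delivery for audit – can be sketched in a few lines. This is an illustrative toy, not a real platform API; all names and the validation rule are hypothetical.

```python
from datetime import datetime, timezone

class DataPlatform:
    """Toy central hub: ingest, validate, route, and audit data flows."""

    def __init__(self):
        self.routes = {}     # dataset name -> list of downstream consumers
        self.audit_log = []  # (timestamp, dataset, consumer, status)

    def subscribe(self, dataset, consumer):
        self.routes.setdefault(dataset, []).append(consumer)

    def ingest(self, dataset, records, validate):
        """Validate inbound records once, then fan out to every subscriber."""
        clean = [r for r in records if validate(r)]
        for consumer in self.routes.get(dataset, []):
            consumer(clean)
            self.audit_log.append((datetime.now(timezone.utc), dataset,
                                   getattr(consumer, "__name__", str(consumer)),
                                   "delivered"))
        return len(records) - len(clean)  # number of records rejected at the hub

platform = DataPlatform()
received = []
platform.subscribe("vendor_prices", received.extend)
rejected = platform.ingest(
    "vendor_prices",
    [{"isin": "US0378331005", "price": 189.3},
     {"isin": "US0378331005", "price": -1.0}],   # bad tick, filtered out
    validate=lambda r: r["price"] > 0,
)
print(rejected, received)
```

Because quality checks and the audit trail live in one place, every downstream system receives the same cleansed view, and SLA monitoring becomes a query over the hub’s log rather than a reconciliation across dozens of feeds.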

Such systems simplify the technical challenges significantly. Because they eliminate potentially hundreds of point-to-point connections, they make the administration, control and delivery of reference, market and risk data much more manageable. Moreover, workflows become more efficient, enabling organizations to save time and money. Crucially, the centralized approach built around the effective development of a data supply chain is helping companies mitigate risk and meet the growing demands of regulatory compliance.
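The arithmetic behind the “hundreds of connections” claim is straightforward: wiring every source directly to every consumer scales multiplicatively, while a hub scales additively. The feed and system counts below are purely illustrative.

```python
def point_to_point(n_sources, m_consumers):
    # Every source wired to every consumer: n * m interfaces to build and maintain.
    return n_sources * m_consumers

def hub_and_spoke(n_sources, m_consumers):
    # Each source and each consumer connects once, to the central platform.
    return n_sources + m_consumers

# e.g. 20 vendor/departmental feeds and 50 downstream systems
print(point_to_point(20, 50))  # 1000 interfaces
print(hub_and_spoke(20, 50))   # 70 interfaces
```

Even at modest scale the difference is an order of magnitude, which is why centralizing the supply chain pays for itself in administration and control alone.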

Although we may have survived the consequences of regulatory and information failures that characterized the financial crisis, organizations cannot afford to be complacent. A reliance on inefficient legacy models will no longer suffice.

To progress, Chief Risk Officers and Chief Data Officers must drive the reconfiguration of financial data management – and establish it as a logistical exercise. By adopting an SLA-driven approach, the sector can make the journey from Lehman to Amazon. It’s time, quite literally, to deliver.

