Opinion: Data Management – Keeping the Financial Market Airborne

By Martijn Groot, VP Product Strategy, Asset Control

In this new era of stress testing, how many financial institutions risk being grounded by the financial equivalent of air traffic control for failing to meet challenging new risk requirements? The production of risk and stress data is no longer a one-off risk management exercise – it has to be anchored in core operations – yet few financial institutions have amended their data management strategies accordingly. The truth is that new data reporting requirements are the final push for organisations that have resisted the need to radically overhaul their strategies. From stress testing to evolving client demands, institutions must gain control over the huge volumes of data – from historical to real time – now required both operationally in the back office and by the mid-office risk team.

Financial institutions can no longer rely on a clearly defined, sequential data management process. Just as air traffic control systems manage a complex network of aeroplanes, routes and airports in real time, organisations have to control a complex network of interdependent, multi-tiered data sources that inform diverse risk reporting needs and drive operational actions. In both situations, trust in the data and timing are critical. Organisations can no longer afford to delay; it is time to embrace a fundamentally different data management paradigm.

Dynamic Data

The data management challenge facing financial institutions is unprecedented. It is not just about the much-discussed regulatory demands that require organisations to deliver more frequent reporting, but also about the need to respond at short notice to requests for different sets of risk exposures. It is equally about competitive position and the ability to meet retail and corporate client expectations of a radical change in day-to-day interaction, not least online, real-time access to all products and services. These two business demands are far from mutually exclusive: both call for rapid data aggregation and a clear view of data interdependencies, while the in-depth scrutiny of every stress test by the mainstream media shows how closely regulatory performance and an institution’s reputation are now linked.

As a result, the traditional, highly sequential data management model simply no longer applies. In an environment that demands the continual production of risk and stress data, organisations require a far more dynamic environment – one that supports both more frequent data provision, intraday or even instantaneous, and requests for information throughout the day from business users with diverse requirements in both the back and mid office.

Rather than the process-driven, reactive data management approach that has dominated the industry to date, financial institutions need to manage a proactive, interwoven network of data management demands – something akin to air traffic controllers tracking multiple flight paths, a multitude of airborne aircraft and their destination airports simultaneously. Today’s data management process is complex, multi-tiered, multi-dimensional and multi-source, and can be treated as nothing else.

Data-Driven Organisation

Consider the aeroplanes as the dynamic sources of information, such as a 4pm market snap, new instruments or the day’s bond redemption notices; the users as the business processes and reporting needs of the organisation, including finance reports, risk management, corporate actions and settlements; and the airports as the information destinations. The challenge for the air traffic operator is to ensure every plane reaches its correct destination on time; the challenge for the financial institution is to ensure high-quality datasets arrive where they need to be, on time, to drive those critical business processes.
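
To make the analogy concrete, the short Python sketch below models the routing question with invented source and process names – a minimal illustration of the idea, not a description of any particular product.

```python
# A minimal illustration of the routing problem, with invented names:
# dynamic sources ("planes") feed the business processes ("airports")
# that depend on them.
deliveries = {
    "4pm market snap": ["finance reports", "risk management"],
    "new instruments": ["settlements", "risk management"],
    "bond redemption notices": ["corporate actions"],
}

# Invert the routing to see, per destination, which arrivals it is
# waiting on - the question a controller (or a data manager) must answer.
waiting_on = {}
for source, destinations in deliveries.items():
    for dest in destinations:
        waiting_on.setdefault(dest, []).append(source)

for dest, sources in sorted(waiting_on.items()):
    print(f"{dest}: depends on {', '.join(sources)}")
```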

Making the transition from sequential to complex, multi-layered and multi-dimensional data management is a significant step – and one that no financial institution has yet achieved in full. However, some positive changes are occurring. For example, many data suppliers have already adapted their products to allow more interactive delivery. Rather than just shipping a daily file, these products now allow intraday and on-demand delivery based on specific request lists.
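
From the consumer’s side, request-list-driven delivery can look something like the hedged sketch below; the endpoint, parameters and response format are illustrative assumptions, not any real vendor’s API.

```python
import requests  # third-party HTTP client: pip install requests

# The endpoint, parameters and response shape below are illustrative
# assumptions for the example; each supplier defines its own interface.
VENDOR_URL = "https://vendor.example.com/api/prices"

def fetch_on_demand(request_list, fields):
    """Pull intraday data for a specific list of instruments and fields."""
    response = requests.get(
        VENDOR_URL,
        params={"ids": ",".join(request_list), "fields": ",".join(fields)},
        timeout=30,
    )
    response.raise_for_status()
    return response.json()

# Instead of waiting for tomorrow's daily file, request exactly what is
# needed, when it is needed.
snapshot = fetch_on_demand(["XS0123456789", "US912828XZ99"], ["bid", "ask", "last"])
```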

Furthermore, in addition to supporting a more frequent delivery model, these products are far more open, with vendors providing interactive access to enterprise products. This frees institutions to move beyond the restrictions of standard data sources and adopt a range of solutions that fit specific needs, from instruments to data fields.

The biggest constraint on this evolution from sequential data management to the controlled interdependency of the new model is the information architecture underpinning financial institutions. Today, most banks’ IT systems and data management infrastructures are ‘frozen in time’, tailored specifically to the deep-rooted sequential model – organisations have simply not made the fundamental move to the more flexible, intraday and ad hoc data delivery approach that is now essential. It is this infrastructure change that will underpin the evolution towards a truly effective, data-driven organisation.

In Control

The value of this shift in approach is significant. Clearly it enables institutions to respond quickly and efficiently – with minimal expensive manual intervention – to the specific requirements of regulation. Furthermore, it provides a platform for far more ‘out of the box’ risk data management – for example, using time series data management to assess different stress test scenarios side by side and exploiting the interdependencies between them to chart ripple effects, clarifying the risk position at every level.
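
As a rough illustration of that side-by-side scenario analysis, the sketch below uses pandas time series; the prices, dates and shock sizes are invented for the example.

```python
import pandas as pd

# Illustrative base series and two hypothetical shock scenarios; the
# dates, prices and markdowns are invented for the example.
dates = pd.date_range("2024-01-01", periods=5, freq="B")
base = pd.Series([100.0, 101.5, 99.8, 102.2, 103.0], index=dates)

scenarios = pd.DataFrame({
    "base": base,
    "rates_up_200bp": base * 0.94,  # assumed 6% markdown under a rate shock
    "fx_shock": base * 0.97,        # assumed 3% markdown under an FX shock
})

# Side-by-side view of the same series under each scenario, with the
# ripple effect expressed as loss relative to the base case.
losses = scenarios.drop(columns="base").sub(scenarios["base"], axis=0)
print(pd.concat([scenarios, losses.add_suffix("_loss")], axis=1).round(2))
```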

Organisations will also have the chance to exploit innovative dashboard controls that reflect KPIs across the data supply chain and track the performance of activities driven by both internal and external data sources. Essentially, this approach will create a real-time ‘flight board’ of all the deliverables, destinations and statuses, providing a single-screen overview of the daily information cycle, including data dependencies.
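
A minimal sketch of such a ‘flight board’, with invented deliverables and statuses, might look like this:

```python
from dataclasses import dataclass

# An illustrative 'flight board': each row is a deliverable with its
# destination and live status. Names and statuses are invented.
@dataclass
class BoardEntry:
    deliverable: str
    destination: str
    status: str  # "landed" (delivered), "en route" (in progress), "delayed"

board = [
    BoardEntry("EOD equity prices", "finance reports", "landed"),
    BoardEntry("curve snapshots", "risk management", "en route"),
    BoardEntry("bond redemptions", "corporate actions", "delayed"),
]

# Single-screen overview of the daily cycle, plus a simple on-time KPI.
for entry in board:
    print(f"{entry.deliverable:20} -> {entry.destination:18} [{entry.status}]")

on_time = sum(e.status != "delayed" for e in board) / len(board)
print(f"on-time rate: {on_time:.0%}")
```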

Armed with this insight, an organisation can expose richer quality metrics – for example, a view by instrument category and by region that can be tied directly to both Service Level Agreements and KPIs – and make the essential evolution from ‘model first, data second’ to the far more trusted ‘data first’ risk management approach now required.
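
For example, a hedged sketch of completeness metrics by instrument category and region, checked against an assumed SLA threshold (all figures invented):

```python
import pandas as pd

# Invented completeness figures by instrument category and region,
# checked against an assumed SLA threshold of 98% populated fields.
quality = pd.DataFrame({
    "category": ["equity", "equity", "bond", "bond"],
    "region": ["EMEA", "APAC", "EMEA", "APAC"],
    "fields_expected": [1000, 800, 1200, 900],
    "fields_populated": [995, 760, 1190, 780],
})

SLA_COMPLETENESS = 0.98  # assumed contractual threshold

summary = quality.groupby(["category", "region"]).sum()
summary["completeness"] = summary["fields_populated"] / summary["fields_expected"]
summary["meets_sla"] = summary["completeness"] >= SLA_COMPLETENESS
print(summary.round(3))
```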

Conclusion

Creating a better balance between data governance, the definition of terms and the model itself is an essential step – and one that all institutions will have to achieve by 2020 at the latest. Organisations must move swiftly to create a new control framework of governance and KPIs to track the effective delivery of data throughout the supply chain, and to exploit more ‘out of the box’ risk management tools to respond rapidly to new regulatory demands.

The potential benefits go far beyond regulatory compliance. For early adopters, the inherent flexibility and the ability to slice and dice complex and diverse data sets on demand, in real time, will enable institutions to live up to the spirit of the new regulatory environment – a factor that will be increasingly crucial in building corporate reputation. Factor in the ability to swiftly add new data sources, create new reports and introduce new products and services without requiring any infrastructure change, and organisations will be on the path to the agility required to strengthen their competitive position in a challenging marketplace.
