
Scale and Governance Top Drivers of Modern Data Architecture Plans: Webinar Review


Financial institutions are investing in modern technology architectures to bolster the flexibility and scalability of their data management processes as the number of use cases for that information, and the volume of data they ingest, expand.

They are also seizing on the latest architectures to strengthen data governance in response to the growing complexity of their data estate.

These findings, from the latest A-Team Group Data Management Insight webinar, highlight the value that financial institutions see in modern data architectures, in which fragmented systems and legacy software are eschewed in favour of automated data pipelines that not only bring efficiencies but also enable firms to wring more value from their data.

In polls held during the webinar, attendees, who included executives and managers from all facets of the financial data world, said by an overriding majority that they had embarked on a programme of architecture modernisation. Just a quarter said they were still considering such a move. None said they were not in the market for modernisation.

The polls were held amid deep discussion of the subject by the guest expert speakers: Jez Davies, Global Head of Data Products at Northern Trust Asset Servicing; Robert Mueller, Director Senior Group Manager Technology Product Owner at BNY; and Neil Vernon, Chief Product Officer at Gresham Technologies.

Speakers were in agreement with the poll findings. From their wide-ranging discussion, here are three key observations.

Data Isn’t the Only Driver…

The drivers for implementing a modern data architecture are varied and not confined to the need to refresh institutions’ technology or data estates. Regulatory compliance formed a key reason for adopting new architectures because the speed at which regulations have been introduced or changed in recent years has called for an agile and flexible data setup, the webinar heard.

Frameworks such as BCBS 239, Basel III and T+1 settlement rules require the sort of nimbleness that is impossible to achieve with fragmented legacy architectures, which can expose organisations to compliance breaches and penalties.

…But it is the Biggest One

Poor-quality, incomplete and unstructured data have all been responsible in some way for a multitude of operational and regulatory failures, and these are only likely to multiply as the pressures placed on firms’ systems grow amid a surge in data generation and integration.

Shortfalls in data will also render AI processes potentially dangerous if the applications into which AI is built are fed incorrect information, the webinar heard.

Core Principles and Practical Approaches

The differentiating principles of streamlined, modern data architectures stand in contrast to common legacy systems, which are at the heart of problems such as the inability to find data owners, ownership that should fundamentally reside within the business rather than IT. A modern framework must feature clear business ownership, quality built in from the outset, controls across the entire data lifecycle, automated exception management for rapid problem resolution, and robust governance. Crucially, data should be considered first, not last, during any change initiatives.

In practice, deployment approaches vary, with the webinar hearing that modernisation efforts often appear “piecemeal” and “siloed” because implementation tends to be executed area by area within an institution.

A recurring theme was the concept of “data as a product”, which involves elevating data beyond raw values to include comprehensive documentation, user guides, and contextual information, making it understandable and consumable.

Path Forward

Implementing modern architectures within legacy systems is generally acknowledged as difficult, but as financial institutions continue to rely on data-driven processes and ingest ever-larger volumes of information, new architectures will become a necessity, the webinar concluded.

