The knowledge platform for the financial technology industry

A-Team Insight Blogs

Scale and Governance Top Drivers of Modern Data Architecture Plans: Webinar Review


Financial institutions are investing in modern technology architectures to bolster the flexibility and scalability of their data management processes as the number of use cases for that information, and the volume of data they ingest, expands.

They are also seizing on the latest architectures to strengthen data governance in response to the growing complexity of their data estate.

These findings, from the latest A-Team Group Data Management Insight webinar, highlight the value that financial institutions see in modern data architectures, in which fragmented systems and legacy software are eschewed in favour of automated data pipelines that not only bring efficiencies but also enable firms to wring more value from their data.

In polls held during the webinar, attendees, who included executives and managers from all facets of the financial data world, said by an overriding majority that they had embarked on a programme of architecture modernisation. Just a quarter said they were still considering such a move, and none said they were not in the market for modernisation.

The polls were held amid deep discussion of the subject by the guest expert speakers: Jez Davies, Global Head of Data Products at Northern Trust Asset Servicing; Robert Mueller, Director, Senior Group Manager, Technology Product Owner at BNY; and Neil Vernon, Chief Product Officer at Gresham Technologies.

Speakers were in agreement with the poll findings. From their wide-ranging discussion, here are three key observations.

Data Isn’t the Only Driver…

The reasons to implement a modern data architecture are varied and not confined to the need to refresh institutions’ technology or data estates. Regulatory compliance was cited as a key reason for adopting new architectures: the speed at which regulations have been introduced or changed in recent years calls for an agile and flexible data setup, the webinar heard.

Frameworks such as BCBS 239, Basel III and T+1 settlement rules require a nimbleness that is impossible to achieve in fragmented legacy architectures, which can expose organisations to compliance breaches and penalties.

…But it is the Biggest One

Poor-quality, incomplete and unstructured data were all in some way responsible for a multitude of operational and regulatory failures, which are only likely to multiply as the pressures on firms’ systems grow amid a surge in data generation and integration.

Shortfalls in data will also render AI processes potentially dangerous if the applications into which AI is built are fed incorrect information, the webinar heard.

Core Principles and Practical Approaches

The differentiating principles of streamlined, modern data architectures stand in contrast with common legacy systems that are at the heart of problems like the inability to find data owners, who should fundamentally reside within the business rather than IT. A modern framework must feature clear business ownership, quality built in from the outset, controls across the entire data lifecycle, automated exception management for rapid problem resolution, and robust governance. Crucially, data should be considered first, not last, during any change initiatives.
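The principles above can be illustrated with a minimal sketch. The class and rule names here are hypothetical, not drawn from the webinar: each record is validated against quality rules on ingest, the stage carries a named business owner, and failures are routed to an exception queue for rapid resolution rather than propagating silently downstream.

```python
from dataclasses import dataclass, field

@dataclass
class PipelineStage:
    """Illustrative pipeline stage with quality built in at the point of ingest."""
    owner: str                                       # business (not IT) owner of the data
    checks: list = field(default_factory=list)       # (name, rule) quality checks
    exceptions: list = field(default_factory=list)   # failed records queued for review

    def ingest(self, record: dict) -> bool:
        # Run every quality rule; collect the names of those that fail.
        failed = [name for name, rule in self.checks if not rule(record)]
        if failed:
            # Automated exception management: capture the record and the
            # failed checks so the problem can be resolved quickly.
            self.exceptions.append({"record": record, "failed_checks": failed})
            return False
        return True

stage = PipelineStage(
    owner="Head of Trading Data",
    checks=[
        ("has_isin", lambda r: bool(r.get("isin"))),
        ("positive_qty", lambda r: r.get("quantity", 0) > 0),
    ],
)

stage.ingest({"isin": "US0378331005", "quantity": 100})  # passes all checks
stage.ingest({"quantity": -5})                           # routed to the exception queue
```

The point of the sketch is the shape, not the rules: quality checks sit inside the pipeline from the outset, and every failure is attributed and queued rather than discovered downstream.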

In practice, deployment approaches vary: the webinar heard that modernisation efforts often appear “piecemeal” and “siloed”, with different areas within an institution implementing them independently of one another.

A recurring theme was the concept of “data as a product”, which involves elevating data beyond raw values to include comprehensive documentation, user guides, and contextual information, making it understandable and consumable.
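A rough sketch of the “data as a product” idea, with hypothetical names and fields not taken from the webinar: the raw values are wrapped together with the ownership, documentation and contextual information that make them understandable and consumable, rather than shipped as a bare table.

```python
from dataclasses import dataclass, field

@dataclass
class DataProduct:
    """Illustrative data product: values plus the context that makes them usable."""
    name: str
    owner: str                 # business owner accountable for quality
    description: str           # what the data means and how it should be used
    schema: dict               # column name -> human-readable definition
    records: list = field(default_factory=list)

    def catalogue_entry(self) -> dict:
        """What a consumer sees before touching the data itself."""
        return {
            "name": self.name,
            "owner": self.owner,
            "description": self.description,
            "columns": sorted(self.schema),
        }

eod_prices = DataProduct(
    name="eod_equity_prices",
    owner="Market Data Operations",
    description="End-of-day closing prices, adjusted for corporate actions.",
    schema={"isin": "ISIN of the instrument", "close": "Adjusted close price"},
)
```

The design choice the sketch makes is that documentation, ownership and schema definitions travel with the data as first-class attributes, so a consumer can assess the product from its catalogue entry alone.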

Path Forward

Implementing modern architectures within legacy systems is generally acknowledged as difficult, but as financial institutions continue to rely on data-driven processes and ingest ever-larger volumes of information, new architectures will become a necessity, the webinar concluded.

