
Scale and Governance Top Drivers of Modern Data Architecture Plans: Webinar Review

Financial institutions are investing in modern technology architectures to bolster the flexibility and scalability of their data management processes as the number of use cases for that information, and the volume of data they ingest, expand.

They are also seizing on the latest architectures to strengthen data governance in response to the growing complexity of their data estate.

These findings, from the latest A-Team Group Data Management Insight webinar, highlight the value that financial institutions see in modern data architectures, in which fragmented systems and legacy software are eschewed in favour of automated data pipelines that not only bring efficiencies but also enable firms to wring more value from their data.

In polls held during the webinar, an overriding majority of attendees, who included executives and managers from all facets of the financial data world, said they had embarked on a programme of architecture modernisation. Just a quarter said they were still considering such a move, and none said they were not in the market for modernisation.

The polls were held amid a deep discussion of the subject by the guest expert speakers: Jez Davies, Global Head of Data Products at Northern Trust Asset Servicing; Robert Mueller, Director Senior Group Manager Technology Product Owner at BNY; and Neil Vernon, Chief Product Officer at Gresham Technologies.

The speakers were in agreement with the poll findings. From their wide-ranging discussion, here are three key observations.

Data Isn’t the Only Driver…

The reasons to implement a modern data architecture are varied and not confined to the need to refresh institutions’ technology or data estates. Regulatory compliance is a key driver for adopting new architectures because the speed at which regulations have been introduced or changed in recent years calls for an agile and flexible data setup, the webinar heard.

Frameworks such as BCBS 239, Basel III and T+1 settlement rules require the sort of nimbleness that is impossible to achieve in fragmented legacy architectures, which can expose organisations to compliance breaches and penalties.

…But it is the Biggest One

Poor-quality, incomplete and unstructured data have all been responsible in some way for a multitude of operational and regulatory failures, which are only likely to multiply as the pressures on firms’ systems grow amid a surge in data generation and integration.

Shortfalls in data will also render AI potentially dangerous if the applications into which it is built are fed incorrect information, the webinar heard.

Core Principles and Practical Approaches

The principles of streamlined, modern data architectures stand in contrast with legacy systems, which are at the heart of problems such as the inability to identify data owners, who should fundamentally reside within the business rather than in IT. A modern framework must feature clear business ownership, quality built in from the outset, controls across the entire data lifecycle, automated exception management for rapid problem resolution, and robust governance. Crucially, data should be considered first, not last, during any change initiative.

In practice, deployment approaches vary. The webinar was told that modernisation efforts often appear “piecemeal” and “siloed” because implementation is frequently carried out separately in different areas within an institution.

A recurring theme was the concept of “data as a product”, which involves elevating data beyond raw values to include comprehensive documentation, user guides, and contextual information, making it understandable and consumable.

Path Forward

Implementing modern architectures within legacy systems is generally acknowledged as difficult, but as financial institutions continue to rely on data-driven processes and ingest ever-larger volumes of information, new architectures will become a necessity, the webinar concluded.

