Scale and Governance Top Drivers of Modern Data Architecture Plans: Webinar Review

Financial institutions are investing in modern technology architectures to bolster the flexibility and scalability of their data management processes as the number of use cases for that information, and the volume of data they ingest, expands.

They are also seizing on the latest architectures to strengthen data governance in response to the growing complexity of their data estate.

These findings, from the latest A-Team Group Data Management Insight webinar, highlight the value that financial institutions see in modern data architectures, in which fragmented systems and legacy software are eschewed in favour of automated data pipelines that not only bring efficiencies but also enable firms to wring more value from their data.

In polls held during the webinar, attendees, who included executives and managers from all facets of the financial data world, said by an overriding majority that they had embarked on a programme of architecture modernisation. Just a quarter said they were still considering such a move, and none said they were not in the market for modernisation.

The polls were held amid a deep discussion of the subject by the guest expert speakers: Jez Davies, Global Head of Data Products at Northern Trust Asset Servicing; Robert Mueller, Director Senior Group Manager Technology Product Owner at BNY; and Neil Vernon, Chief Product Officer at Gresham Technologies.

Speakers were in agreement with the poll findings. From their wide-ranging discussion, here are three key observations.

Data Isn’t the Only Driver…

The drivers for implementing a modern data architecture are varied and not confined to the need to refresh institutions’ technology or data estates. Regulatory compliance was cited as a key reason for adopting new architectures because the speed at which regulations have been introduced or changed in recent years calls for an agile and flexible data setup, the webinar heard.

Frameworks such as BCBS 239, Basel III and T+1 settlement rules require a nimbleness that is impossible to achieve in fragmented legacy architectures, which can expose organisations to compliance breaches and penalties.

…But it is the Biggest One

Poor-quality, incomplete and unstructured data have all contributed in some way to a multitude of operational and regulatory failures, which are only likely to multiply as the pressures placed on firms’ systems grow amid a surge in data generation and integration.

Shortfalls in data will also make AI potentially dangerous if the applications into which it is built are fed incorrect information, the webinar heard.

Core Principles and Practical Approaches

The differentiating principles of streamlined, modern data architectures stand in contrast with common legacy systems, which lie at the heart of problems such as the inability to identify data owners, who should reside within the business rather than in IT. A modern framework must feature clear business ownership, quality built in from the outset, controls across the entire data lifecycle, automated exception management for rapid problem resolution, and robust governance. Crucially, data should be considered first, not last, during any change initiative.
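To illustrate one of those principles, the sketch below shows how automated exception management might look in practice: records that fail quality checks are diverted to an exception queue for the business data owner to resolve rather than flowing silently downstream. It is a minimal, hypothetical Python example; the record fields and check functions are assumptions for illustration, not details from the webinar.

```python
from dataclasses import dataclass, field
from typing import Optional

# Hypothetical sketch only: field names and checks are illustrative assumptions.

@dataclass
class ExceptionQueue:
    """Collects failed records so the business data owner can resolve them."""
    items: list = field(default_factory=list)

    def add(self, record: dict, reason: str) -> None:
        # A real platform would alert the named data owner here.
        self.items.append({"record": record, "reason": reason})

def run_checks(records: list, checks: list, queue: ExceptionQueue) -> list:
    """Pass clean records downstream; divert failures for rapid resolution."""
    clean = []
    for record in records:
        failures = [msg for check in checks if (msg := check(record))]
        if failures:
            queue.add(record, "; ".join(failures))
        else:
            clean.append(record)
    return clean

# Example checks a firm might agree with its business data owners.
def has_isin(record: dict) -> Optional[str]:
    return None if record.get("isin") else "missing ISIN"

def positive_notional(record: dict) -> Optional[str]:
    return None if record.get("notional", 0) > 0 else "non-positive notional"

queue = ExceptionQueue()
trades = [{"isin": "US0378331005", "notional": 1_000_000}, {"notional": -5}]
clean = run_checks(trades, [has_isin, positive_notional], queue)
print(f"{len(clean)} clean record(s), {len(queue.items)} exception(s) raised")
```

The point of the sketch is simply that quality failures surface immediately as actionable exceptions, rather than being discovered downstream in reports or regulatory filings.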

In practice, deployment approaches vary: the webinar heard that modernisation efforts often appear “piecemeal” and “siloed” because different areas within an institution implement them independently of one another.

A recurring theme was the concept of “data as a product”, which involves elevating data beyond raw values to include comprehensive documentation, user guides, and contextual information, making it understandable and consumable.
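As a rough illustration of how that might be captured, the hypothetical sketch below packages a dataset together with its owner, schema descriptions, refresh schedule and usage notes. The field names are assumptions made for the example, not a standard schema or anything prescribed during the webinar.

```python
from dataclasses import dataclass, field

# Hypothetical 'data as a product' wrapper: the dataset travels with the
# documentation and context a consumer needs to use it confidently.

@dataclass
class DataProduct:
    name: str
    owner: str                          # a business owner, not an IT contact
    description: str                    # what the data represents
    schema: dict                        # column name -> plain-language meaning
    refresh_schedule: str               # when consumers can expect updates
    quality_checks: list = field(default_factory=list)
    usage_notes: str = ""               # the 'user guide' for consumers

eod_positions = DataProduct(
    name="eod_positions",
    owner="Head of Portfolio Services",
    description="End-of-day positions across all custody accounts.",
    schema={
        "account_id": "Custody account identifier",
        "isin": "Instrument identifier",
        "quantity": "Settled quantity at market close",
    },
    refresh_schedule="Daily, 18:00 UTC",
    quality_checks=["no missing ISINs", "quantities reconcile to custodian"],
    usage_notes="Positions are settled, not traded; see reconciliation notes.",
)

print(f"{eod_positions.name} is owned by {eod_positions.owner}")
```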

Path Forward

Implementing modern architectures within legacy systems is generally acknowledged as difficult, but as financial institutions continue to rely on data-driven processes and ingest ever-larger volumes of information, new architectures will become a necessity, the webinar concluded.
