Financial institutions are investing in modern technology architectures to bolster the flexibility and scalability of their data management processes as the number of use cases for that information and the volume of data they ingest expand.
They are also seizing on the latest architectures to strengthen data governance in response to the growing complexity of their data estate.
These findings, from the latest A-Team Group Data Management Insight webinar, highlight the value that financial institutions see in modern data architectures, in which fragmented systems and legacy software are eschewed in favour of automated data pipelines that not only bring efficiencies but also enable firms to wring more value from their data.
In polls held during the webinar, attendees, who included executives and managers from all facets of the financial data world, said by an overwhelming majority that they had embarked on a programme of architecture modernisation. Just a quarter said they were still considering such a move. None said they were not in the market for modernisation.
The polls were held amid deep discussion of the subject by guest expert speakers: Jez Davies, Global Head of Data Products at Northern Trust Asset Servicing; Robert Mueller, Director Senior Group Manager Technology Product Owner at BNY; and Neil Vernon, Chief Product Officer at Gresham Technologies.
Speakers were in agreement with the poll findings. From their wide-ranging discussion, here are three key observations.
Data Isn’t the Only Driver…
The drivers behind implementing a modern data architecture are varied and not confined to the need to refresh institutions’ technology or data estates. Regulatory compliance has been a key reason for adopting new architectures because the speed at which regulations have been introduced or changed in recent years calls for an agile and flexible data setup, the webinar heard.
Frameworks such as BCBS 239, Basel III and T+1 settlement rules require the sort of nimbleness that is impossible to achieve in fragmented legacy architectures, which can expose organisations to compliance breaches and penalties.
…But It Is the Biggest One
Poor-quality, incomplete and unstructured data have all contributed in some way to a multitude of operational and regulatory failures, which are only likely to multiply as the pressure placed on firms’ systems grows amid a surge in data generation and integration.
Shortfalls in data will also render AI processes potentially dangerous if the applications into which AI is built are fed incorrect information, the webinar heard.
Core Principles and Practical Approaches
The principles that differentiate streamlined, modern data architectures stand in contrast with the legacy systems at the heart of problems such as the inability to identify data owners, a responsibility that should fundamentally sit within the business rather than IT. A modern framework must feature clear business ownership, quality built in from the outset, controls across the entire data lifecycle, automated exception management for rapid problem resolution, and robust governance. Crucially, data should be considered first, not last, during any change initiative.
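As a rough illustration of what “quality built in from the outset” and automated exception management might look like in practice, the hypothetical sketch below validates records on ingestion and routes failures to an exception queue for rapid triage, rather than letting bad data propagate downstream. All names, fields and validation rules here are illustrative assumptions, not drawn from any speaker’s implementation.

```python
# Illustrative sketch only: a pipeline stage that validates records as they
# enter and routes failures to an exception queue for rapid resolution.
# All record fields and rules below are hypothetical.
from dataclasses import dataclass, field


@dataclass
class ExceptionQueue:
    """Collects failed records with reasons, for triage by business data owners."""
    items: list = field(default_factory=list)

    def raise_exception(self, record: dict, reason: str) -> None:
        self.items.append({"record": record, "reason": reason})


def validate(record: dict) -> list[str]:
    """Return a list of quality failures; an empty list means the record passes."""
    errors = []
    if not record.get("trade_id"):
        errors.append("missing trade_id")
    if record.get("notional", 0) <= 0:
        errors.append("non-positive notional")
    if record.get("currency") not in {"USD", "EUR", "GBP"}:
        errors.append("unrecognised currency")
    return errors


def ingest(records: list[dict], queue: ExceptionQueue) -> list[dict]:
    """Admit only records that pass validation; everything else is queued."""
    clean = []
    for record in records:
        errors = validate(record)
        if errors:
            queue.raise_exception(record, "; ".join(errors))
        else:
            clean.append(record)
    return clean


queue = ExceptionQueue()
good = ingest(
    [
        {"trade_id": "T1", "notional": 1_000_000, "currency": "USD"},
        {"trade_id": "", "notional": -5, "currency": "JPY"},
    ],
    queue,
)
print(len(good), "clean record(s);", len(queue.items), "exception(s) queued")
```

The design choice worth noting is that quality checks run at the point of entry and failures are captured with reasons attached, so problems surface immediately to an owner instead of being discovered downstream.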
In practice, deployment approaches vary: the webinar heard that modernisation efforts often appear “piecemeal” and “siloed” because implementation proceeds unevenly across different areas within an institution.
A recurring theme was the concept of “data as a product”, which involves elevating data beyond raw values to include comprehensive documentation, user guides, and contextual information, making it understandable and consumable.
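One hypothetical way to picture a data product is as a dataset bundled with its business ownership, documentation and quality context. The sketch below is purely illustrative, assuming a simple in-house descriptor rather than any specific framework discussed in the webinar; every field name and value is an assumption.

```python
# Illustrative only: a data product bundles the data itself with business
# ownership, documentation and quality context, so consumers can understand
# and trust it. All field names and values here are hypothetical.
from dataclasses import dataclass


@dataclass
class DataProduct:
    name: str
    owner: str            # a business owner, not an IT contact
    description: str      # what the data means and how it was produced
    user_guide_url: str   # documentation for consumers
    schema: dict          # column names mapped to types
    quality_checks: list  # named rules applied before publication


positions = DataProduct(
    name="eod_positions",
    owner="Head of Portfolio Operations",
    description="End-of-day positions aggregated across custodians.",
    user_guide_url="https://example.internal/docs/eod_positions",
    schema={"account_id": "str", "isin": "str", "quantity": "float"},
    quality_checks=["no_null_account_id", "quantity_reconciled_to_custodian"],
)
print(positions.name, "owned by", positions.owner)
```

The point of the pattern is that the metadata travels with the data: a consumer receives not just raw values but the ownership, documentation and quality guarantees that make those values usable.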
Path Forward
Implementing modern architectures alongside legacy systems is widely acknowledged to be difficult, but as financial institutions continue to rely on data-driven processes and ingest ever-larger volumes of information, new architectures will become a necessity, the webinar concluded.