
Mainframes’ Utility in Deriving Value from Data Endures: Webinar Review

Despite advances in modern data architecture and hosting strategies, a majority of financial firms still house more than half of their data on mainframes, presenting them with novel data management pressures, an A-Team Group webinar discussed.

Capital market participants and data professionals who viewed the event – entitled Are you making the most of the business-critical structured data stored in your mainframes? – said in polls held during the discussion that they use such architectures for reporting and artificial intelligence initiatives, strategic business decisions and real-time analytics.

The enduring utility of mainframes, and the opportunities and challenges they pose, occupied the thoughts of the webinar panellists: Duncan Cooper, Chief Data Officer at Northern Trust; Sunny Jaisinghani, Head of Data-Global Custody Product at State Street Bank & Trust Company; and Michael Curry, President of Data Modernisation at Rocket Software, the event’s sponsor.

They noted that in the US, mainframes remain integral to financial operations, with 90 per cent of American banks still relying on them for the majority of transactional processing, core banking, payment processing and customer account management, alongside a surprising volume of unstructured data like customer statements.

Still Got It

Mainframes excel at vertical scaling for transactional applications and, due to their long history, have become authoritative data sources. And while some institutions are attempting to migrate their data to other architectures, they are finding the task formidable and its benefits hard to articulate clearly, given that mainframes are ubiquitous and their security features robust, the panellists said.

Opportunities with mainframe data are commonly missed, the webinar heard, partly because the systems are regarded as “the data equivalent of the song ‘Hotel California’” – it’s easy to input data but difficult to extract it.

But the panel agreed that mainframes offer real-time analytics from transaction flows for fraud detection, risk management and enhanced customer service. And with the advent of Generative AI (GenAI), there now exist novel pathways to unlock value from unstructured mainframe data, such as summarising documents to improve customer interactions.
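To make the real-time angle concrete, here is a minimal, purely illustrative Python sketch of the kind of check a transaction flow could feed for fraud detection; the window size, threshold and record layout are assumptions rather than anything described on the webinar.

```python
# Minimal sketch of a real-time check over a transaction flow: flag any
# payment far above the account's recent average. Thresholds, field names
# and the data itself are hypothetical.

from collections import defaultdict, deque

WINDOW = 5          # number of recent transactions to average over
MULTIPLE = 10.0     # flag anything this many times the recent average

history: dict[str, deque] = defaultdict(lambda: deque(maxlen=WINDOW))

def check_transaction(account_id: str, amount: float) -> bool:
    """Return True if the transaction looks anomalous for this account."""
    recent = history[account_id]
    suspicious = bool(recent) and amount > MULTIPLE * (sum(recent) / len(recent))
    recent.append(amount)
    return suspicious

stream = [("ACC-1", 120.0), ("ACC-1", 95.0), ("ACC-1", 110.0), ("ACC-1", 4_800.0)]
for account, amount in stream:
    if check_transaction(account, amount):
        print(f"review {account}: {amount} is unusually large")
```

In practice such logic would sit alongside the mainframe’s live transaction stream rather than a toy loop, but the shape of the check is the same.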

AI Aid

AI can also be used to overcome the difficulties that mainframes present, including integration challenges such as metadata access and seamless connection with cloud data, cultural disparities between mainframe and cloud teams, and a looming skills gap as the mainframe workforce ages.

For instance, AI can abstract mainframe complexity, fostering data fluidity across platforms and supporting an incremental “strangler approach” to modernisation rather than wholesale replacement. Decoupling the user experience from the mainframe’s backend also allows modern interfaces to leverage its inherent power.
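As a rough illustration of that decoupling – a sketch under assumed names, not how any panellist’s firm has built it – a thin facade can route each request to either the mainframe or a modern service, letting capabilities move over one at a time:

```python
# "Strangler"-style facade sketch: requests go to a modern service where one
# exists and fall back to the mainframe adapter otherwise. All class and
# function names here are hypothetical.

class MainframeAdapter:
    """Thin wrapper around existing mainframe transactions (hypothetical)."""
    def get_account_balance(self, account_id: str) -> float:
        # A real implementation would call the mainframe via an API or
        # messaging bridge; stubbed here for illustration.
        return 1_000.00

class CloudBalanceService:
    """Modern replacement for one narrow capability (hypothetical)."""
    def get_account_balance(self, account_id: str) -> float:
        return 1_000.00

class BalanceFacade:
    """Decouples the user experience from whichever backend serves the data."""
    def __init__(self, migrated_accounts: set[str]):
        self.legacy = MainframeAdapter()
        self.modern = CloudBalanceService()
        self.migrated_accounts = migrated_accounts  # accounts already cut over

    def get_account_balance(self, account_id: str) -> float:
        backend = self.modern if account_id in self.migrated_accounts else self.legacy
        return backend.get_account_balance(account_id)

facade = BalanceFacade(migrated_accounts={"ACC-123"})
print(facade.get_account_balance("ACC-123"))   # served by the modern service
print(facade.get_account_balance("ACC-999"))   # still served by the mainframe
```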

This is all critical because AI will impact “literally everything” in data, the webinar heard. It is expected to accelerate data integration through automated linkages and mappings and, over the longer term, traditional data constructs may give way to an ontological view and knowledge graphs, fundamentally altering data management, the panel discussed.
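For a flavour of what an ontological view can mean in practice – the panel did not prescribe an implementation, and the entities below are invented – a knowledge graph can start as little more than subject–predicate–object triples linking datasets, systems and owners:

```python
# Tiny sketch of a knowledge-graph view of data assets as
# subject-predicate-object triples. All entities here are invented.

triples = {
    ("custody_positions", "stored_on", "mainframe"),
    ("custody_positions", "feeds", "client_reporting"),
    ("client_reporting", "owned_by", "Custody Data Product team"),
    ("custody_positions", "described_by", "positions_data_contract"),
}

def related(entity: str) -> set:
    """Return every triple in which the entity appears as subject or object."""
    return {t for t in triples if entity in (t[0], t[2])}

for triple in sorted(related("custody_positions")):
    print(triple)
```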

AI is already aiding traditional data integration efforts, as organisations recognise the prerequisite of high-quality, integrated data for extracting AI value. GenAI, in particular, is proving invaluable in extracting structured data from unstructured documents.
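A minimal sketch of that extraction pattern, with a placeholder standing in for whichever model a firm actually uses (the prompt, field names and call_llm helper are all hypothetical):

```python
import json

def call_llm(prompt: str) -> str:
    """Placeholder for a GenAI completion call (hypothetical); a real
    implementation would call the firm's chosen model provider."""
    # Stubbed response so the sketch runs end to end.
    return json.dumps({
        "customer_name": "J. Smith",
        "statement_date": "2025-06-30",
        "closing_balance": 10432.17,
    })

def extract_statement_fields(statement_text: str) -> dict:
    """Ask the model for named fields as JSON, then parse and sanity-check."""
    prompt = (
        "Extract customer_name, statement_date and closing_balance from the "
        "following customer statement and respond with JSON only:\n" + statement_text
    )
    fields = json.loads(call_llm(prompt))
    # Basic guardrail: fail loudly if an expected field is missing.
    for key in ("customer_name", "statement_date", "closing_balance"):
        if key not in fields:
            raise ValueError(f"model response missing field: {key}")
    return fields

print(extract_statement_fields("Statement for J. Smith ... closing balance 10,432.17"))
```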

Legacy Systems

AI tools can analyse legacy mainframe code, translating it into human-readable functional specifications – a crucial capability given the retiring generation of programmers. Similarly, AI that can observe mainframe inputs and outputs to explain underlying business processes holds immense promise.
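In the same spirit, and again with a stubbed model call and an invented code fragment, asking a general-purpose model to explain a legacy rule might look like this:

```python
# Sketch of using a general-purpose LLM to turn legacy code into a
# human-readable functional specification. The call_llm helper is a
# hypothetical stand-in for any model API; the COBOL fragment is illustrative.

COBOL_SNIPPET = """
       IF ACCT-BALANCE < MIN-BALANCE
           MOVE 'Y' TO FEE-FLAG
           ADD MONTHLY-FEE TO ACCT-FEES
       END-IF.
"""

def call_llm(prompt: str) -> str:
    # Stubbed so the example runs without a model; a real call would go here.
    return ("If the account balance is below the minimum balance, flag the "
            "account for a fee and add the monthly fee to accumulated fees.")

def describe_business_rule(code: str) -> str:
    prompt = ("Explain the business rule implemented by this COBOL fragment "
              "in one or two plain-English sentences:\n" + code)
    return call_llm(prompt)

print(describe_business_rule(COBOL_SNIPPET))
```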

Ensuring complete, accurate and accessible enterprise data necessitates a proactive approach. Data quality measures must be implemented “closest to the data itself”, embedded into the processes generating or publishing it. Remediating data quality downstream is demonstrably ineffective and a proactive stance ensures that data users can implicitly trust the information, which is fundamental to its utility.
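One way to picture quality checks “closest to the data” – a hypothetical sketch, with invented field names and rules – is a gate built into the publishing step itself, so that records failing validation never reach downstream consumers:

```python
# Sketch of embedding data quality checks in the publishing step rather than
# remediating downstream. Record layout and rules are hypothetical.

from datetime import date

REQUIRED_FIELDS = ("account_id", "trade_date", "notional")

def validate(record: dict) -> list:
    """Return a list of quality issues; an empty list means the record passes."""
    issues = [f"missing {f}" for f in REQUIRED_FIELDS if f not in record]
    if "notional" in record and record["notional"] <= 0:
        issues.append("notional must be positive")
    if "trade_date" in record and record["trade_date"] > date.today():
        issues.append("trade_date cannot be in the future")
    return issues

def publish(record: dict, sink: list) -> None:
    """Quality gate at the point of publication: reject bad records up front."""
    issues = validate(record)
    if issues:
        raise ValueError(f"record rejected: {issues}")
    sink.append(record)

published = []
publish({"account_id": "ACC-1", "trade_date": date(2024, 1, 15), "notional": 5_000_000.0}, published)
print(published)
```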

To draw the most benefit from their data, panellists argued for a holistic view of the entire data estate, including mainframe and unstructured data, integrated through comprehensive metadata. This will foster a clear understanding of how information fits together, they agreed. Also, a conscious and respectful approach to data management is vital, ensuring accuracy and defining “data contracts” that clarify data’s purpose.
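A data contract need not be elaborate; as a hypothetical example (the dataset, owner and schema below are invented), it can be a small, versionable declaration of what a dataset is for, who owns it and what shape its records must take:

```python
# Sketch of a lightweight "data contract" capturing purpose, ownership and
# the expected shape of a published dataset. All names are invented.

from dataclasses import dataclass

@dataclass(frozen=True)
class DataContract:
    dataset: str
    purpose: str             # why the data exists and how it may be used
    owner: str               # the team accountable for its accuracy
    schema: dict             # expected field names mapped to Python types

    def conforms(self, record: dict) -> bool:
        """Check that a record carries every contracted field with the right type."""
        return all(isinstance(record.get(name), typ) for name, typ in self.schema.items())

positions_contract = DataContract(
    dataset="custody_positions",
    purpose="End-of-day positions for client reporting and risk",
    owner="Custody Data Product team",
    schema={"account_id": str, "isin": str, "quantity": float},
)

print(positions_contract.conforms(
    {"account_id": "ACC-1", "isin": "US0378331005", "quantity": 100.0}
))
```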

This demands an organisational transformation, not merely a technological one, involving technology, product and operations teams alike.

Perhaps most critically, the panellists said, success ultimately hinges on people. Even with optimal technology and data quality, having “fanatical advocates for the use of data” within the organisation is paramount.
