
A-Team Insight Blogs

AIB’s McMorrow Explains Benefits of Teradata Warehouse Implementation and Ongoing Challenges

Allied Irish Bank’s (AIB) enterprise data warehouse project manager Michael McMorrow is a strong proponent of Teradata’s functionally neutral, self-managing approach to data storage. The bank rolled out the vendor’s Teradata Warehouse solution a few years ago and is now focused on keeping pace with the data management changes required by new source system inputs, such as accounting system changes, he explains.

“If you are rolling out a data warehouse solution it needs to be a genuinely neutral model; don’t model the solution too closely to the idiosyncrasies of your bank. Otherwise you will constantly be reacting to new requirements. If the data is consistently managed and robust, a new report for regulatory purposes shouldn’t be daunting,” says McMorrow.

Before its rollout of the Teradata warehouse, AIB collected customer data via an internally developed customer information file solution, which provided a single view of the customer. However, this solution was not robust enough to support end users’ data analytics needs, and developing new functionality was a lengthy process, so the bank opted for the Teradata offering. The bank’s existing customer data was then migrated onto the new solution and augmented with customer history and transaction history data.

McMorrow explains that a semantic layer sits between the warehouse and the user outputs, including analytics, management information system (MIS) and reporting systems, which means end users cannot affect the centrally stored data sets. The source systems that feed data into the warehouse are also responsible for data cleansing, so that the warehouse itself can focus on maintaining a clean golden copy. “The interface model means that AIB can change the source systems but we are able to shield the rest of the system from these changes,” he explains.
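The shielding idea McMorrow describes can be sketched in a few lines. This is a minimal illustration, not AIB’s or Teradata’s actual implementation: the warehouse store, record fields and view function below are all hypothetical names, chosen only to show how a read-only semantic view hands end users a derived copy rather than a reference to the golden copy.

```python
# Hypothetical in-memory "golden copy"; only source-system loaders
# would be allowed to write here in a real deployment.
WAREHOUSE = {
    "CUST001": {"name": "A. Byrne", "balance_cents": 125000},
}

def customer_view(customer_id: str) -> dict:
    """Semantic-layer view (illustrative): returns a derived copy of
    the record, so consumers can never mutate the central data set."""
    record = WAREHOUSE[customer_id]
    return {
        "customer": record["name"],
        # Present the balance in user-friendly units for reporting.
        "balance_eur": record["balance_cents"] / 100,
    }
```

Because the view builds a fresh dictionary, an analytics user who alters the returned object leaves the centrally stored record untouched, which is the property the semantic layer exists to guarantee.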

To this end, the bank is currently reengineering its core accounting systems, and the data warehouse team must rework its internal data storage to accommodate these changes. “The changes mean that fundamental bits of data are changing in structure and it is a real challenge to both map the new data to the old and alter the systems where required,” says McMorrow.
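Mapping a restructured feed back onto an existing warehouse layout is typically handled by an adapter at the interface boundary. The sketch below assumes invented field names on both sides (the source article does not describe AIB’s actual schemas); it only illustrates the kind of new-to-old translation McMorrow refers to.

```python
def map_new_to_old(new_record: dict) -> dict:
    """Illustrative adapter: translate a reengineered accounting feed
    (assumed field names) into the legacy warehouse structure, so
    downstream consumers are shielded from the source-system change."""
    return {
        # The account identifier is assumed to have moved from a flat
        # field into a nested structure in the new feed.
        "acct_id": new_record["account"]["id"],
        # A renamed field in the new feed maps to the old column name.
        "ccy": new_record["currency_code"],
        # The new feed is assumed to carry amounts in minor units.
        "posted_amt": round(new_record["amount_minor"] / 100, 2),
    }
```

Keeping all such translations in one adapter means that when the accounting systems change again, only the mapping is rewritten, not every downstream report.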

At the start of the data management process, McMorrow notes, securing senior management sponsorship was a significant challenge: “We had five changes in CEO during the implementation process.” Although the rollout has now been completed, the data warehousing team must also remain wary of any budget cutting activity that may negatively impact the maintenance of the system. “We are also aware of the problems around a legacy of usage when staff move on. We need to make sure that strong governance and strategy are maintained by retaining specific data stewards for each data unit. These stewards sign assurance forms to ensure a strong process for the ownership of that data,” he elaborates.

Each phase of development also needs to provide tangible rewards in terms of either cost savings or benefits, adds McMorrow. “However, something that may be harder to prove is the inherited benefit of previous implementations,” he says. “For example, our customer data warehousing project meant we were later able to kick off our Basel project and this is hard to quantify directly.”

McMorrow’s philosophy is therefore to treat the data warehouse as a utility, with service provision charged back to the end user. Although the warehouse is responsible for data verification, it is not responsible for data cleansing and is therefore similar to a system of record. “End users have to sponsor any changes that need to be made and if they wish to receive new data sets,” he concludes.
