The knowledge platform for the financial technology industry

A-Team Insight Blogs

BoE and FCA Plan Report on Impact of AI in Financial Services


The Bank of England (BoE) and the Financial Conduct Authority (FCA) have completed a one-year exploration of the deployment of artificial intelligence (AI) by financial institutions and its implications for the financial system, including the potential need for additional regulatory standards. The findings – drawn from four quarterly meetings and a series of workshops with private-sector organisations – will be published in a final report.

The BoE’s then-Governor Mark Carney announced the formation of an AI Public-Private Forum (AIPPF) in 2019, but due to the impact of Covid-19 the first meeting was delayed until October 2020, with the expectation that the initiative would run for 12 months. The forum was jointly hosted by the BoE and the FCA, with members drawn from leading banks, corporates and asset management firms.

The final meeting of the AIPPF was held earlier this month and focused on issues surrounding governance. According to the minutes published by the Bank, members discussed whether there is a role for additional regulatory standards, what those standards might be, and whether there should be a certified auditing regime for AI. Following this final quarterly meeting, the forum will also hold a number of workshops focusing on governance-related topics.

Moderator Varun Paul, Head of the Fintech Hub at the BoE, said that a final report will be published at the conclusion of the AIPPF, and noted that the Bank and the FCA will be considering what future engagement with the wider financial industry could look like in light of the lessons learned through the AIPPF. According to Paul, this includes how to take forward the numerous findings and recommendations that have come out of the forum, which will be included in the final report.

Dave Ramsden, Deputy Governor, Markets and Banking at the BoE, had also noted earlier this year that the AIPPF would produce a final report upon its conclusion, adding that as the Bank and the industry learn more from this soft support around AI, “it may be that harder forms of public infrastructure will be needed, provided either by the Bank, another authority or as a collaboration between different authorities”.

“Those decisions will be taken at the appropriate time,” he said. “If that means that the Bank’s direct involvement in these areas passes over to another institution, then that’s fine: we will have played our role at the right time.”

According to the minutes of the AIPPF’s final meeting, members also discussed whether the regulator could ask companies to develop policies outlining how they have considered the ethics of AI, and suggested that certification could serve as a mark of recognition identifying firms that have developed AI policies. Members also suggested that audits of such policies could extend to examining how firms plan to remediate errors in the case of false positives.

Ramsden, a joint chair of the forum, said that the topic of governance was crucial to the safe adoption of AI in UK financial services. He explained that AI may differ from other new and emerging technologies from a governance perspective because it can limit, or even potentially eliminate, human judgement and oversight from key decisions. This, he added, clearly poses challenges to existing governance frameworks in financial services and to the concepts of individual and collective accountability enshrined in them, such as elements of the Senior Managers and Certification Regime (SM&CR).

Co-chair Jessica Rusu, Chief Data, Information and Intelligence Officer (CDIIO) at the FCA, added that it was also important for regulators to learn from industry practice and current governance approaches to AI, and to jointly explore how governance can contribute to the ethical, safe, robust and resilient use of AI in financial services.
