The leading knowledge platform for the financial technology industry

A-Team Insight Blogs

RBS Focuses on Data Quality in Order to Better Support its Risk Function


In addition to its entity data management project, the Royal Bank of Scotland (RBS) is also working to improve the overall quality of its reference data as part of its One Risk project. Mark Davies, head of reference data at RBS, explains to Reference Data Review that the firm’s risk function prompted the move, which is initially focused on understanding how reference data is maintained and system feeds are being used by different parts of the firm.

“A lot of the focus on data quality within RBS this year has been driven by the risk users,” explains Davies. “There is a huge amount of activity going on around improving data quality for the risk community, as well as our legal entity work.” The legal entity work Davies is referring to is the project focused on decommissioning the legal entity database acquired during the ABN Amro buyout and moving that data onto the centralised RBS system. It has thus far involved the RBS team taking a database that started out at around 500,000 legal entities and which has now grown to the best part of 1.5 million.

The risk project, on the other hand, is much more wide ranging in its focus than the legal entity work, which is focused solely on RBS’ wholesale entity data management system. “The One Risk project means we are working with both group and division level teams and we are looking to see where we can standardise,” explains Davies.

He notes that there are currently instances across the firm (as is the norm within other large firms with legacy architectures) where teams are talking about the same information but managing it in different ways. They may have different systems in place to manage that data or even have different lists of data available to them. Davies and his team are therefore aiming to find instances where the risk, finance and operations teams can all use the same data in the same format. The team will then determine how to manage that process and ensure that data quality levels remain the same over time.

All of this is aimed at enhancing the data underlying functions such as risk management across RBS. “Banks are increasingly trying to find new and better ways to manage risk such as getting quicker and better answers on sector concentrations. There is an appetite within risk to get those figures quickly and right first time. Also, single name exposure is a focus: understanding exposure to a single company or a group. This is often a difficult thing to do because it relies on accurate hierarchy information, which can be difficult to manage,” explains Davies. All of this necessarily entails the monitoring of data quality across the downstream systems involved.

Davies elaborates: “We are looking at how data is maintained and system feeds are being used by different parts of the same bank – are they getting their data from the same sources and using it in the same ways? Have they got the same definitions for ownership?”

The firm is currently only a few months into the One Risk project, which is being managed by the BAU team, and there is a lot more still to be done. There is a huge amount of analysis to get through in order to determine where it is appropriate for the firm’s user groups to have a single shared view of the underlying reference data.

These data quality checks should also help RBS to avoid infractions such as the one that cost it £5.6 million last month, by giving it a better handle on data management practices across the group. The UK Financial Services Authority (FSA) fined RBS Group at the start of August for the group’s failure to have adequate data checks in place to conduct mandatory anti-money laundering (AML) screening.
