
RBS Focuses on Data Quality in Order to Better Support its Risk Function

In addition to its entity data management project, the Royal Bank of Scotland (RBS) is also working to improve the overall quality of its reference data as part of its One Risk project. Mark Davies, head of reference data at RBS, explains to Reference Data Review that the firm’s risk function prompted the move, which is initially focused on understanding how reference data is maintained and how system feeds are used by different parts of the firm.

“A lot of the focus on data quality within RBS this year has been driven by the risk users,” explains Davies. “There is a huge amount of activity going on around improving data quality for the risk community, as well as our legal entity work.” The legal entity work Davies is referring to is the project focused on decommissioning the legal entity database acquired during the ABN Amro buyout and moving that data onto the bank’s centralised system. It has thus far involved the RBS team taking a database that started out at around 500,000 legal entities and has now grown to the best part of 1.5 million entities.

The risk project, on the other hand, is much broader in scope than the legal entity work, which is confined to RBS’ wholesale entity data management system. “The One Risk project means we are working with both group and division level teams and we are looking to see where we can standardise,” explains Davies.

He notes that there are currently instances across the firm (as is the norm within other large firms with legacy architectures) where teams are talking about the same information but managing it in different ways. They may have different systems in place to manage that data or even have different lists of data available to them. Davies and his team are therefore aiming to find instances where the risk, finance and operations teams can all use the same data in the same format. The team will then determine how to manage that process and ensure that data quality levels remain the same over time.

All of this is aimed at enhancing the data underlying functions such as risk management across RBS. “Banks are increasingly trying to find new and better ways to manage risk such as getting quicker and better answers on sector concentrations. There is an appetite within risk to get those figures quickly and right first time. Also, single name exposure is a focus: understanding exposure to a single company or a group. This is often a difficult thing to do because it relies on accurate hierarchy information, which can be difficult to manage,” explains Davies. All of this necessarily entails the monitoring of data quality across the downstream systems involved.
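
To illustrate why accurate hierarchy information matters for single name exposure, the following is a minimal sketch, purely hypothetical and not a description of RBS’ systems, of rolling exposures booked against individual legal entities up to their ultimate parent. The entity names, parent links and amounts are invented for illustration.

```python
from collections import defaultdict

# Hypothetical hierarchy: each entity maps to its immediate parent (None = ultimate parent)
parents = {
    "AcmeBank London": "AcmeBank Holdings",
    "AcmeBank NY": "AcmeBank Holdings",
    "AcmeBank Holdings": None,
}

# Hypothetical exposures booked against individual legal entities (in millions)
exposures = {"AcmeBank London": 120.0, "AcmeBank NY": 80.0}

def ultimate_parent(entity: str) -> str:
    """Walk up the hierarchy until an entity with no recorded parent is reached."""
    while parents.get(entity):
        entity = parents[entity]
    return entity

# Aggregate single name exposure at the ultimate parent (group) level
group_exposure = defaultdict(float)
for entity, amount in exposures.items():
    group_exposure[ultimate_parent(entity)] += amount

print(dict(group_exposure))  # {'AcmeBank Holdings': 200.0}
```

A single missing or incorrect parent link in the reference data would split one group into two apparently unrelated names and understate the concentration, which is why the quality of hierarchy data has to be monitored alongside the exposure figures themselves.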

Davies elaborates: “We are looking at how data is maintained and system feeds are being used by different parts of the same bank – are they getting their data from the same sources and using it in the same ways? Have they got the same definitions for ownership?”
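
As a simple illustration of the kind of cross-feed comparison Davies describes, the hypothetical sketch below, which is not the bank’s actual tooling, checks whether two system feeds agree on the ownership attribute and on the source they draw it from for the same entity identifiers. All identifiers, fields and values are invented.

```python
# Hypothetical extracts from two downstream system feeds, keyed by entity identifier
risk_feed = {
    "ENT-001": {"ultimate_parent": "AcmeBank Holdings", "source": "vendor_a"},
    "ENT-002": {"ultimate_parent": "Globex Group", "source": "vendor_a"},
}
finance_feed = {
    "ENT-001": {"ultimate_parent": "AcmeBank Holdings", "source": "vendor_b"},
    "ENT-002": {"ultimate_parent": "Globex Holdings", "source": "internal"},
}

# Flag entities where the two feeds disagree on ownership or rely on different sources
for entity_id in sorted(set(risk_feed) & set(finance_feed)):
    a, b = risk_feed[entity_id], finance_feed[entity_id]
    if a["ultimate_parent"] != b["ultimate_parent"]:
        print(f"{entity_id}: ownership mismatch ({a['ultimate_parent']} vs {b['ultimate_parent']})")
    if a["source"] != b["source"]:
        print(f"{entity_id}: sourced differently ({a['source']} vs {b['source']})")
```

Checks of this kind only become meaningful once the user groups share definitions, which is exactly the standardisation question the One Risk analysis is trying to answer.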

The firm is currently only a few months into the One Risk project, which is being managed by the business as usual (BAU) team, and there is a lot more still to be done. There is a huge amount of analysis to get through in order to determine where it is appropriate for the firm’s user groups to have a single shared view of the underlying reference data.

These data quality checks should also help RBS to avoid infractions such as the one that cost it £5.6 million last month, by having a better handle on data management practices across the group. The UK Financial Services Authority (FSA) fined RBS Group at the start of August for the group’s failure to have adequate data checks in place to conduct mandatory anti-money laundering (AML) screening.
