

RBS Focuses on Data Quality in Order to Better Support its Risk Function


In addition to its entity data management project, the Royal Bank of Scotland (RBS) is working to improve the overall quality of its reference data as part of its One Risk project. Mark Davies, head of reference data at RBS, explains to Reference Data Review that the firm’s risk function prompted the move, which is initially focused on understanding how reference data is maintained and how system feeds are used by different parts of the firm.

“A lot of the focus on data quality within RBS this year has been driven by the risk users,” explains Davies. “There is a huge amount of activity going on around improving data quality for the risk community, as well as our legal entity work.” The legal entity work Davies is referring to is the project focused on decommissioning the legal entity database acquired during the ABN Amro buyout and moving that data onto the centralised RBS system. It has thus far involved the RBS team taking a database that started out at around 500,000 legal entities and which now comprises the best part of 1.5 million.

The risk project, on the other hand, is much more wide-ranging in focus than the legal entity work, which is focused solely on RBS’ wholesale entity data management system. “The One Risk project means we are working with both group and division level teams and we are looking to see where we can standardise,” explains Davies.

He notes that there are currently instances across the firm (as is the norm within other large firms with legacy architectures) where teams are talking about the same information but managing it in different ways. They may have different systems in place to manage that data or even have different lists of data available to them. Davies and his team are therefore aiming to find instances where the risk, finance and operations teams can all use the same data in the same format. The team will then determine how to manage that process and ensure that data quality levels remain the same over time.
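To make that kind of ongoing monitoring concrete, the sketch below shows one simple way a data quality figure could be tracked over time for a shared entity record layout. It is illustrative only: the record fields, the trimmed list of country codes and the metrics themselves are assumptions, not a description of RBS’s systems.

```python
from dataclasses import dataclass

# Hypothetical shared record layout for an entity as the risk, finance and
# operations teams might each hold it; field names are illustrative only.
@dataclass
class EntityRecord:
    entity_id: str
    legal_name: str
    country: str   # expected as an ISO 3166-1 alpha-2 code
    sector: str

ISO_COUNTRIES = {"GB", "NL", "US", "DE"}  # trimmed illustrative list

def quality_metrics(records):
    """Return simple completeness and validity ratios for a batch of records."""
    total = len(records)
    complete = sum(
        1 for r in records
        if r.entity_id and r.legal_name and r.country and r.sector
    )
    valid_country = sum(1 for r in records if r.country in ISO_COUNTRIES)
    return {
        "completeness": complete / total if total else 0.0,
        "country_validity": valid_country / total if total else 0.0,
    }

if __name__ == "__main__":
    batch = [
        EntityRecord("E1", "Alpha Holdings Ltd", "GB", "Banking"),
        EntityRecord("E2", "Beta BV", "NL", ""),             # missing sector
        EntityRecord("E3", "Gamma Inc", "USA", "Insurance"), # non-ISO country code
    ]
    print(quality_metrics(batch))
```

Running a check of this sort against each team’s feed on a schedule would give comparable quality figures per division, which is the kind of consistency over time the project is looking to establish.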

All of this is aimed at enhancing the data underlying functions such as risk management across RBS. “Banks are increasingly trying to find new and better ways to manage risk, such as getting quicker and better answers on sector concentrations. There is an appetite within risk to get those figures quickly and right first time. Also, single name exposure is a focus: understanding exposure to a single company or a group. This is often a difficult thing to do because it relies on accurate hierarchy information, which can be difficult to manage,” explains Davies. All of this necessarily entails the monitoring of data quality across the downstream systems involved.
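The hierarchy point is worth unpacking: single name exposure only adds up correctly if every subsidiary can be traced to the same ultimate parent. The following minimal sketch, using invented entity identifiers and exposure figures, shows the roll-up that accurate hierarchy data makes possible.

```python
from collections import defaultdict

# Hypothetical parent links: child entity id -> immediate parent entity id.
# A missing key means the entity is its own ultimate parent.
PARENT = {
    "SUB_A": "HOLDCO",
    "SUB_B": "HOLDCO",
    "HOLDCO": "GROUP",
}

# Hypothetical exposures booked against individual legal entities.
EXPOSURES = {"SUB_A": 120.0, "SUB_B": 80.0, "HOLDCO": 50.0, "OTHER": 10.0}

def ultimate_parent(entity: str) -> str:
    """Walk the hierarchy until an entity with no recorded parent is reached."""
    seen = set()
    while entity in PARENT and entity not in seen:
        seen.add(entity)
        entity = PARENT[entity]
    return entity

def single_name_exposure(exposures: dict) -> dict:
    """Aggregate exposures up to each ultimate parent (the 'single name')."""
    totals = defaultdict(float)
    for entity, amount in exposures.items():
        totals[ultimate_parent(entity)] += amount
    return dict(totals)

if __name__ == "__main__":
    # GROUP picks up SUB_A, SUB_B and HOLDCO; OTHER stands alone.
    print(single_name_exposure(EXPOSURES))  # {'GROUP': 250.0, 'OTHER': 10.0}
```

If the parent links are wrong or incomplete, exposures stay attached to individual subsidiaries and the group-level figure is understated, which is exactly why hierarchy information is singled out as difficult to manage.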

Davies elaborates: “We are looking at how data is maintained and system feeds are being used by different parts of the same bank – are they getting their data from the same sources and using it in the same ways? Have they got the same definitions for ownership?”
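A check along those lines can be expressed very simply. The sketch below compares two hypothetical divisional feeds field by field and reports where they disagree, for instance on the recorded source or the ownership value; the feed contents are invented for illustration.

```python
# Two divisions each supply their view of an entity; the check reports fields
# where their values disagree. Feed contents and field names are invented.

risk_feed = {
    "E1": {"source": "vendor_x", "owner": "GROUP", "country": "GB"},
    "E2": {"source": "vendor_y", "owner": "HOLDCO", "country": "NL"},
}
finance_feed = {
    "E1": {"source": "vendor_x", "owner": "GROUP", "country": "GB"},
    "E2": {"source": "vendor_x", "owner": "GROUP", "country": "NL"},
}

def reconcile(feed_a: dict, feed_b: dict) -> list:
    """Return (entity, field, value_a, value_b) for every disagreement."""
    breaks = []
    for entity in sorted(set(feed_a) & set(feed_b)):
        for field in sorted(set(feed_a[entity]) | set(feed_b[entity])):
            a, b = feed_a[entity].get(field), feed_b[entity].get(field)
            if a != b:
                breaks.append((entity, field, a, b))
    return breaks

if __name__ == "__main__":
    for brk in reconcile(risk_feed, finance_feed):
        print(brk)  # E2 differs on both its source and its owner definition
```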

The firm is currently only a few months into the One Risk project, which is being managed by the business as usual (BAU) team, and there is a lot more still to be done. There is a huge amount of analysis to get through in order to determine where it is appropriate for the firm’s user groups to have a single shared view of the underlying reference data.

These data quality checks should also help RBS to avoid infractions such as the one that cost it £5.6 million last month, by having a better handle on data management practices across the group. The UK Financial Services Authority (FSA) fined RBS Group at the start of August for the group’s failure to have adequate data checks in place to conduct mandatory anti-money laundering (AML) screening.
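As a rough illustration of the kind of data check at issue, the sketch below flags customer records that either lack the fields needed for sanctions screening or were never passed through a screening engine. The record layout and logic are assumptions made for illustration, not a description of RBS’s actual AML controls.

```python
# Flag records that cannot be screened or that were never screened.
# All names and fields below are illustrative.

REQUIRED_FIELDS = ("customer_id", "legal_name", "country")

customers = [
    {"customer_id": "C1", "legal_name": "Alpha Holdings Ltd", "country": "GB"},
    {"customer_id": "C2", "legal_name": "", "country": "NL"},  # unscreenable
    {"customer_id": "C3", "legal_name": "Gamma Inc", "country": "US"},
]
screened_ids = {"C1"}  # ids already passed through the screening engine

def screening_gaps(records, screened):
    """Return (customer_id, reason, details) for every record with a gap."""
    gaps = []
    for rec in records:
        missing = [f for f in REQUIRED_FIELDS if not rec.get(f)]
        if missing:
            gaps.append((rec.get("customer_id"), "missing fields", missing))
        elif rec["customer_id"] not in screened:
            gaps.append((rec["customer_id"], "not screened", []))
    return gaps

if __name__ == "__main__":
    for gap in screening_gaps(customers, screened_ids):
        print(gap)  # C2 is missing its legal name; C3 was never screened
```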

