About a-team Marketing Services
The knowledge platform for the financial technology industry

A-Team Insight Blogs

RBS Focuses on Data Quality in Order to Better Support its Risk Function


In addition to its entity data management project, the Royal Bank of Scotland (RBS) is also working to improve the overall quality of its reference data as part of its One Risk project. Mark Davies, head of reference data at RBS, explains to Reference Data Review that the firm’s risk function prompted the move, which is initially focused on understanding how reference data is maintained and system feeds are being used by different parts of the firm.

“A lot of the focus on data quality within RBS this year has been driven by the risk users,” explains Davies. “There is a huge amount of activity going on around improving data quality for the risk community, as well as our legal entity work.” The legal entity work Davies refers to is the project focused on decommissioning the legal entity database acquired in the ABN Amro buyout and moving that data onto RBS’ centralised system. Thus far, it has involved the RBS team taking a database that started out at around 500,000 legal entities and now comprises the best part of 1.5 million entities.

The risk project, on the other hand, is much more wide-ranging in its focus than the legal entity work, which is concerned solely with RBS’ wholesale entity data management system. “The One Risk project means we are working with both group and division level teams and we are looking to see where we can standardise,” explains Davies.

He notes that there are currently instances across the firm (as is the norm within other large firms with legacy architectures) where teams are talking about the same information but managing it in different ways. They may have different systems in place to manage that data or even have different lists of data available to them. Davies and his team are therefore aiming to find instances where the risk, finance and operations teams can all use the same data in the same format. The team will then determine how to manage that process and ensure that data quality levels remain the same over time.

All of this is aimed at enhancing the data underlying functions such as risk management across RBS. “Banks are increasingly trying to find new and better ways to manage risk such as getting quicker and better answers on sector concentrations. There is an appetite within risk to get those figures quickly and right first time. Also, single name exposure is a focus: understanding exposure to a single company or a group. This is often a difficult thing to do because it relies on accurate hierarchy information, which can be difficult to manage,” explains Davies. All of this necessarily entails the monitoring of data quality across the downstream systems involved.
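The single name exposure problem Davies describes can be illustrated with a minimal sketch: exposures booked against individual legal entities are rolled up to each entity’s ultimate parent using hierarchy (ownership) links. The entity names, hierarchy structure, and figures below are entirely hypothetical and are not drawn from RBS systems.

```python
# Hypothetical sketch: rolling up single-name exposure across a legal
# entity hierarchy. All names and amounts are illustrative only.
from collections import defaultdict

# child -> parent ownership links (ultimate parents have no entry)
parents = {
    "SubsidiaryA": "HoldingCo",
    "SubsidiaryB": "HoldingCo",
    "HoldingCo": "GroupParent",
}

# exposure booked against each individual entity (in millions, say)
exposures = {
    "SubsidiaryA": 10.0,
    "SubsidiaryB": 5.0,
    "GroupParent": 2.0,
}

def ultimate_parent(entity: str) -> str:
    """Walk up the hierarchy until an entity with no parent is reached."""
    while entity in parents:
        entity = parents[entity]
    return entity

def group_exposure(exposures: dict) -> dict:
    """Aggregate entity-level exposures up to each ultimate parent."""
    totals = defaultdict(float)
    for entity, amount in exposures.items():
        totals[ultimate_parent(entity)] += amount
    return dict(totals)

print(group_exposure(exposures))  # total exposure per group parent
```

The sketch also shows why accurate hierarchy data matters: a single wrong or missing parent link silently splits one group’s exposure into two apparently unrelated totals.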

Davies elaborates: “We are looking at how data is maintained and system feeds are being used by different parts of the same bank – are they getting their data from the same sources and using it in the same ways? Have they got the same definitions for ownership?”
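The kind of cross-system check Davies describes, verifying that different parts of the bank hold the same values for the same entity, can be sketched as a simple field-level reconciliation. The field names and values here are illustrative assumptions, not actual RBS data definitions.

```python
# Hypothetical sketch of a field-level reconciliation between two
# systems' views of the same legal entity; field names are illustrative.
def reconcile(record_a: dict, record_b: dict) -> dict:
    """Return the fields where the two systems disagree,
    mapped to the pair of conflicting values."""
    mismatches = {}
    for field in record_a.keys() | record_b.keys():
        a, b = record_a.get(field), record_b.get(field)
        if a != b:
            mismatches[field] = (a, b)
    return mismatches

# Two teams describing the same entity with different conventions
risk_view = {"lei": "ABC123", "country": "GB", "parent": "HoldingCo"}
finance_view = {"lei": "ABC123", "country": "UK", "parent": "HoldingCo"}

print(reconcile(risk_view, finance_view))  # {'country': ('GB', 'UK')}
```

Even a trivial check like this surfaces the definitional mismatches (here, two encodings of the same country) that make a shared, standardised view of reference data valuable.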

The firm is currently only a few months into the One Risk project, which is being managed by the BAU team, and there is a lot more still to be done. There is a huge amount of analysis to get through in order to determine where it is appropriate for the firm’s user groups to have a single shared view of the underlying reference data.

These data quality checks should also help RBS to avoid infractions such as the one that cost it £5.6 million last month, by giving it a better handle on data management practices across the group. The UK Financial Services Authority (FSA) fined RBS Group at the start of August for failing to have adequate data checks in place to conduct mandatory anti-money laundering (AML) screening.

