
RBS is in Process of Re-architecting Data Infrastructure and is Focusing on Golden Sources, Says Bishop

Following its merger with ABN Amro, the Royal Bank of Scotland (RBS) has finally dragged all of its data processes together under one structure, according to Bob Bishop, the bank’s head of client data management. Now that the “lift and drop” has been completed, the data management team is focusing on re-architecting the data management structure to realise efficiencies and on defining golden sources of client data, he explains.

“The focus is most definitely on golden sources and we are seeking to mandate these for various areas within the bank,” says Bishop. “It has been agreed between operations, risk, credit and finance that a mandate and governance agreements are required in order to implement this.”
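To make the golden source idea concrete, such a mandate is typically implemented by applying survivorship rules across the candidate systems that hold a given client attribute: for each field, the value from the highest-priority source wins. The sketch below is purely illustrative, assuming hypothetical source names and a simple source-priority rule; it is not a description of RBS’s actual systems.

```python
# Illustrative sketch: composing a golden client record from multiple
# candidate systems using a source-priority survivorship rule.
# Source names and fields are hypothetical, not RBS's actual systems.

# Priority order as agreed by the governance forum (highest first).
SOURCE_PRIORITY = ["crm", "legacy_rbs", "legacy_abn"]

def golden_record(candidates: dict[str, dict]) -> dict:
    """Merge per-source views of a client into one golden record.

    candidates maps a source name to that source's view of the client.
    For each attribute, the value from the highest-priority source
    that actually populates it survives.
    """
    golden = {}
    fields = {f for record in candidates.values() for f in record}
    for field in fields:
        for source in SOURCE_PRIORITY:
            value = candidates.get(source, {}).get(field)
            if value not in (None, ""):
                golden[field] = value
                break
    return golden

# Example: two legacy views of the same client after the merger.
views = {
    "legacy_rbs": {"client_id": "RB-0042", "registered_name": "ACME LTD",
                   "domicile": None},
    "legacy_abn": {"client_id": "AB-7713", "registered_name": "Acme Ltd",
                   "domicile": "GB"},
}
# client_id and registered_name survive from legacy_rbs (higher priority);
# domicile is missing there, so it falls through to legacy_abn.
print(golden_record(views))
```

The governance agreement Bishop describes is what fixes the priority order: without a mandate agreed across operations, risk, credit and finance, each function would keep its own ordering and there would be no single golden source.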

The integration of the two merged banks’ data infrastructures is underway; the initial focus was on enabling RBS to trade with ABN Amro’s clients. The next item on the agenda is rationalising the data processes and systems in which all this client data is stored, says Bishop.

Because reference data is viewed as a control via which to measure risk, the data management team sits within the operational risk control function, he elaborates. “This means that we have a mandate to put in place operations staff to act as data stewards for all data, including risk, operations, credit and finance data,” he says. This senior-level sponsorship and mandate means the data team has a fair amount of clout across the firm as a whole when it comes to kicking off data integration projects.

Much like the rest of the market, RBS is also evaluating its data vendors at the moment. The bank currently uses around 200 to 300 data vendor feeds, including some taken directly from local exchanges and local vendors, and is looking to rationalise that number to some extent. “One vendor will never satisfy all your data requirements because there will always be vendors that are better at particular areas such as corporate actions or counterparty data,” he adds.
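Bishop’s point that no single vendor covers every data domain equally well is commonly handled with a per-domain routing table: each data category is mapped to a preferred feed, with fallbacks if the preferred feed has no coverage. The snippet below is a hypothetical illustration of that pattern; the vendor names, domains and feed structure are invented, not RBS’s configuration.

```python
# Hypothetical per-domain vendor routing: each data category is sourced
# from a preferred feed, falling back to the next feed in the list if
# the preferred one has no data. All names here are invented.

VENDOR_ROUTING = {
    "corporate_actions": ["vendor_a", "vendor_c"],
    "counterparty":      ["vendor_b", "vendor_a"],
    "pricing":           ["local_exchange_feed", "vendor_c"],
}

def fetch(domain: str, entity_id: str, feeds: dict) -> tuple:
    """Return (vendor, payload) from the first feed covering the request."""
    for vendor in VENDOR_ROUTING.get(domain, []):
        payload = feeds.get(vendor, {}).get((domain, entity_id))
        if payload is not None:
            return vendor, payload
    raise LookupError(f"no configured feed covers {domain!r} for {entity_id!r}")

# Usage: vendor_b is preferred for counterparty data, so it wins even
# though vendor_a also carries the record.
feeds = {
    "vendor_b": {("counterparty", "ACME"): {"rating": "A-"}},
    "vendor_a": {("counterparty", "ACME"): {"rating": "BBB+"}},
}
print(fetch("counterparty", "ACME", feeds))  # ('vendor_b', {'rating': 'A-'})
```

A table like this also makes rationalisation concrete: any vendor that never appears as a preferred or fallback source for a domain is a candidate for cancellation.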

Bishop is also hopeful that some progress will be made in the industry as a whole with regard to data standardisation, which he feels should be led by political will and a regulatory mandate. “There is a need for an industry code of conduct that deals with the basics of reference data standardisation. It should be high level and not too prescriptive,” he concludes.
