The knowledge platform for the financial technology industry

A-Team Insight Blogs

RBS Is in the Process of Re-architecting Its Data Infrastructure and Focusing on Golden Sources, Says Bishop


Following its merger with ABN Amro, the Royal Bank of Scotland (RBS) has finally brought all of its data processes together under one structure, according to Bob Bishop, the bank’s head of client data management. Now that the “lift and drop” has been completed, the data management team’s focus is on re-architecting the data management structure to realise efficiencies and on defining golden sources of client data, he explains.

“The focus is most definitely on golden sources and we are seeking to mandate these for various areas within the bank,” says Bishop. “It has been agreed between operations, risk, credit and finance that a mandate and governance agreements are required in order to implement this.”
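As a purely illustrative sketch of the concept (none of this reflects RBS’s actual systems; every class and field name here is hypothetical), a “golden source” mandate can be thought of as a registry that maps each data domain to the single system designated as authoritative, so that every consumer resolves records through that one mandated source:

```python
# Illustrative sketch of a "golden source" registry: each data domain
# (e.g. client, counterparty, corporate actions) is mapped to exactly one
# authoritative store, and all reads are resolved through that mapping.
# All names are hypothetical, not taken from any real RBS system.

class GoldenSourceRegistry:
    def __init__(self):
        # domain -> backing store (here just a dict of record id -> record)
        self._sources = {}

    def mandate(self, domain, store):
        """Designate `store` as the single authoritative source for `domain`."""
        if domain in self._sources:
            raise ValueError(f"domain {domain!r} already has a mandated golden source")
        self._sources[domain] = store

    def lookup(self, domain, record_id):
        """Resolve a record only via the mandated source for its domain."""
        if domain not in self._sources:
            raise KeyError(f"no golden source mandated for domain {domain!r}")
        return self._sources[domain].get(record_id)


# Usage: operations mandates one client-data store; all consumers read from it.
registry = GoldenSourceRegistry()
client_store = {"C123": {"name": "Acme Ltd", "country": "GB"}}
registry.mandate("client", client_store)
record = registry.lookup("client", "C123")
```

The point of the sketch is the governance constraint the article describes: once a source is mandated for a domain, no second source can be registered for it, and lookups that bypass the registry have no supported path.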

The integration of the two merged banks’ data infrastructures is underway; the initial focus was on enabling RBS to trade with ABN Amro’s clients. The next item on the agenda is rationalising the data processes and systems in which all of this client data is stored, says Bishop.

Because reference data is viewed as a control function through which risk is measured, the data management team sits within the operational risk control function, he elaborates. “This means that we have a mandate to put in place operations staff to act as data stewards for all data, including risk, operations, credit and finance data,” he says. This senior-level sponsorship and mandate has given the data team a fair amount of clout across the firm as a whole when it comes to kicking off data integration projects.

Much like the rest of the market, RBS is also evaluating its data vendors at the moment. The bank currently uses around 200 to 300 data vendor feeds, including those taken directly from local exchanges and local vendors, and is looking to rationalise that number to some extent. “One vendor will never satisfy all your data requirements because there will always be vendors that are better at particular areas such as corporate actions or counterparty data,” he adds.

Bishop is also hopeful that some progress will be made across the industry with regard to data standardisation, which he feels should be led by political will and a regulatory mandate. “There is a need for an industry code of conduct that deals with the basics of reference data standardisation. It should be high level and not too prescriptive,” he concludes.
