About a-team Marketing Services
The knowledge platform for the financial technology industry

A-Team Insight Blogs

Case Study: Standard Bank Centralises Client Information


South Africa’s Standard Bank is expanding its international franchise, and to support that expansion the bank has reengineered its reference data operations. A key project undertaken over the past year has been to build a global Client Information File (CIF) providing a single, global view of its clients across the wholesale bank.

Steve Spark, who is running the client information project at Standard Bank, spoke at the recent Azdex event about the aims and progress of the project, as well as the issues faced along the way.

The project included defining the scope of the CIF system, as well as putting in place the processes and procedures to ensure that data quality could be proactively maintained. Fortunately, the group has a very supportive chief executive. Spark said, “Senior management are now asking about core client data, which is a significant change over last year.”

The first decision was that a central system was needed to feed target systems in its Johannesburg base as well as in London and Hong Kong. It was also decided that the CIF must be kept as a thin application; it currently holds approximately 30 core data fields used to identify clients. Said Spark, “The aim of the system is to uniquely identify clients. We do receive internal requests for more data to be stored in the system, which we evaluate, but if it does not help to uniquely identify clients then we have to turn them down.”
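The “thin application” discipline Spark describes can be enforced mechanically: accept only a fixed whitelist of identification fields and reject everything else. A minimal sketch, where the field names are illustrative assumptions and not Standard Bank’s actual schema:

```python
# Hypothetical "thin CIF" record check: only a fixed whitelist of core
# identification fields is accepted; any other field is rejected outright.
# Field names below are illustrative assumptions, not the bank's schema.

CORE_FIELDS = {
    "client_id", "legal_name", "country_of_incorporation",
    "registration_number", "city", "legal_form",
}

def validate_core_record(record: dict) -> dict:
    """Raise if the record carries any field outside the core set."""
    extra = set(record) - CORE_FIELDS
    if extra:
        raise ValueError(f"Non-core fields not allowed: {sorted(extra)}")
    return record

# A record limited to core fields passes through unchanged.
validate_core_record({"client_id": "C001", "legal_name": "Acme Ltd"})
```

Centralising the whitelist in one place mirrors the project’s governance: requests for new fields become a code change that must be justified, rather than ad-hoc schema growth.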

Some representatives from larger or more global institutions said that such a thin approach could not work for their operations given their wider instrument and market segment coverage.

Next came a cleansing of Standard Bank’s data files to create the single list of global clients. Said Spark, “We had a lot of duplicates to cleanse as our international records would often overlap with our local South African records”. In this endeavour, Standard Bank worked with Azdex to de-duplicate its records. The central database now holds 27,500 records globally.
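The overlap Spark describes, where an international record and a local South African record name the same client slightly differently, is the classic de-duplication problem. A toy sketch of the idea, grouping records on a normalised legal name plus country (the matching actually done with Azdex would be far more sophisticated):

```python
# Illustrative de-duplication sketch: collapse records whose normalised
# legal name and country match. Real client matching uses much richer
# logic; this only shows the shape of the problem.
import re
from collections import defaultdict

def normalise(name: str) -> str:
    """Lower-case, strip common legal suffixes and punctuation."""
    name = name.lower()
    name = re.sub(r"\b(ltd|limited|pty|plc|inc)\b\.?", "", name)
    return re.sub(r"[^a-z0-9]", "", name)

def dedupe(records):
    groups = defaultdict(list)
    for rec in records:
        key = (normalise(rec["legal_name"]), rec["country"])
        groups[key].append(rec)
    # Keep the first record in each group as the surviving golden record.
    return [recs[0] for recs in groups.values()]

records = [
    {"legal_name": "Acme Ltd", "country": "ZA"},
    {"legal_name": "ACME Limited", "country": "ZA"},  # duplicate of above
    {"legal_name": "Acme Ltd", "country": "HK"},      # distinct client
]
print(len(dedupe(records)))  # 2
```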

Standard Bank soon recognized the need for regional focus to cater for country-specific data. Spark said, “Partly due to cost and resource issues, but also because of language difficulties in regions such as Hong Kong, we realised that we couldn’t do all the data cleansing and quality analysis from the central location of Johannesburg.”

The team is now working on legal hierarchies, but here, “we continue to struggle,” said Spark. This is particularly true in South Africa, where heavy acquisition activity makes hierarchies difficult to keep current.

To support ongoing maintenance of the core client data, a Central Data Group (CDG) with members in each location has been created. The group is now in the process of “implementing a robust mechanism to publish data to its downstream systems”. These systems – six are currently on board, with the rest to be added over the next two years – will ‘subscribe’ to the data fields relevant to their processes.
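The field-level subscription model described here can be sketched as a small publisher that sends each downstream system only the fields it registered for. System and field names below are made up for illustration; this is not the bank’s actual mechanism:

```python
# Sketch of field-level publish/subscribe: each downstream system
# registers the fields it needs, and a publish delivers only those
# fields per system. Names are hypothetical.

class CIFPublisher:
    def __init__(self):
        self.subscriptions = {}  # system name -> set of subscribed fields

    def subscribe(self, system: str, fields: set) -> None:
        self.subscriptions[system] = set(fields)

    def publish(self, record: dict) -> dict:
        """Return, per subscribing system, only its subscribed fields."""
        return {
            system: {f: record[f] for f in fields if f in record}
            for system, fields in self.subscriptions.items()
        }

pub = CIFPublisher()
pub.subscribe("settlements", {"client_id", "legal_name"})
pub.subscribe("kyc", {"client_id", "country"})
out = pub.publish({"client_id": "C001", "legal_name": "Acme", "country": "ZA"})
# out["settlements"] contains only client_id and legal_name.
```

Subscribing at field rather than record level keeps downstream systems insulated from schema changes in fields they never asked for.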

They have put in place daily reporting to measure how many records have changed, ensure adherence to standards, and automate checks as far as possible. There is also a paper trail of what analysts have changed, which the bank is considering imaging for an online archive.
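The daily change report amounts to comparing two snapshots of the client file. A hedged sketch of that comparison, assuming snapshots are dictionaries keyed by client identifier:

```python
# Hypothetical daily change report: diff yesterday's and today's
# snapshots (client_id -> record) and count additions, removals,
# and in-place modifications.

def change_report(yesterday: dict, today: dict) -> dict:
    added = today.keys() - yesterday.keys()
    removed = yesterday.keys() - today.keys()
    changed = {cid for cid in today.keys() & yesterday.keys()
               if today[cid] != yesterday[cid]}
    return {"added": len(added),
            "removed": len(removed),
            "changed": len(changed)}

report = change_report(
    {"C001": {"city": "Johannesburg"}, "C002": {"city": "London"}},
    {"C002": {"city": "Hong Kong"}, "C003": {"city": "London"}},
)
# One record added (C003), one removed (C001), one changed (C002).
```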

Other measures such as hard-coding the cities, countries and regions help to maintain data integrity. The processes and procedures are continually updated in Standard Bank’s CIF ‘bible’ of documentation.
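“Hard-coding” cities, countries and regions means values come from fixed lookup tables rather than free-text entry, so a typo cannot corrupt the file. A minimal sketch, with illustrative values rather than the bank’s actual reference lists:

```python
# Sketch of hard-coded reference data: location values must come from
# fixed lookup tables, and the country must belong to the stated region.
# The values shown are illustrative assumptions.

REGIONS = {"Africa", "Europe", "Asia"}
COUNTRY_TO_REGION = {"ZA": "Africa", "GB": "Europe", "HK": "Asia"}

def check_location(country: str, region: str) -> bool:
    """True only if both values exist and are mutually consistent."""
    return region in REGIONS and COUNTRY_TO_REGION.get(country) == region

check_location("ZA", "Africa")   # consistent pair
check_location("ZA", "Europe")   # inconsistent: rejected
```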

There have been compromises along the way, however, in order to cater for operational practicalities, said Spark. “CIF is not completely thin as we have added some non-core data, such as the KYC status, client roles, and system mapping table”. The system mapping table maps all the systems that a client is subscribed to.

In terms of the lessons learned from the initiative, Spark emphasised that “people are the single biggest success factor,” an element that was initially underestimated. “Our mistake was not to realise that this job required more than an inexperienced person just capturing data from the web.” Now the bank looks for people with enquiring minds who will delve deeper into changes and ensure the data is fully verified.

It has also changed its approach from looking for volume in data cleansing, which it realised was the wrong incentive. “It’s quality, not quantity, that matters. We estimated record cleansing would take 30 to 40 minutes, but the truth is closer to 60 to 90 minutes per record, a lot of which is taken up by researching the hierarchy.” He also estimates that it costs around £15 per record to cleanse – across the 27,500-record file, over £400,000 in total – not an insignificant amount. And “data changes faster than you would imagine”.

The team structure is also important: there should be a clear distinction, and sequencing, between the roles of cleansing, verification, and authorisation. And as many reference data experts will agree, technology is just a minor component of the effort. Said Spark, “processes and procedures are key and the change management effort is significant”. He warned that anyone embarking on such a project “will underestimate the data cleansing effort!”

