The knowledge platform for the financial technology industry

A-Team Insight Blogs

Case Study: Standard Bank Centralises Client Information


South Africa’s Standard Bank is expanding its international franchise, and in order to support its international efforts, the bank has reengineered its reference data operations. A key project undertaken over the past year has been to put together a global Client Information File (CIF) in order to provide a single, global view of its clients across the wholesale bank.

Steve Spark, who is running the client information project at Standard Bank, spoke at the recent Azdex event about the project’s aims and progress, as well as the issues faced along the way.

The project included defining the scope of the CIF system, as well as putting in place the processes and procedures to ensure that data quality could be proactively maintained. Fortunately, the group has a very supportive chief executive. Spark said, “Senior management are now asking about core client data, which is a significant change over last year.”

The first decision was that a central system was needed to feed target systems at its Johannesburg base as well as in London and Hong Kong. It was also decided that the CIF must be kept a thin application; it currently supports approximately 30 key core client data fields for client identification. Said Spark, “The aim of the system is to uniquely identify clients. We do receive internal requests for more data to be stored in the system, which we evaluate, but if it does not help to uniquely identify clients then we have to turn them down.”
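As a rough illustration of what such a thin record might look like, the sketch below models a core client record with a handful of identification fields. The field names are assumptions for illustration only, not Standard Bank’s actual schema.

```python
from dataclasses import dataclass

# Sketch of a "thin" core client record: a handful of the roughly 30
# identification fields. Field names are illustrative assumptions,
# not Standard Bank's actual schema.
@dataclass(frozen=True)
class CoreClientRecord:
    client_id: str            # unique global identifier
    legal_name: str           # registered legal name
    country: str              # country of incorporation
    registration_number: str  # company registration number

rec = CoreClientRecord("C-0001", "Acme Holdings Ltd", "ZA", "1998/012345/07")
print(rec.legal_name)
```

Keeping the record immutable (`frozen=True`) mirrors the idea that the central file is the authoritative source: changes go through a controlled update process rather than ad-hoc edits.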

Some representatives from larger or more global institutions said that such a thin approach could not work for their operations given their wider instrument and market segment coverage.

Next came a cleansing of Standard Bank’s data files to create the single list of global clients. Said Spark, “We had a lot of duplicates to cleanse as our international records would often overlap with our local South African records”. In this endeavour, Standard Bank worked with Azdex to de-duplicate its records, which now total 27,500 globally, held in the central database.
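De-duplication of this kind typically relies on normalising entity names before fuzzy matching. The sketch below shows the general technique using Python’s standard-library `difflib`; the records, field names, and threshold are illustrative assumptions, not the approach Azdex used.

```python
from difflib import SequenceMatcher

# Hypothetical client records; field names are illustrative only.
records = [
    {"id": 1, "name": "Acme Holdings Ltd", "country": "ZA"},
    {"id": 2, "name": "ACME HOLDINGS LIMITED", "country": "ZA"},
    {"id": 3, "name": "Bravo Capital Plc", "country": "GB"},
]

def normalise(name: str) -> str:
    """Lower-case and collapse common suffix variants before comparison."""
    return name.lower().strip().replace("limited", "ltd")

def find_duplicates(records, threshold=0.85):
    """Return pairs of record ids whose normalised names are near-identical
    within the same country."""
    pairs = []
    for i, a in enumerate(records):
        for b in records[i + 1:]:
            if a["country"] != b["country"]:
                continue  # cheap blocking: only compare within a country
            score = SequenceMatcher(
                None, normalise(a["name"]), normalise(b["name"])).ratio()
            if score >= threshold:
                pairs.append((a["id"], b["id"]))
    return pairs

print(find_duplicates(records))  # → [(1, 2)]
```

Restricting comparisons to records in the same country (“blocking”) keeps the pairwise comparison count manageable, which matters once a file runs to tens of thousands of records.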

Standard Bank soon recognized the need for regional focus to cater for country-specific data. Spark said, “Partly due to cost and resource issues, but also because of language difficulties in regions such as Hong Kong, we realised that we couldn’t do all the data cleansing and quality analysis from the central location of Johannesburg.”

The team is now working on legal hierarchies, but here, “we continue to struggle,” said Spark. This is particularly true in South Africa, he said, where heavy acquisition activity makes hierarchies difficult to untangle.

To support ongoing maintenance of the core client data, a Central Data Group (CDG) with members in each location has been created. The group is now in the process of “implementing a robust mechanism to publish data to its downstream systems”. These systems – six are currently on board, with the rest to be added over the next two years – will ‘subscribe’ to the data fields relevant to their processes.
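Field-level subscription of this kind can be sketched as projecting the central record onto each subscriber’s chosen fields. The system names and fields below are hypothetical, for illustration only.

```python
# Hypothetical subscribers and the CIF fields each one takes.
subscriptions = {
    "settlement_system": ["client_id", "legal_name", "country"],
    "kyc_system": ["client_id", "legal_name", "kyc_status"],
}

def publish(record: dict) -> dict:
    """Project the central record onto each subscriber's chosen fields."""
    return {
        system: {f: record[f] for f in fields if f in record}
        for system, fields in subscriptions.items()
    }

record = {"client_id": "C-0001", "legal_name": "Acme Holdings Ltd",
          "country": "ZA", "kyc_status": "approved"}
payloads = publish(record)
print(payloads["kyc_system"])
```

The benefit of publishing only subscribed fields is that a downstream system never sees – and so never depends on – data it has no use for, which keeps the central file’s “thin” discipline intact at the edges.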

The team has put in place daily reporting to measure how many records have changed, adheres to standards, and has automated checks as far as possible. There is also a paper trail showing what analysts have changed, which the bank is considering imaging for an online archive.
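A daily change report of this sort reduces to diffing two snapshots of the client database. The sketch below uses hypothetical record shapes to count changed and newly added records.

```python
# Hypothetical snapshots of the client file keyed by client id; a simple
# diff like this can back a daily "how many records changed" metric.
yesterday = {
    "C-0001": {"legal_name": "Acme Holdings Ltd", "country": "ZA"},
    "C-0002": {"legal_name": "Bravo Capital Plc", "country": "GB"},
}
today = {
    "C-0001": {"legal_name": "Acme Holdings Limited", "country": "ZA"},  # renamed
    "C-0002": {"legal_name": "Bravo Capital Plc", "country": "GB"},
    "C-0003": {"legal_name": "Cedar Partners", "country": "HK"},         # new
}

def daily_change_report(old: dict, new: dict) -> dict:
    """Compare two snapshots and report changed and newly added record ids."""
    changed = sorted(cid for cid, rec in new.items()
                     if cid in old and old[cid] != rec)
    added = sorted(cid for cid in new if cid not in old)
    return {"changed": changed, "added": added, "total": len(new)}

print(daily_change_report(yesterday, today))
# → {'changed': ['C-0001'], 'added': ['C-0003'], 'total': 3}
```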

Other measures such as hard-coding the cities, countries and regions help to maintain data integrity. The processes and procedures are continually updated in Standard Bank’s CIF ‘bible’ of documentation.
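Hard-coding reference values amounts to validating each record against controlled lists rather than accepting free text. A minimal sketch, with an illustrative subset of values:

```python
# Controlled reference lists (illustrative subsets): constraining these fields
# to fixed values is one way of "hard-coding" countries and regions.
VALID_COUNTRIES = {"ZA", "GB", "HK"}
VALID_REGIONS = {"Africa", "Europe", "Asia"}

def validate(record: dict) -> list:
    """Return a list of integrity errors; an empty list means the record passes."""
    errors = []
    if record.get("country") not in VALID_COUNTRIES:
        errors.append(f"unknown country: {record.get('country')}")
    if record.get("region") not in VALID_REGIONS:
        errors.append(f"unknown region: {record.get('region')}")
    return errors

print(validate({"country": "ZA", "region": "Africa"}))  # → []
print(validate({"country": "XX", "region": "Africa"}))  # → ['unknown country: XX']
```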

There have been compromises along the way, however, in order to cater for operational practicalities, said Spark. “CIF is not completely thin as we have added some non-core data, such as the KYC status, client roles, and system mapping table”. The system mapping table maps all the systems that a client is subscribed to.

In terms of the lessons learned from the initiative, Spark emphasised that “people are the single biggest success factor,” an element that was initially underestimated. “Our mistake was not to realise that this job required more than an inexperienced person just capturing data from the web.” Now the bank looks for people with enquiring minds who will delve deeper into changes and ensure the data is fully verified.

It has also changed its approach from looking for volume in data cleansing, which it realised was the wrong incentive. “It’s quality, not quantity, that matters. We estimated record cleansing would take 30 to 40 minutes, but the truth is closer to 60 to 90 minutes per record, much of which is taken up by researching the hierarchy.” He also estimates that it costs around £15 per record to cleanse, not an insignificant amount. And “data changes faster than you would imagine”.

The team structure is also important: there should be a clear distinction, and clear sequencing, between the roles of cleansing, verification, and authorisation. And as many reference data experts will agree, technology is just a minor component of the effort. Said Spark, “processes and procedures are key and the change management effort is significant”. He warned that anyone embarking on such a project “will underestimate the data cleansing effort!”

