The knowledge platform for the financial technology industry

A-Team Insight Blogs

Progress can be made when client firms stand up and speak


Understanding the issues that clients are facing – and how they are going about handling them – gives us real insight into the progress the reference data industry is making. And by identifying trouble spots we can work to make further progress.

This is why gatherings of practitioners can be a very useful exercise. It’s true that some gatherings can be disregarded as feel-good events with little real substance, usually because people are too aware of the competitive pressures to be truly honest. But there are events that manage to move past this and present a true picture of the real issues being faced by those grappling with data problems.

So it was at the Azdex event held a few weeks ago in London, during which representatives from Barclays Capital, Citigroup, Credit Suisse First Boston, HBOS and Standard Bank spoke candidly about their internal missions to manage client entity data.

As you’ll see from our case studies in this issue, CSFB has taken a distributed approach to responsibility for the quality of its client entity data, given the scale of its databases. By contrast, Standard Bank has taken a centralised approach, pulling all data into one central Client Information File maintained by a unified team of data analysts.

Each approach has its own merits and drawbacks, which were honestly discussed by the data managers. CSFB, for example, has eradicated conflicting changes to the same data item across disparate databases: the multiple databases still exist, but a data attribute is changed in one place only and propagated out to the others. The difficulty with this approach has been the ongoing identification of data owners who ‘care’ enough about any given attribute to maintain it. Standard Bank, meanwhile, has reduced duplication of data, but found that a much higher calibre of data analyst was required than it originally anticipated.

In both cases the approach taken was strategic, and both banks have the senior management support essential to pulling off such large-scale projects.

We’re sure that many attendees learned from hearing about these experiences and can apply that knowledge to their own data management initiatives. If only more client firms could be persuaded to share so openly, the reference data industry could make real progress. After all, the issues holding back regulatory compliance, operational efficiency and straight-through processing across the industry are for the most part basic, even mundane.

Surely financial institutions should be focusing on their real business of serving clients, managing investments and innovating products, rather than on administration. By working together and sharing experiences, all firms could move faster towards making the administration layer almost invisible, rather than trying to use it to gain a (temporary) competitive edge. We admit this is a somewhat utopian view of the data world: the very people we’re suggesting should stand up and help others learn from their experience to eradicate data problems are the ones who would have no job if that succeeded. And of course, we at Reference Data Review might have to find another topic to become engrossed in.

