About A-Team Marketing Services
The knowledge platform for the financial technology industry

A-Team Insight Blogs

Data Vendor Selection is Key to Bringing Down Costs But Also Improving Quality, Says SGSS’s Rose


In the current cost-conscious environment, financial institutions need to push back on their vendors and ensure they have conducted a thorough evaluation of the data they are receiving from them, says Olivier Rose, head of projects and international market data management at French bank Société Générale Securities Services (SGSS). Rose, who has been involved in the recent negotiations with Bloomberg regarding service provider agreement (SPA) contracts, explains that SGSS currently has 12 data providers, all of which were selected for their reliability rather than on cost alone.

The French-headquartered bank has significantly increased the number of data feeds it receives over the last eight years, says Rose, who has worked for SGSS for the last five years. The bank has also moved to a new system from SunGard to support its fund administration division. “We now have three primary data feeds, have increased our budget for data management five-fold and have gone from two providers to 12,” he elaborates.

SGSS is currently present in six markets (France, Luxembourg, Spain, Italy, Germany, and the UK and Ireland) and has been forced to increase its data management budget in recent years to accommodate the growing number of securities with which it deals. Over the last six years, the firm has moved from handling 22,000 securities to around 100,000, and from one million data fields to 100 million, continues Rose.

“Getting the data is relatively cheap compared to the process of cleaning the data up in order to be able to use it,” he explains. “Budget declines have meant that no one wants to pay for that extra data or the extra processes required to improve data quality.”

The data departments of financial institutions such as SGSS are therefore being challenged to do more with smaller budgets and to deal with the fallout of intense M&A activity, which has left firms with even greater silos to overcome. There is also pressure to address the gaps in analytics that have emerged from a greater focus on risk management and new compliance requirements from the regulatory community.

Rose reckons there is little to no quality in the data firms receive from vendors: “ISIN codes seem to require the least amount of data scrubbing but even they can be wrong. Once firms understand that there is no quality in this data then they can work together to tackle the problem.”

Controls and processes are needed to ensure data quality, and this includes evaluations of vendors’ data coverage and reliability. “In meeting this challenge, banks need to recognise that data, and not IT, is the greatest asset,” says Rose. “Firms need to work more closely with the end client to identify their data needs and get to know their providers. Renegotiation of contracts is not enough; first, firms need to define how they work with these players and develop an organisational approach.” This needs to take into account how end clients are charged for data, as traditionally clients have not seen the cost of data management included in firms’ pricing structures.

This has been an issue in the recent negotiations with Bloomberg, which was planning to introduce a new SPA contract structure at the start of this year. The changes would have meant fund administrators paying for end client usage of the data rather than just for their own usage. This would significantly increase costs for fund administrators but would be difficult to pass on to end clients, according to Cossiom.

Data vendors are certainly in for a tough time again this year. Rose suggests that rather than KYC, the focus for this year should be “know your provider”, covering their coverage, solution proposals and general market opinion of the vendor. “There are ratings for issuers, why not for providers?”
