About a-team Marketing Services
The knowledge platform for the financial technology industry

A-Team Insight Blogs

Data Vendor Selection is Key to Bringing Down Costs But Also Improving Quality, Says SGSS’s Rose

In the current cost-conscious environment, financial institutions need to push back on their vendors and make sure they have conducted a thorough evaluation of the data they are receiving from them, says Olivier Rose, head of projects and international market data management at French bank Société Générale Securities Services (SGSS). Rose, who has been involved in the recent negotiations with Bloomberg regarding service provider agreement (SPA) contracts, explains that SGSS currently has 12 data providers, all of which were selected for their reliability rather than on cost alone.

The French-headquartered bank has increased the number of data feeds it receives significantly over the last eight years, says Rose, who has worked for SGSS for the last five years. The bank has also moved to a new system from SunGard to support its fund administration division. “We now have three primary data feeds, have increased our budget for data management by five times and have gone from two providers to 12,” he elaborates.

SGSS is currently present in six countries (France, Luxembourg, Spain, Italy, Germany, and the UK and Ireland) and has been forced to increase its data management budget over recent years in order to accommodate the growth in the number of securities with which it deals. Over the last six years, the firm has moved from dealing with 22,000 securities to around 100,000, and from one million data fields to 100 million, continues Rose.

“Getting the data is relatively cheap compared to the process of cleaning the data up in order to be able to use it,” he explains. “Budget declines have meant that no one wants to pay for that extra data or the extra processes required to improve data quality.”

The data departments of financial institutions such as SGSS are therefore being challenged to do more with smaller budgets and deal with the fallout of intense M&A activity that has left firms with even greater silos to overcome. There is also pressure to deal with the gaps in analytics that have emerged as a result of a greater focus on risk management and new compliance requirements from the regulatory community.

Rose reckons there is little to no quality in the data firms receive from vendors: “ISIN codes seem to require the least amount of data scrubbing but even they can be wrong. Once firms understand that there is no quality in this data then they can work together to tackle the problem.”

Controls and processes are needed to ensure data quality, and this includes evaluations of vendors’ data coverage and reliability. “In meeting this challenge, banks need to recognise that data, not IT, is the greatest asset,” says Rose. “Firms need to work more closely with the end client to identify their data needs and get to know their providers. Renegotiation of contracts is not enough; first, firms need to define how they work with these players and develop an organisational approach.” This needs to take into account how end clients are charged for data, as traditionally clients have not seen the cost of data management included in firms’ pricing structures.

This has been an issue in the recent negotiations with Bloomberg, which was planning to introduce a new SPA contract structure at the start of this year. The changes would have meant fund administrators would need to pay for end client usage of the data rather than just for their own usage. This would significantly increase the costs for fund administrators but would be difficult to pass on to end clients, according to Cossiom.

Data vendors are certainly in for a tough time again this year. Rose suggests that rather than KYC, the focus for this year should be “know your provider”, including their coverage, solution proposals and general market opinion on the vendor. “There are ratings for issuers, why not for providers?”
