About A-Team Marketing Services
The knowledge platform for the financial technology industry

A-Team Insight Blogs

Data Vendor Selection is Key to Bringing Down Costs But Also Improving Quality, Says SGSS’s Rose

In the current cost-conscious environment, financial institutions need to push back on their vendors and make sure they have conducted a thorough evaluation of the data they are receiving from them, says Olivier Rose, head of projects and international market data management at French bank Société Générale Securities Services (SGSS). Rose, who has been involved in the recent negotiations with Bloomberg regarding service provider agreement (SPA) contracts, explains that SGSS currently has 12 data providers, all of which were selected for their reliability rather than on a cost basis alone.

The French-headquartered bank has significantly increased the number of data feeds it receives over the last eight years, says Rose, who has worked for SGSS for the last five years. The bank has also moved to a new system from SunGard to support its fund administration division. “We now have three primary data feeds, have increased our budget for data management by five times and have gone from two providers to 12,” he elaborates.

SGSS is currently present in six markets (France, Luxembourg, Spain, Italy, Germany, and the UK and Ireland) and has been forced to increase its data management budget over recent years in order to accommodate the increase in the number of securities with which it deals. Over the last six years, the firm has moved from dealing with 22,000 securities to around 100,000, and from one million data fields to 100 million, continues Rose.

“Getting the data is relatively cheap compared to the process of cleaning the data up in order to be able to use it,” he explains. “Budget declines have meant that no one wants to pay for that extra data or the extra processes required to improve data quality.”

The data departments of financial institutions such as SGSS are therefore being challenged to do more with smaller budgets and deal with the fallout of intense M&A activity that has left firms with even greater silos to overcome. There is also pressure to deal with the gaps in analytics that have emerged as a result of a greater focus on risk management and new compliance requirements from the regulatory community.

Rose reckons there is little to no quality in the data firms receive from vendors: “ISIN codes seem to require the least amount of data scrubbing but even they can be wrong. Once firms understand that there is no quality in this data then they can work together to tackle the problem.”

Controls and processes are needed to ensure data quality, and this includes evaluations of vendors’ data coverage and reliability. “In meeting this challenge, banks need to recognise that data, and not IT, is the greatest asset,” says Rose. “Firms need to work more closely with the end client to identify their data needs and get to know their providers. Renegotiation of contracts is not enough; first, firms need to define how they work with these players and develop an organisational approach.” This needs to take into account how end clients are charged for data, as traditionally clients have not seen the cost of data management included in firms’ pricing structures.

This has been an issue in the recent negotiations with Bloomberg, which was planning to introduce a new SPA contract structure at the start of this year. The changes would have meant fund administrators would need to pay for end client usage of the data rather than just for their own usage. This would significantly increase the costs for fund administrators but would be difficult to pass on to end clients, according to Cossiom.

Data vendors are certainly in for a tough time again this year. Rose suggests that rather than KYC, the focus for this year should be “know your provider”, including the provider’s coverage, its solution proposals and general market opinion on the vendor. “There are ratings for issuers, why not for providers?”
