
Data Vendor Selection Is Key to Bringing Down Costs But Also Improving Quality, Says SGSS’s Rose


In the current cost-conscious environment, financial institutions need to push back on their vendors and ensure they have conducted a thorough evaluation of the data they receive from them, says Olivier Rose, head of projects and international market data management at French bank Société Générale Securities Services (SGSS). Rose, who has been involved in the recent negotiations with Bloomberg over service provider agreement (SPA) contracts, explains that SGSS currently has 12 data providers, all selected for their reliability rather than on cost alone.

The French-headquartered bank has significantly increased the number of data feeds it receives over the last eight years, says Rose, who has worked at SGSS for the last five years. The bank has also moved to a new system from SunGard to support its fund administration division. “We now have three primary data feeds, have increased our budget for data management fivefold and have gone from two providers to 12,” he elaborates.

SGSS is currently present in six markets (France, Luxembourg, Spain, Italy, Germany, and the UK and Ireland) and has had to increase its data management budget in recent years to accommodate growth in the number of securities it handles. Over the last six years, the firm has moved from dealing with 22,000 securities to around 100,000, and from one million data fields to 100 million, continues Rose.

“Getting the data is relatively cheap compared to the process of cleaning the data up in order to be able to use it,” he explains. “Budget declines have meant that no one wants to pay for that extra data or the extra processes required to improve data quality.”

The data departments of financial institutions such as SGSS are therefore being challenged to do more with smaller budgets, while dealing with the fallout of intense M&A activity that has left firms with even greater silos to overcome. There is also pressure to close the gaps in analytics that have emerged from a greater focus on risk management and new compliance requirements from the regulatory community.

Rose reckons there is little to no quality in the data firms receive from vendors: “ISIN codes seem to require the least amount of data scrubbing but even they can be wrong. Once firms understand that there is no quality in this data then they can work together to tackle the problem.”
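Rose’s ISIN example is easy to make concrete: an ISIN’s twelfth character is a check digit (ISO 6166) derived from the rest of the identifier via the Luhn algorithm, so even a standalone validator can flag corrupted codes before they propagate downstream. A minimal sketch in Python, offered as an illustration of the check rather than SGSS’s actual tooling:

import re

def isin_is_valid(isin: str) -> bool:
    """Validate an ISIN's structure and Luhn check digit (ISO 6166)."""
    # Two-letter country code, nine alphanumeric characters, one check digit.
    if not re.fullmatch(r"[A-Z]{2}[A-Z0-9]{9}[0-9]", isin):
        return False
    # Expand letters to their numeric values (A=10 ... Z=35); digits stay as-is.
    expanded = "".join(str(int(c, 36)) for c in isin)
    # Luhn: double every second digit from the right, subtracting 9 when the result exceeds 9.
    total = 0
    for i, ch in enumerate(reversed(expanded)):
        d = int(ch)
        if i % 2 == 1:
            d *= 2
            if d > 9:
                d -= 9
        total += d
    return total % 10 == 0

print(isin_is_valid("US0378331005"))  # True: a well-formed ISIN
print(isin_is_valid("US0378331006"))  # False: check digit does not match

Of course, a syntactically valid ISIN can still be attached to the wrong instrument or to stale reference data, which is the deeper quality problem Rose describes and why cross-checking vendor data remains necessary.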

Controls and processes are needed to ensure data quality, and this includes evaluating vendors’ data coverage and reliability. “In meeting this challenge, banks need to recognise that data, and not IT, is the greatest asset,” says Rose. “Firms need to work more closely with the end client to identify their data needs and get to know their providers. Renegotiation of contracts is not enough; first, firms need to define how they work with these players and develop an organisational approach.” This needs to take into account how end clients are charged for data, as clients have traditionally not seen the cost of data management reflected in firms’ pricing structures.

This has been an issue in the recent negotiations with Bloomberg, which had planned to introduce a new SPA contract structure at the start of this year. The changes would have meant fund administrators paying for end clients’ usage of the data rather than just their own. This would significantly increase costs for fund administrators but would be difficult to pass on to end clients, according to Cossiom, the French association of market data users.

Data vendors are certainly in for a tough time again this year. Rose suggests that rather than KYC (know your customer), the focus for this year should be “know your provider”: their coverage, the solutions they propose and general market opinion of the vendor. “There are ratings for issuers, why not for providers?”
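Rose’s closing suggestion lends itself to a simple prototype: a weighted scorecard that collapses coverage, reliability and responsiveness into a single provider rating. A hypothetical sketch follows; the vendor names, metrics and weights are illustrative assumptions, not SGSS’s actual criteria:

from dataclasses import dataclass

@dataclass
class Provider:
    # All metrics are hypothetical and scaled 0..1.
    name: str
    coverage: float        # share of required securities the feed covers
    reliability: float     # share of delivered fields passing validation checks
    responsiveness: float  # share of data queries resolved within the agreed SLA

def rating(p: Provider, weights=(0.4, 0.4, 0.2)) -> float:
    """Collapse the metrics into a single 0-100 rating (weights are illustrative)."""
    w_cov, w_rel, w_resp = weights
    return 100 * (w_cov * p.coverage + w_rel * p.reliability + w_resp * p.responsiveness)

# Rank two hypothetical providers, best first.
providers = [
    Provider("Vendor A", coverage=0.97, reliability=0.92, responsiveness=0.80),
    Provider("Vendor B", coverage=0.99, reliability=0.85, responsiveness=0.95),
]
for p in sorted(providers, key=rating, reverse=True):
    print(f"{p.name}: {rating(p):.1f}")

Even a crude rating like this forces the “know your provider” questions — what exactly is covered, how often it is wrong, how quickly errors are fixed — into a form that can be compared across vendors and tracked over time.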

