In the current cost-conscious environment, financial institutions need to push back on their vendors and make sure they have conducted a thorough evaluation of the data they are receiving from them, says Olivier Rose, head of projects and international market data management at French bank Société Générale Securities Services (SGSS). Rose, who has been involved in the recent negotiations with Bloomberg regarding service provider agreement (SPA) contracts, explains that SGSS currently has 12 data providers, all of which were selected for their reliability rather than on cost alone.
The French-headquartered bank has significantly increased the number of data feeds it receives over the last eight years, says Rose, who has worked for SGSS for the last five years. The bank has also moved to a new system from SunGard to support its fund administration division. “We now have three primary data feeds, have increased our budget for data management fivefold and have gone from two providers to 12,” he elaborates.
SGSS is currently present in six countries (France, Luxembourg, Spain, Italy, Germany and UK and Ireland) and has been forced to increase its data management budget over recent years in order to accommodate the increase in the number of securities with which it deals. Over the last six years, the firm has moved from dealing with 22,000 securities to around 100,000 and from dealing with one million data fields to 100 million, continues Rose.
“Getting the data is relatively cheap compared to the process of cleaning the data up in order to be able to use it,” he explains. “Budget declines have meant that no one wants to pay for that extra data or the extra processes required to improve data quality.”
The data departments of financial institutions such as SGSS are therefore being challenged to do more with smaller budgets and deal with the fallout of intense M&A activity that has left firms with even greater silos to overcome. There is also pressure to deal with the gaps in analytics that have emerged as a result of a greater focus on risk management and new compliance requirements from the regulatory community.
Rose reckons there is little to no quality in the data firms receive from vendors: “ISIN codes seem to require the least amount of data scrubbing but even they can be wrong. Once firms understand that there is no quality in this data then they can work together to tackle the problem.”
Controls and processes are needed to ensure data quality, and this includes evaluations of vendors’ data coverage and reliability. “In meeting this challenge, banks need to recognise that data, not IT, is the greatest asset,” says Rose. “Firms need to work more closely with the end client to identify their data needs and get to know their providers. Renegotiation of contracts is not enough; firms first need to define how they work with these players and develop an organisational approach.” This needs to take into account how end clients are charged for data, as traditionally clients have not seen the cost of data management included in firms’ pricing structures.
This has been an issue in the recent negotiations with Bloomberg, which was planning to introduce a new SPA contract structure at the start of this year. The changes would have meant fund administrators would need to pay for end client usage of the data rather than just for their own usage. This would significantly increase the costs for fund administrators but would be difficult to pass on to end clients, according to Cossiom.
Data vendors are certainly in for a tough time again this year. Rose suggests that rather than know your customer (KYC), the focus for this year should be “know your provider”, including their coverage, solution proposals and general market opinion on the vendor. “There are ratings for issuers, why not for providers?”