Alongside the discussions around data standards and utilities, keeping a closer eye on vendors, finding data champions and shaping internal governance strategies were recurring themes throughout this year's FIMA conference in London. Speakers challenged delegates to ask more of their third-party vendors and to identify data champions within their own institutions to push through change.
In order to better measure vendor performance and afford better control over vendor relationships, data managers should tailor their service level agreements (SLAs) to individual vendors and products, said Peter Largier, global head of reference data analysis at Credit Suisse, during his speech at FIMA 2009’s focus day. This process should begin with defining how critical their services are to your business and end with agreement on a detailed SLA, he explained.
Credit Suisse determines the criticality of vendor service provision by examining the potential impact of a failure within the vendor’s offering. “If the impact of a failure means that your core business is stopped as a result, then this is a high level risk and should be categorised as a level one vendor service,” said Largier. The bank uses three categories to determine this business impact and therefore the level of the vendor, with three being the least critical.
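The three-tier scheme Largier describes could be sketched as a simple classification function. This is an illustrative reading of his remarks, not Credit Suisse's actual model; the criteria names are assumptions.

```python
# Illustrative sketch of a three-tier vendor criticality scheme along
# the lines Largier describes: level one means a failure stops core
# business; level three is the least critical. Criteria are assumed.
def classify_vendor_service(halts_core_business: bool,
                            degrades_key_process: bool) -> int:
    """Return a criticality level from 1 (most critical) to 3."""
    if halts_core_business:
        return 1  # core business stops on failure: level one vendor service
    if degrades_key_process:
        return 2  # material but recoverable impact
    return 3      # least critical

# A pricing feed whose outage halts trading would sit at level one
print(classify_vendor_service(halts_core_business=True,
                              degrades_key_process=False))
```

In practice a firm would score impact across several dimensions (revenue, regulatory, client-facing) rather than two booleans, but the principle of bucketing by worst-case failure impact is the same.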
The endeavour should also include a review of what other vendors in the space provide and how complex a process it would be to move from one vendor to another, he added. “This depends on how integrated the system is into your infrastructure and has a significant impact on the cost of replacing a system with another,” he contended.
For new vendor products, all this work should be conducted before negotiations are begun, according to Largier. “Before negotiating you need to look at all aspects of the relationship including the costs of integration and implementation, as well as licensing.”
In terms of existing relationships, Largier suggested adding in an SLA at the time of renewal in order to set in place key performance indicators (KPIs) by which to judge the vendor’s service provision. “In the first year of the SLA there needs to be clear ownership of this relationship management within your firm and you need to nominate an individual or a team to measure performance against KPIs,” he said.
Largier did concede that tying a vendor down and holding them to their promises is harder than it first seems. "SLAs only work if you define the penalties for non-compliance first," he added. He therefore suggested using penalties such as retaining a set percentage of the contract value if vendors renege on their promises, but did not, when questioned by an audience member, indicate whether Credit Suisse often does this in practice.
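The penalty mechanism Largier suggests amounts to simple arithmetic on the contract value. A minimal sketch, assuming a hypothetical per-missed-KPI retention rate (the source gives no actual figures):

```python
# Hypothetical SLA penalty calculation of the kind Largier suggests:
# retain a set percentage of the contract value for non-compliance.
# The per-KPI retention rate and KPI counts below are assumptions.
def sla_retention(contract_value: float,
                  kpis_met: int,
                  kpis_total: int,
                  retention_pct: float = 0.05) -> float:
    """Amount withheld: retention_pct of contract value per missed KPI."""
    missed = kpis_total - kpis_met
    return contract_value * retention_pct * missed

# e.g. a 1,000,000 contract where 2 of 10 KPIs were missed, at 5% each
print(sla_retention(1_000_000, kpis_met=8, kpis_total=10))  # 100000.0
```

Whether the retention is flat or scales with the number of missed KPIs is a contractual choice; the point is that the formula must be agreed before the SLA is signed, not negotiated after a failure.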
Ian Webster, global head of data management for UBS Global Asset Management, reprised this theme the next day in terms of internal clients and said that financial institutions are not challenging their existing data structures enough to improve their processes. “You need to get underneath what your client requirements are in order to develop a proper set of KPIs against which to measure your successes,” he elaborated.
“Firms need to be providing the right quality of data at the right time to their clients, both internal and external,” said Webster. “If your data quality dashboards are always green, you are measuring the wrong things; there is always room for improvement.”
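Webster's point about all-green dashboards implies that quality thresholds should be tightened until they actually discriminate. A minimal sketch of that idea, with hypothetical metric values and thresholds:

```python
# Sketch of Webster's "always green" warning: if a data quality
# threshold never trips, it is measuring the wrong thing and should
# be tightened. All scores and thresholds here are hypothetical.
def dashboard_status(score: float, threshold: float) -> str:
    """Mark a daily quality score green or amber against a threshold."""
    return "green" if score >= threshold else "amber"

daily_scores = [0.991, 0.987, 0.995, 0.990]  # e.g. completeness ratios
threshold = 0.95

if all(dashboard_status(s, threshold) == "green" for s in daily_scores):
    # Always green: the bar is too low to surface any variation,
    # so move it towards the best level actually observed.
    print(f"tighten threshold towards {max(daily_scores):.3f}")
```

The mechanical check is trivial; the management point is that a threshold which never turns amber gives no signal about where there is room for improvement.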
He challenged delegates to set new time-specific thresholds for their systems and to improve their efficiency in order to keep pace with the changing industry landscape. Webster also asked them to view operational risk as a friend within their organisation, one that can help them build a business case for a data management project.
Reprising the theme from the focus day, Webster also suggested that firms should set up very specific SLAs to tie down their vendors. “This should include client specific SLAs and application specific SLAs in order to ensure the process is completely transparent so that downstream users are not completing duplicative checking procedures,” he added.
The main challenge in increasing the coverage and content of a data management system lies not in the technical model but in the operational model, Webster said. “Also remember that not all data is created equally and some is more important than others to a firm’s operations,” he added.
He concluded by urging delegates to look beyond the areas that have historically received attention and to improve operational processes with a view to lowering costs and raising quality and coverage.
John White, global head of market and vended data services for State Street Global Advisors (SSGA), explained to delegates at FIMA 2009 how his firm has handled a review and improvement project for its market data provision. Key aspects to this process are examining the cost, quality and coverage of current provision and implementing structural and policy changes to fill in the gaps, he elaborated.
White contended that interest in reference data projects is rising across the industry, and that SSGA in particular has focused on developing a new structure and policy to win support for its own project. “The focus of the project has been on understanding the flow of data across our institution and developing a gap analysis for our own proprietary data and third-party provided data,” said White.
He outlined a number of challenges the market currently faces in this space, such as managing vendor relationships: “Do you have enough licences in place and do they cover the right things?” Cost containment cropped up yet again in connection with this kind of review, and White described SSGA’s own challenge to keep costs down. The firm carried out a full review of its existing data structures and looked for opportunities to rationalise its current data feeds.
As well as cost, people are a key consideration to take into account: “Do your key data people have enough expertise? Do you have executive support for your programmes?”
SLAs need to be tightened, said White on the recurring theme for this year’s event. “In terms of our own data structure, we noticed that a lot of data decisions were being made but not necessarily by the experts required to make them,” he said. “We decided to set in place a specific structure and policy to tackle this issue.”
The bank has coordinated these data functions under one umbrella and garnered executive sponsorship. The support of the risk management and compliance functions has been particularly helpful, he added.
White’s lessons learned from the process included getting executives on board early in the project, gaining a cost commitment at the outset and rolling out a holistic programme.
Turning to the other recurring theme, Sally Hinds, global head of EDM for HSBC, told FIMA delegates about the benefits of having a data champion within senior management to help a data management system overhaul get underway. When the investment bank launched its EDM effort in 2007, the team had the benefit of what Hinds described as a “data Rambo” to get it off to a strong start.
The original sponsor of the HSBC project was a very senior member of the management team who reported directly to the board, said Hinds, which helped the project get off to a flying start. Hinds, who is in charge of static data maintenance for the investment banking division of HSBC and market data for the entire HSBC group, elaborated on the bank’s reference data journey over the last three years. The bank began the project by tackling the structured credit area and then moved on to set up two offshore centres in Bangalore to support London and New York. In 2008 the team was also required to open a further centre, in Kuala Lumpur. “This was a real challenge for us as we had to set up the operating centre with no additional funding,” she explained.
The bank is now rationalising its various EDM teams as part of its One HSBC programme to bring together support for all of the various divisions of the bank into one central team, said Hinds. However, this has caused something of a culture clash between the teams, which means determining a way to work together is more challenging, she said.
Hinds and her fellow panellists debated the need for better communication between data management teams and with the end users of the data. Firms should not “short change” themselves by underestimating the power of talking to their stakeholders, said Nick Skinner, vice president of global data management for Northern Trust. “By getting them on board and understanding their requirements and vice versa, there is much more chance of both buy in and success of a project,” he said.
Skinner is in charge of the strategic direction of his bank’s data management efforts in the pricing and asset data space and is charged with campaigning to get funding from senior management for these projects. Northern Trust, which evolved from a pure custody bank into areas such as fund administration, has strategic legacy systems in place but these are not able to support all the newer aspects of the firm’s business, he explained. “Because we have diversified, we need to adapt our data model to support different end users, but we also need to retain the efficiencies of our current model,” he continued.
At HSBC, the front office is directly charged for the data management project costs, said Hinds. Although risk and compliance are stakeholders, data management is not a cost centre for them due to HSBC’s policy of not charging service functions for other service functions. “Communicating with the front office is easy for my team because of our dual role in the market data space as well as the reference data area,” she added.
The need for internal SLAs was, yet again, highlighted by all panellists in order to keep end users engaged and happy in the data management process.