Reference data management is no longer perceived by senior management as sitting at the fringes of an organisation’s strategic priorities, Sean Taylor, director at Deutsche Wealth Management, told delegates at Marcus Evans’ recent Reference Data conference on 12 February.
Taylor, who was chairing the event, indicated that heightened regulatory scrutiny over the next 12 months would make data management even more important to firms.
“Data needs to be of a standard that doesn’t allow deep holes to be dug,” explained Taylor. “There will be winds of change throughout the financial markets over the next 12 months as the regulators begin turning over stones to find out what happened, why and how.”
He predicted that the US will see a regulatory crackdown on financial data by the end of the first quarter of 2009, and that a real requirement for golden copy will follow as a result.
“It will no longer be acceptable to have silos where the right hand does not know what the left is doing,” he added. “But data management will still have to fight to keep its share of the IT budget.”
Taylor warned that there will be “fighting at the money pit” for data management project funding this year and suggested that delegates learn from each other’s experiences to help them in their quest.
This perception of the increasing importance of data management to senior management was echoed throughout the conference. Claus Thorball, head of global market data at Saxo Bank, discussed his practical experience of achieving what he called “hands on” data quality and suggested that as long as the business case is sound, management will listen.
Thorball said that although the financial crisis has placed emphasis on getting data right, it has also put a lot of pressure on institutions to drastically cut overheads. “There is not a lot of fat on budgets any more and one of the main KPIs for projects seems to be to keep it on budget,” he said. “However, the crisis has also afforded us a unique opportunity to be able to review our data providers and re-evaluate and renegotiate the services they provide.”
He warned delegates not to try to “sell” the benefits of increasing data quality to their senior management; “money talks,” he recommended. In order to get backing in such a climate, project teams must explain the potential savings and the potential earnings offered by the improvement in data quality. “You need to decide on the baseline for the project before you begin it and document the creation of value at every step of the process. In order to be successful, you need the backing of the business and to engage the stakeholders in the process,” said Thorball.
The measurements to highlight fall into three key areas, according to Thorball: cost reduction, revenue growth and risk reduction. He also recommended providing clarity around the governance of the project with regard to its individual stages, including implementation and maintenance.
To this end, Saxo Bank reviewed its market data providers and decided to use “hands on quality assessment tools” as the basis for renegotiation. The bank had been using three main vendors for its exchange pricing data and, following a review of their provision, decided to designate one as a primary provider, another as a secondary provider and the last as a backup.
“With the conscious choice of a primary provider, we got a better level of service from them in terms of quality, service level agreements (SLAs) and a significant level of cost reduction. The secondary provider is also improving its service because it wants to become our primary provider,” he explained.
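Saxo’s internal setup is not described beyond this vendor hierarchy, but a minimal sketch of the primary/secondary/backup pattern, with hypothetical provider names and a simulated feed interface, might look something like this:

```python
class PriceProvider:
    """Minimal stand-in for a vendor pricing feed (hypothetical interface)."""
    def __init__(self, name, prices):
        self.name = name
        self._prices = prices  # instrument_id -> price, simulating the vendor feed

    def get_price(self, instrument_id):
        if instrument_id not in self._prices:
            raise LookupError(f"{self.name} has no price for {instrument_id}")
        return self._prices[instrument_id]


def fetch_price(instrument_id, providers):
    """Try providers in priority order: primary first, then secondary, then backup."""
    errors = []
    for provider in providers:
        try:
            return provider.get_price(instrument_id)
        except LookupError as exc:
            errors.append(str(exc))
    raise RuntimeError("all providers failed: " + "; ".join(errors))


# Priority order reflects the conscious choice of a primary provider.
providers = [
    PriceProvider("primary", {"XS0123456789": 101.25}),
    PriceProvider("secondary", {"XS0123456789": 101.20, "US0378331005": 172.40}),
    PriceProvider("backup", {"US0378331005": 172.35}),
]

print(fetch_price("US0378331005", providers))  # falls through to the secondary provider
```

The priority ordering is the point Thorball makes: the primary carries the service commitment, while the secondary and backup only step in when it cannot supply a price.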
Deutsche’s Taylor agreed that there has been a “flight to quality” with regard to the adoption of a qualitative approach to data within the industry. He recommended that delegates apply the lessons learnt by Saxo in its renegotiation of vendor contracts more generally, in order to get more out of their own vendors for less.
Maurizio Garro, head of the group pricing division of UniCredit Group, explained his bank’s experience of normalising group-wide pricing practices. He indicated that Basel II and International Accounting Standards (IAS) have had a significant impact on the pricing requirements of financial institutions. “IAS has meant we need sophisticated estimations and a level of transparency around these quickly,” he said.
UniCredit faced a significant challenge in dealing with the many inputs from its various pricing sources and took the decision to create a central library for this data. “The Bulldozer system was the end result of four years of work and it provides all end users within the bank with the opportunity to simulate prices,” Garro explained. “The main challenge was to deliver all these outputs in the time required.”
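Bulldozer’s internals are not described here, so the following is only a rough sketch of the general idea of a central pricing library: registered data inputs feed registered pricing models, and end users can override inputs to simulate prices. All names and the simple bond model are hypothetical.

```python
class PricingLibrary:
    def __init__(self):
        self._sources = {}   # input name -> provider callable
        self._models = {}    # instrument type -> pricing function

    def register_source(self, name, provider):
        """Register a market data input, e.g. a rate curve or FX feed."""
        self._sources[name] = provider

    def register_model(self, instrument_type, model):
        """Register a pricing function that consumes the collected inputs."""
        self._models[instrument_type] = model

    def simulate(self, instrument_type, overrides=None):
        """Price an instrument, optionally overriding inputs for a what-if run."""
        inputs = {name: source() for name, source in self._sources.items()}
        inputs.update(overrides or {})
        return self._models[instrument_type](inputs)


lib = PricingLibrary()
lib.register_source("discount_rate", lambda: 0.04)
lib.register_model("zero_coupon_bond",
                   lambda inputs: 100 / (1 + inputs["discount_rate"]) ** 5)

print(lib.simulate("zero_coupon_bond"))                                      # base case
print(lib.simulate("zero_coupon_bond", overrides={"discount_rate": 0.05}))   # stressed case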
Timeliness has become a key factor in data management as the once distinct lines between market data and static reference data become blurred. This was another recurring theme throughout the presentations, as speakers described the new pressures on their data management departments. Being able to pinpoint data inaccuracies and solve them in real time is becoming a priority in such risk-averse times, speakers indicated.
Chris Johnson, head of data management for Institutional Fund Services Europe at HSBC Securities Services, gave the third party administrator viewpoint on the data management challenge. He agreed that since the fall of Lehman last year, pricing has become a key challenge for the industry. “Pricing has become my life,” he joked.
However, rather than focus on the issues surrounding pricing alone, Johnson discussed the challenges of practical implementation of STP within the sphere of reference data. He described the complexities surrounding the various segments of the trade lifecycle and highlighted the potential cost and risk involved in data bottlenecks.
He identified the execution cycle as the riskiest area with regard to data inaccuracies: it requires an “extensive amount of data” and, when things go wrong, it becomes “very expensive”. Issues such as incorrect instrument data, identification of the wrong settlement location, incorrectly structured funds, inaccuracies in FX data and incorrect calendars are just a few of the horrors awaiting trade data at this point, he explained.
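By way of illustration, a pre-execution check for the categories of error Johnson lists might look something like the sketch below; the record fields, the list of settlement locations and the rules themselves are hypothetical, not HSBC’s actual controls.

```python
import datetime

VALID_SETTLEMENT_LOCATIONS = {"DTC", "Euroclear", "Clearstream", "CREST"}

def validate_trade_reference_data(trade):
    """Return a list of data problems that are cheap to catch now, expensive later."""
    issues = []
    if not trade.get("isin") or len(trade["isin"]) != 12:
        issues.append("missing or malformed instrument identifier (ISIN)")
    if trade.get("settlement_location") not in VALID_SETTLEMENT_LOCATIONS:
        issues.append("unrecognised settlement location")
    if trade.get("fx_rate") is not None and trade["fx_rate"] <= 0:
        issues.append("implausible FX rate")
    settle = trade.get("settlement_date")
    if settle and settle.weekday() >= 5:
        issues.append("settlement date falls on a weekend (calendar error)")
    return issues


trade = {
    "isin": "US0378331005",
    "settlement_location": "DTC",
    "fx_rate": 1.27,
    "settlement_date": datetime.date(2009, 2, 16),
}
print(validate_trade_reference_data(trade))  # an empty list means no issues flagged
```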
Johnson’s main recommendation to delegates was to explain to their downstream users the perils of changing the data. “The perception within firms is that it is easier to change the data than the processes, but if you do this, it will result in a very twisted securities master file. Tinker with the golden copy at your peril,” he warned.
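Johnson’s warning can be made concrete with a small sketch: if the golden copy only accepts sourced, approved and logged amendments, ad hoc “data fixes” are pushed back into the process where they belong. The class and field names below are hypothetical.

```python
class GoldenCopy:
    def __init__(self, records):
        self._records = dict(records)
        self._audit_log = []

    def get(self, security_id):
        return self._records[security_id]

    def propose_change(self, security_id, field, new_value, source, approver=None):
        """The only supported way to amend a record: sourced, approved and logged."""
        if approver is None:
            raise PermissionError(
                "Ad hoc edits are not allowed; fix the upstream process "
                "or obtain data governance approval."
            )
        record = dict(self._records[security_id])
        record[field] = new_value
        self._records[security_id] = record
        self._audit_log.append((security_id, field, new_value, source, approver))


master = GoldenCopy({"US0378331005": {"name": "Apple Inc", "currency": "USD"}})
master.propose_change("US0378331005", "currency", "USD",
                      source="vendor feed", approver="data governance")
```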
Overall, speakers were positive about the future of data management in the current environment and had practical advice for those just beginning to dip a toe into the data management project pool. The main advice centred on improving communication with both senior management and downstream users about the real, tangible benefits of these projects right from the start.