Unusually for a financial services conference, this year’s FIMA focused on the negative aspects of capitalism; the topic of the seven deadly sins kept cropping up throughout the sessions – including greed, sloth and gluttony (well, at the buffet table at the very least…).
However, these aspects of the industry were not seen in a wholly negative light, at least from a data management budgeting perspective. For example, Keith Hale, executive vice president and co-founder of Netik, opened the second day of FIMA 2008 by quoting Warren Buffett and noting that greed and fear have pushed data up the priority list at financial institutions. “Financial markets are driven by either greed or fear, according to Warren Buffett, and it seems that recent times have flipped the driver from greed to fear. This fear is good for data management projects as they are more likely to get funding due to regulatory pressure,” Hale told delegates.
In the current economic climate, the industry will see a greater focus on reducing operational costs, on automation and on getting reference data right, Hale continued. “Risk management and regulatory requirements are adding pressure on institutions to deal with their data issues. Market consolidation at the pace it is happening today is also a contributing factor, as firms need to reduce duplication of processes to reduce costs,” he explained. Greed and sloth cropped up again the next day, in the session given by Graeme Austin, product management director at Xtrakter.
He said these two deadly sins are what have got the industry into the position it is in today, and that as a result firms are looking for a more integrated approach to market and reference data to get a better handle on risk. The only way to tackle this risk and fear is to arrive at a version of the “truth” via better data management, Peter Serenita, chief data officer at JPMorgan Securities Services, told delegates. Serenita philosophised about the meaning of the word ‘truth’ in a data management context and concluded that although it is not realistic to expect 100% accuracy of data, 100% of the time, this should not stop data management teams from striving for a high level of data quality across an institution.
The success of a data management project should not be measured by whether it achieves an idealistic version of the truth; it should be judged on data integration and downstream impacts, he explained. This should be approached with a degree of pragmatism, according to Serenita, which essentially means transforming data into a format that can be integrated into downstream systems. “Legacy systems make up around 90% of financial institutions’ back offices and this is likely to continue indefinitely, as today’s new systems are tomorrow’s legacy systems,” he added.
JPMorgan maintains a golden copy, and each downstream system maintains its own interpretation of that data, he told delegates. The central data management team produces and owns the golden copy, while each downstream system takes it and translates it for its own purposes. Serenita concluded by urging delegates to view technology as an enabler rather than the end game: “The most important thing is to understand your data and how it is used.”
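To make the golden copy pattern concrete, here is a minimal sketch in Python. The class and field names (GoldenCopyStore, SettlementSystem, RiskSystem, SecurityRecord) are invented for illustration and assume a simple in-memory store; this is not a description of JPMorgan's actual systems, only of the general idea of one central validated record being translated by each downstream consumer.

```python
# Illustrative sketch: a central "golden copy" store plus two downstream
# consumers, each translating the central record into its own local format.
# All names are hypothetical and chosen purely for illustration.

from dataclasses import dataclass


@dataclass(frozen=True)
class SecurityRecord:
    isin: str
    issuer: str
    currency: str


class GoldenCopyStore:
    """The central data management team's single validated copy."""

    def __init__(self) -> None:
        self._records: dict[str, SecurityRecord] = {}

    def publish(self, record: SecurityRecord) -> None:
        # Overwrites any previous version: there is only one golden copy per ISIN.
        self._records[record.isin] = record

    def get(self, isin: str) -> SecurityRecord:
        return self._records[isin]


class SettlementSystem:
    """Downstream consumer: maps the golden copy into its own schema."""

    def load(self, record: SecurityRecord) -> dict:
        return {"SEC_ID": record.isin, "CCY": record.currency}


class RiskSystem:
    """Another downstream consumer with a different local interpretation."""

    def load(self, record: SecurityRecord) -> dict:
        return {"instrument": record.isin, "issuer_name": record.issuer.upper()}


if __name__ == "__main__":
    store = GoldenCopyStore()
    store.publish(SecurityRecord(isin="XS0000000000", issuer="Example Issuer", currency="EUR"))

    golden = store.get("XS0000000000")
    print(SettlementSystem().load(golden))  # {'SEC_ID': 'XS0000000000', 'CCY': 'EUR'}
    print(RiskSystem().load(golden))        # {'instrument': 'XS0000000000', 'issuer_name': 'EXAMPLE ISSUER'}
```

The point of the sketch is the division of responsibility Serenita described: the store owns the data, while translation into downstream formats happens at the edges.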
It seems that Reference Data Review readers are also quite keen on the centralisation of reference data, but on a wider scale. According to our recent poll, 74% of readers are keen for the introduction of a centralised utility for reference data, although only 22% think this can be achieved next year. This should be good news for DClear, the wholly owned subsidiary of the investment arm of the Dubai International Financial Centre (DIFC), which released plans for its reference data utility in May this year. We’ll certainly be monitoring its progress over 2009 very carefully to see whether the 22% are correct in their assumptions.