To ensure that downstream departments remain in sync with central data management projects, firms must adopt an incremental, practical approach to data integration, agreed FIMA’s panel on the practicalities of managing a golden copy repository. Peter Giordano, executive director of institutional equities at Oppenheimer & Co, recommended that firms roll out such projects piece by piece rather than take a big bang approach to data management.
Chris Johnson, head of data management, Institutional Fund Services, Europe, for HSBC Securities Services, told delegates to focus on meeting the needs of “downstream consumers” by getting users around the table to discuss their data needs. “You also need to allow for differences across client groups,” he said.
Julia Sutton, global head of customer accounts operations at Citi, agreed that communication with user groups is key: “You need to let them vent and tell you the problems they are experiencing as well as what they want from their data in the future.”
Giordano also contended that firms should invest time before a project is launched in anticipating user groups’ future requirements of the data system. He warned delegates that those requirements may be altered by events such as mergers and acquisitions.
HSBC’s Johnson said that standardisation can only go so far, as there remain complex downstream issues that cannot be standardised. “You should attempt to tackle as much as you can but understand that this cannot be done for every single area,” he said.
Sutton added that the data management endeavour is similar to “constant firefighting” and teams need to “keep digging away at it” to succeed.