The Case for a Reference Data Management Utility
Money, time, accuracy and quality have all been casualties of the extensive duplication of effort that has characterised the management of reference data in the financial industry for as long as anyone can remember. The issue has been made all the more acute in recent years by the growing complexity of financial markets and the introduction of swathes of sometimes overlapping regulations.
A survey of senior data managers, data technologists and chief data officers at Tier 1 and Tier 2 investment banks, however, indicates that the passive acceptance of this burden has now reached an important turning point. The survey – conducted by A-Team Group on behalf of Euroclear and SmartStream in the first quarter of 2014 – suggests that key industry players, both collectively and individually, have started to develop new business models for sharing the reference data management function, enabling more efficient and cost-effective operations.
A-Team Group’s survey attempted to gauge the industry’s appetite for a ‘utility’ approach to managing reference data. The findings suggest not only that there is overwhelming interest in moving in this direction, but that some significant initiatives are already under way. If successful, these could not only slash the time and cost currently expended on these activities, but also markedly improve the accuracy and quality of the data consumed, through much more sophisticated automation and process management.