The leading knowledge platform for the financial technology industry

A-Team Insight Blogs

Strategic Evolution of Reference Data Well Under Way, Data Management Attracting Board Level Attention

Proof of the long-predicted evolution of reference data management into a strategic concern for financial institutions comes this month in a new A-Team research paper commissioned by SunGard Data Management Solutions. The research, the result of a Buyer Persona Study carried out by A-Team Group during Spring 2007, finds that the strategic evolution of reference data is well under way under the jurisdiction of top managers. More than half of respondents to the survey – 51.6 per cent – said their data management strategies and systems purchases had been signed off by their CIOs, CTOs, COOs or CEOs. A further 16.1 per cent said their boards of directors had been involved in the approval process.

A-Team foresaw this change in the profile of data management as early as November 2004 when it concluded in an earlier study that the need for timely and accurate reference data was no longer being solely driven by the requirement to achieve operational efficiencies and STP. The “increasingly stringent regulatory environment is now elevating the issue of reference data management” to the top of senior managers’ agendas, A-Team wrote then, “driving through real organisational changes and resulting in a more strategic approach to managing data through the enterprise”. It added: “This could signal a paradigm shift where central reference data becomes a business in its own right for the larger firms, as well as a hub for growth, risk management, compliance and many other functions.”

There is ample evidence in this latest study of the accuracy of this prediction. A number of respondents – representatives of banks, broker/dealers and asset managers in the UK, Europe and North America – said their COOs had “laid down the law” on making reference data consistent across departments, to support proactive measures to better manage risk. One chief data officer at a global bank said: “We’ve spent the last six months eliminating departmental data management silos. Now that’s done, we can take a top-down, strategic look at exposure.”

Factors triggering C-level involvement in data management decision-making included spend, a cross-departmental approach, business development or acquisition, concern about regulatory compliance and proactive risk management. Analysis of the factors driving the re-evaluation of data management processes within firms showed that regulation is a strong driver – with 79.3 per cent of respondents ranking it in the top two pain categories and 96.5 per cent ranking it as middle or higher pain.

Reflecting the increasing strategic focus on data management, 87 per cent of those interviewed said they planned to extend the use of their data management platform: of the 13 per cent who said they did not, most had just completed major development projects and were satisfied with the results. Asked how spend on data management had changed during the past two years, a whopping 80 per cent said it had increased – another indicator of the strategic importance of data management projects. Fifteen per cent reported it had remained the same, with only five per cent saying it had decreased. The regulatory driver is evident again in the analysis of future spending plans related to data management: 77 per cent of respondents expect an increase in spending on data management to support regulatory requirements.

The study provides some interesting insights into the evolution of financial institutions’ attitudes towards making use of externally managed services in the data management area. Asked whether they would consider deploying an ASP or externally managed service to fulfil a range of data-related functions, respondents indicated a clear willingness to do so for certain activities – 60 per cent said they were considering one or the other for data acquisition, 57 per cent for mapping and consolidation, and 66 per cent for data cleansing. By contrast, the majority of firms said the functions of data enrichment (61 per cent), handling permissions (75 per cent) and distributing data to downstream systems (85 per cent) would need to be maintained in-house. In short, firms are considering externally managed services for the less proprietary, less value-added areas of activity.

Similar sentiments were revealed in respondents’ answers to the question: “At what stage in your data management operation do you believe an external data management vendor can provide value?” Eighty-three per cent suggested an external vendor could provide great or some value at the data acquisition stage; 87 per cent said vendors could provide value for mapping and consolidation; and 90 per cent thought vendors could provide great or some value for data cleansing. For data enrichment, while 64 per cent of respondents thought external data management vendors could provide value, 11 per cent thought not and 25 per cent said they would prefer to keep the activity in-house. For handling permissions, 48 per cent thought vendors could add value, 21 per cent thought they could not and 31 per cent said it was a function to keep in-house. Almost half said vendors could add value when it comes to distributing data to downstream systems, but 10 per cent said not, and 41 per cent said it was a function to be handled in-house.

There is clearly a willingness among financial institutions to tap into external data management vendors where it palpably makes life easier. As one head of reference data at a large European bank commented on the prospect of data management vendors handling data source interface changes: “Oh good, that’s one thing I don’t have to worry about.”
For more information on this research, go to: www.a-teamgroup.com/research.
