A-Team Insight Blogs

Risk Management and Error Reduction are Key Drivers for Investment in Data Management, Say Aim and Interactive Data

Improving reference data quality in order to reduce errors is the primary reason why financial institutions have recently been engaging in reference data management projects, according to the results of Aim Software’s annual survey for 2008. Of the 340 respondents to the survey, 70% cited data quality as a key focus for these projects, with reducing costs a close second at 53%.

Bob Cumberbatch, business lines director at Interactive Data (Europe), which sponsored the Aim survey, reckons this desire for a reduction in errors and improvements in risk management will continue to be drivers in the current market climate. “The survey shows that financial institutions around the world are making efforts to improve and automate their reference data management. 42% of firms currently plan to invest in the automation of static data and nearly 30% of respondents intend to invest in the automation of pricing data (28%) and corporate actions (28%),” he elaborates.

“In only 55% of all cases is financial information already fed directly into the core banking application. The trend towards increased automation is reflected by a growing number of firms that are using golden copy management for managing reference data (43%),” Cumberbatch adds.

Martin Buchberger, head of marketing and sales at Aim, agrees that the strong trend towards adopting a golden copy approach across the various regions is something that will continue in the future. “In 2007 only 38% stated that they had a centrally managed database, but in 2008 this figure went up to 43%. It seems that financial institutions have recognised the advantages of a golden copy solution to manage their financial data. Especially now, in times of the credit crunch, when companies have to optimise their costs and infrastructure, they will have to rethink their long-term data management strategy and enhance their data management in order to make it future-proof.”
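For readers new to the term, a golden copy is a single, centrally maintained master record of reference data consolidated from multiple vendor feeds. As a rough illustration of the idea only, the Python sketch below merges per-feed views of an instrument using a simple precedence rule; the feed names, fields and precedence order are hypothetical and are not drawn from the survey or from any vendor’s product.

```python
# Illustrative sketch only: a toy "golden copy" consolidation of reference
# data. Feed names, fields and the precedence rule are hypothetical.
from typing import Dict, List

# Hypothetical vendor feeds, in order of precedence (earlier wins on conflict).
FEED_PRIORITY: List[str] = ["feed_a", "feed_b", "feed_c"]

def build_golden_copy(records: Dict[str, Dict[str, str]]) -> Dict[str, str]:
    """Merge per-feed views of one instrument into a single master record."""
    golden: Dict[str, str] = {}
    for feed in FEED_PRIORITY:
        for field, value in records.get(feed, {}).items():
            # setdefault keeps the first (highest-priority) value per field.
            golden.setdefault(field, value)
    return golden

feeds = {
    "feed_a": {"isin": "XS0000000001", "issuer": "Acme Corp"},
    "feed_b": {"isin": "XS0000000001", "issuer": "ACME Corporation", "coupon": "5.25"},
}
print(build_golden_copy(feeds))
# -> {'isin': 'XS0000000001', 'issuer': 'Acme Corp', 'coupon': '5.25'}
```

In practice, a golden copy platform typically layers validation, exception handling and audit trails on top of this kind of precedence logic, but consolidation into a single master record is the core of the approach.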

However, Buchberger found it surprising that so many respondents are still struggling with poor data quality, missing standards and bad data coverage. “The costs of data from feeds are not such an issue here. The fact that the study participants gave the same answers last year shows that something needs to be done about these issues in order to improve the usability of financial data for banks and financial institutions,” he says.

Moreover, at 43%, the number of firms using proprietary data models still outweighs those focusing on ISO 15022, although the latter remains the most widely used formal standard, says Cumberbatch. Only 22% currently use ISO 15022 and just 5% use MDDL, while 32% of responding firms said they had no data model in use at all. Standards adoption may therefore be an obstacle for firms, he suggests.

“The survey shows that financial institutions are tending to reduce human resources for data management and are undertaking increased efforts to minimise costs in this area. Most commonly, the firms interviewed employ only between one and five people concerned with data management (33% of respondents). In 18% of cases, six to 10 employees are involved, while only 12% of the respondents employ more than 21 people in their data management departments. The results indicate that, in general, efforts are being made to reduce the number of employees in data management,” says Cumberbatch.

This cost-cutting approach, combined with current market conditions, may affect these institutions’ technology spend, but that may not become clear until next year. As Aim received most of the completed surveys in September, the results may be rather more optimistic than if the survey had been conducted a few months later.

“When the economic crisis hit financial institutions all over the world in autumn, companies certainly focused on solving their most immediate problems and only started to plan a mid- and long-term strategy afterwards. That means that even the questionnaires we obtained after September did not yet reflect the current economic crisis, nor do they give a telling outlook on the future. Nevertheless, we do expect to get very interesting answers in our survey next year. By then companies might have rethought their overall future data management strategy,” Buchberger explains.

Cumberbatch agrees that next year’s results may demonstrate this turnaround: “There is an increase in firms’ desire to reduce costs (from 50% to 53% in 2008). I believe that risk management, regulation and compliance – as well as operational risk reduction – will continue to compete against internal projects for expenditure.”

He expects that risk management, regulation, compliance and operational risk reduction will continue to dominate the landscape in 2009. “The survey shows that key issues like Basel II, MiFID, higher quality data, outsourcing and the integration of specialised STP solutions are ranked high on the agenda of financial institutions and will probably continue to shape IT investments over the next few years,” he continues.

With regard to regulation, 60% of the 340 survey respondents confirmed that Basel II had a substantial influence on their IT investments. MiFID also had very high recognition in Western Europe (49%), while 23% of the respondents considered the US Sarbanes-Oxley Act (SOX) a driver for enhancing the degree of automation of their reference data management. UCITS III motivated 13% of the interviewees to invest in their back office environment.

Globally, the survey shows that 42% of firms are currently planning to increase the degree of automation for static data; 29% consider increasing the level of automation for pricing information, while another 28% plan to enhance the automation of corporate actions. Asked if they would increase the level of automation in reference data management, about 40% of firms responded negatively.

Buchberger reckons banks will have to optimise their overall data management in order to improve efficiency and reduce costs over the course of this year. “The findings from the study show that many financial institutions still have to catch up here,” he adds.

Cumberbatch, on the other hand, thinks that a variety of factors, including the continued volatility of the financial markets and the heightened regulatory environment, will drive demand in 2009 for a broad range of financial data, especially in the areas of valuation, reference data and low latency data. “There is clearly a higher degree of cost awareness for our clients in the current environment; however, with risk management and compliance becoming growing priorities, I believe that clients will not compromise in their need for essential data. Quality is important to our clients and I don’t see that changing,” he elaborates.

The vendor community will have to work hard to meet customers’ requirements in such a tough economic climate but there is still an appetite for data management, agree Cumberbatch and Buchberger. “We do see that many banks have recognised flaws in their data management systems and consider making specific improvements over the next few years. That means that vendors will be challenged to provide suitable and customised solutions,” concludes Buchberger.

The survey, which was conducted by Aim between May and December last year and drew 340 responses from financial institutions in 58 countries, can be downloaded from Aim’s website – www.aimsoftware.com.
