The knowledge platform for the financial technology industry

A-Team Insight Blogs

Manual Processes Persist in Data Management for Nearly Half of AIM Software’s Survey Respondents


Nearly half (48%) of the respondents to AIM Software’s seventh annual reference data survey still use terminals for data look-up and manual processing, in spite of the leaps and bounds made in sophisticated data management technology over the last few years. The survey, which was co-sponsored by SIX Telekurs, was conducted between April and October this year and involved 371 financial institutions from 51 countries. It highlights the fact that data management is still being held back by manual processes.

These results indicate that the industry’s level of data management maturity is perhaps not as far advanced as some would hope. The drivers for spending on data management technology, however, are fairly well defined, although they have changed over the last 12 months. Last year’s survey saw respondents highlight the high cost involved in the processing of reference data as the primary driver for spending on data management technology, but this year the focus is on a reduction in errors (76% of respondents).

The increased focus on error reduction (up 23 percentage points from last year’s 53% of respondents) is perhaps indicative of the intense public and regulatory focus on data errors; just look at the fines meted out by the UK Financial Services Authority (FSA) for transaction reporting failures over the course of this year for proof. Firms are therefore keen to take away the risk, including reputational risk, involved in the management of their reference data. Given that 48% are still using manual processes (compared to last year’s 37% – perhaps more have come out of the woodwork), this focus on error reduction is only natural.

Obviously, cost cutting remains an important driver this year (at 66% of respondents, up from last year’s 58%), as does meeting risk management requirements (53%, down five percentage points from last year’s 58%) and compliance with regulatory requirements (38%, the same as last year). Risk management may have scored higher last year, but error reduction is also about taking risk out of the equation and better meeting regulatory reporting requirements by having more faith in the underlying data. Perhaps this is the easiest way to prove the ROI of a data management project in such cost conscious times?

It seems that such an argument may be proving successful, given that 49% of respondents indicated that they would be investing in the automation of their static data. Furthermore, for the second year in a row, the area of corporate actions has been highlighted as an important one, with 37% of respondents planning to invest in automating this particularly thorny area of reference data.

In terms of the priority breakdowns overall for reference data, the survey indicates: “The management of basic data (71%), the management of price data (65%) as well as the opening of instruments (59%) were named to be the major objectives of reference data management, followed by the management of corporate actions (56%) and the reconciliation of several data sources (49%).”

Unsurprisingly, the main challenge related to reference data management, cited by the majority of respondents, was high cost. This has risen in importance over the last couple of years – just compare this year’s 56% of respondents that cited it as a problem to last year’s 49%. It is also borne out by the lobbying activity going on across the industry to bring down the costs of data vendor feeds. Other issues cited included bad data coverage (at 44%), delays in data delivery and missing standards.

On the subject of standardisation, very little seems to have changed from last year. The penetration of ISO 15022 in the market remains disappointing, with only 17% of respondents using the messaging format. However, there has been some take-up of its successor, ISO 20022, with 8% of respondents indicating that they are using the standard, even though it is still early days.

The golden copy concept, however, has witnessed much more success in the market: “Whereas in 2007 only 38% of all respondents stated that they had a golden copy, in 2010 already 52% feed reference data into a centrally managed database. Despite the cost containment policy in many enterprises following the economic crisis, firms are aware of their need of high quality reference data to enhance operational efficiency and help support their growing risk management and compliance requirements. The use of central master files is especially high in North America (76%).” To put this in context though, that overall figure for golden copy implementations globally is up only one percentage point on 2009’s 51%. Things might be progressing, but it’s happening slowly.

Last year, the regulatory driver du jour was Basel II (with 49% of the vote), closely followed by MiFID (42%). This year, MiFID has overtaken Basel (50% versus 46%) as the regulation on reference data managers’ minds. This is likely down to a combination of the press surrounding transaction reporting fines, the advent of the MiFID review and the spectre of a sequel on the horizon.

