About A-Team Marketing Services
The knowledge platform for the financial technology industry

A-Team Insight Blogs

Manual Processes Persist in Data Management for Nearly Half of AIM Software’s Survey Respondents


Nearly half (48%) of the respondents to AIM Software’s seventh annual reference data survey still use terminals for data look-up and manual processing, in spite of the leaps and bounds made in sophisticated data management technology over the last few years. The survey, which was co-sponsored by SIX Telekurs, was conducted between April and October this year and involved 371 financial institutions from 51 countries. It highlights the fact that data management is still being held back by manual processes.

These results indicate that the industry’s level of data management maturity is perhaps not as far advanced as some would hope. The drivers for spending on data management technology, however, are fairly well defined, although they have changed over the last 12 months. Last year’s survey saw respondents highlight the high cost involved in the processing of reference data as the primary driver for spending on data management technology, but this year the focus is on a reduction in errors (76% of respondents).

The increased focus on error reduction (up 23 percentage points on last year’s 53% of respondents) is perhaps indicative of the intense public and regulatory focus on data errors; just look at the fines meted out by the UK Financial Services Authority (FSA) for transaction reporting failures over the course of this year for proof. Firms are therefore keen to take away the risk, including reputational risk, involved in the management of their reference data. Given that 48% are still using manual processes (compared to last year’s 37% – perhaps more have come out of the woodwork), this focus on error reduction is only natural.

Obviously, cost cutting remains an important driver this year (cited by 66% of respondents, up from last year’s 58%), as does meeting risk management requirements (53%, down five percentage points from last year’s 58%) and compliance with regulatory requirements (38%, the same as last year). Risk management may have scored higher last year, but error reduction is also about taking risk out of the equation and better meeting regulatory reporting requirements by having more faith in the underlying data. Perhaps this is the easiest way to prove the ROI of a data management project in such cost conscious times?

It seems that such an argument may be proving successful, given that 49% of respondents indicated that they would be investing in the automation of their static data. Furthermore, for the second year in a row, corporate actions has been highlighted as an important area, with 37% of respondents planning to invest in automation of this particularly thorny corner of reference data.

In terms of the priority breakdowns overall for reference data, the survey indicates: “The management of basic data (71%), the management of price data (65%) as well as the opening of instruments (59%) were named to be the major objectives of reference data management, followed by the management of corporate actions (56%) and the reconciliation of several data sources (49%).”

Unsurprisingly, the main challenge related to reference data management, cited by the majority of respondents, was high cost. This has risen in importance over the last couple of years – just compare this year’s 56% of respondents that cited it as a problem to last year’s 49%. It is also borne out by the lobbying activity going on across the industry to bring down the costs of data vendor feeds. Other issues cited included poor data coverage (44%), delays in data delivery and missing standards.

On the subject of standardisation, very little seems to have changed from last year. The penetration of ISO 15022 in the market remains disappointing, with only 17% of respondents using the messaging format. However, there has been some take-up of its successor, ISO 20022, with 8% of respondents indicating that they are using the standard, even though it is still early days.

The golden copy concept, however, has witnessed much more success in the market: “Whereas in 2007 only 38% of all respondents stated that they had a golden copy, in 2010 already 52% feed reference data into a centrally managed database. Despite the cost containment policy in many enterprises following the economic crisis, firms are aware of their need of high quality reference data to enhance operational efficiency and help support their growing risk management and compliance requirements. The use of central master files is especially high in North America (76%).” To put this in context though, this year’s 52% for golden copy implementations globally is an increase of just one percentage point on 2009’s 51%. Things might be progressing, but they are progressing slowly.

Last year, the regulatory driver du jour was Basel II (with 49% of the vote), closely followed by MiFID (42%). This year, MiFID has overtaken Basel (50% versus 46%) as the regulation on reference data managers’ minds. This likely reflects a combination of the press surrounding transaction reporting fines, the advent of the MiFID review and the spectre of a sequel on the horizon.

