
Manual Processes Persist in Data Management for Nearly Half of Aim Software’s Survey Respondents

Nearly half (48%) of the respondents to Aim Software’s seventh annual reference data survey still use terminals for data look-up and manual processing, in spite of the leaps and bounds made in the development of sophisticated data management technology over the last few years. The survey, which was co-sponsored by SIX Telekurs, was conducted between April and October this year and involved 371 financial institutions from 51 countries. It highlights the fact that data management is still being held back by manual processes.

These results indicate that the industry’s level of data management maturity is perhaps not as far advanced as some would hope. The drivers for spending on data management technology, however, are fairly well defined, although they have changed over the last 12 months. Last year’s survey saw respondents highlight the high cost involved in the processing of reference data as the primary driver for spending on data management technology, but this year the focus is on a reduction in errors (76% of respondents).

The increased focus on error reduction (up 23 percentage points on last year’s 53% of respondents) is perhaps indicative of the intense public and regulatory focus on data errors; just look at the fines meted out by the UK Financial Services Authority (FSA) for transaction reporting failures over the course of this year for proof. Firms are therefore keen to take away the risk, including reputational risk, involved in the management of their reference data. Given that 48% are still using manual processes (compared to last year’s 37% – perhaps more have come out of the woodwork), this focus on error reduction is only natural.

Obviously, cost cutting remains an important driver this year (cited by 66% of respondents, up from last year’s 58%), as does meeting risk management requirements (53%, down five percentage points from last year’s 58%) and compliance with regulatory requirements (38%, the same as last year). Risk management may have scored higher last year, but error reduction is also about taking risk out of the equation and better meeting regulatory reporting requirements by having more faith in the underlying data. Perhaps this is the easiest way to prove the ROI of a data management project in such cost-conscious times?

It seems that such an argument may be proving successful, given that 49% of respondents indicated that they would be investing in the automation of their static data. Furthermore, for the second year in a row, the area of corporate actions has been highlighted as an important investment target, with 37% of respondents planning to invest in automating this particularly thorny segment of reference data.

In terms of the priority breakdowns overall for reference data, the survey indicates: “The management of basic data (71%), the management of price data (65%) as well as the opening of instruments (59%) were named to be the major objectives of reference data management, followed by the management of corporate actions (56%) and the reconciliation of several data sources (49%).”

Unsurprisingly, the main challenge related to reference data management, cited by the majority of respondents, was high cost. This has risen in importance over the last couple of years: compare this year’s 56% of respondents who cited it as a problem with last year’s 49%. It is also borne out by the lobbying activity going on across the industry to bring down the costs of data vendor feeds. Other issues cited included poor data coverage (44%), delays in data delivery and missing standards.

On the subject of standardisation, very little seems to have changed from last year. The penetration of ISO 15022 in the market remains disappointing, with only 17% of respondents using the messaging standard. However, there has been some take-up of its successor, ISO 20022, with 8% of respondents indicating that they are using the standard, even though it is still early days.
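For readers less familiar with the two standards: ISO 15022 messages use a compact tag-based syntax, whereas ISO 20022 is XML-based. The snippet below is a minimal, purely illustrative sketch of pulling two fields out of an ISO 20022-style corporate action notification in Python; the element names and sample values here are placeholders, not taken from a published schema.

```python
# Illustrative sketch only: parses a cut-down, hypothetical ISO 20022-style
# XML fragment. Real messages are validated against the published XSDs and
# carry the ISO 20022 namespace, both omitted here for brevity.
import xml.etree.ElementTree as ET

# Hypothetical fragment loosely modelled on a corporate action notification.
SAMPLE = """\
<Document>
  <CorpActnNtfctn>
    <FinInstrmId>
      <ISIN>XS0000000000</ISIN>
    </FinInstrmId>
    <EvtTp>DVCA</EvtTp>
  </CorpActnNtfctn>
</Document>
"""

root = ET.fromstring(SAMPLE)
isin = root.findtext("./CorpActnNtfctn/FinInstrmId/ISIN")
event = root.findtext("./CorpActnNtfctn/EvtTp")
print(f"Corporate action {event} on instrument {isin}")
```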

The golden copy concept, however, has witnessed much more success in the market: “Whereas in 2007 only 38% of all respondents stated that they had a golden copy, in 2010 already 52% feed reference data into a centrally managed database. Despite the cost containment policy in many enterprises following the economic crisis, firms are aware of their need of high quality reference data to enhance operational efficiency and help support their growing risk management and compliance requirements. The use of central master files is especially high in North America (76%).” To put this in context, though, the overall global figure for golden copy implementations is up only one percentage point on 2009’s 51%. Things might be progressing, but it’s happening slowly.
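As a rough illustration of the golden copy concept, the sketch below consolidates records for the same instrument from several feeds into a single master record using a simple source-precedence rule. All source names and fields are hypothetical; real implementations apply far richer field-level rules, validation and audit trails.

```python
# Minimal sketch of the golden copy idea: several vendor records for the
# same instrument are merged into one master record, with higher-priority
# sources overriding lower-priority ones field by field.
SOURCE_PRIORITY = ["vendor_a", "vendor_b", "internal"]  # highest priority first

def build_golden_copy(records: dict[str, dict]) -> dict:
    """Merge vendor records field by field, preferring higher-priority sources."""
    golden: dict = {}
    for source in reversed(SOURCE_PRIORITY):  # apply lowest priority first...
        fields = records.get(source, {})
        golden.update({k: v for k, v in fields.items() if v is not None})
        # ...so later (higher-priority) sources overwrite earlier ones
    return golden

feeds = {
    "vendor_a": {"isin": "XS0000000000", "name": "Example 5% 2015", "coupon": 5.0},
    "vendor_b": {"isin": "XS0000000000", "name": None, "currency": "EUR"},
    "internal": {"issuer": "Example Plc"},
}
print(build_golden_copy(feeds))
```

Note the design choice: missing values (None) never overwrite populated ones, so a sparse high-priority feed still benefits from fuller coverage further down the precedence list.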

Last year, the regulatory driver du jour was Basel II (with 49% of the vote), closely followed by MiFID (42%). This year, MiFID has overtaken Basel (50% versus 46%) as the regulation on reference data managers’ minds. This is likely down to a combination of the press surrounding transaction reporting fines and the advent of the MiFID review, with the spectre of a sequel on the horizon.
