
Manual Processes Persist in Data Management for Nearly Half of Aim Software’s Survey Respondents

Nearly half (48%) of the respondents to Aim Software’s seventh annual reference data survey still use terminals for data look-up and manual processing, in spite of the leaps and bounds made in the development of sophisticated data management technology over the last few years. The survey, co-sponsored by SIX Telekurs, was conducted between April and October this year and involved 371 financial institutions from 51 countries. It highlights the fact that data management is still being held back by manual processes.

These results indicate that the industry’s level of data management maturity is perhaps not as far advanced as some would hope. The drivers for spending on data management technology, however, are fairly well defined, although they have changed over the last 12 months. Last year’s survey saw respondents highlight the high cost involved in the processing of reference data as the primary driver for spending on data management technology, but this year the focus is on a reduction in errors (76% of respondents).

The increased focus on error reduction (up 23 percentage points on last year’s 53% of respondents) is perhaps indicative of the intense public and regulatory focus on data errors; just look at the fines meted out by the UK Financial Services Authority (FSA) for transaction reporting failures over the course of this year for proof. Firms are therefore keen to remove the risk, including reputational risk, involved in the management of their reference data. Given that 48% are still using manual processes (compared with last year’s 37% – perhaps more have come out of the woodwork), this focus on error reduction is only natural.
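To make the error reduction argument concrete, the kind of automated check that replaces manual terminal look-up can be sketched in a few lines of Python. The record fields, currency whitelist and coupon bounds below are illustrative assumptions made for this sketch, not details drawn from the survey:

    from dataclasses import dataclass
    from typing import Optional

    # Illustrative only: field names, the currency whitelist and the coupon
    # bounds are assumptions made for this sketch, not survey findings.

    @dataclass
    class SecurityRecord:
        isin: str
        currency: str
        coupon: Optional[float] = None

    KNOWN_CURRENCIES = {"EUR", "USD", "GBP", "CHF", "JPY"}

    def validate(record: SecurityRecord) -> list:
        """Return the data quality errors found in one reference data record."""
        errors = []
        # An ISIN is 12 characters: a 2-letter country prefix, 9 alphanumerics
        # and a check digit.
        if len(record.isin) != 12 or not record.isin[:2].isalpha():
            errors.append("malformed ISIN: %r" % record.isin)
        if record.currency not in KNOWN_CURRENCIES:
            errors.append("unrecognised currency: %r" % record.currency)
        if record.coupon is not None and not 0 <= record.coupon <= 20:
            errors.append("implausible coupon: %r" % record.coupon)
        return errors

    print(validate(SecurityRecord(isin="XS1234", currency="EUR", coupon=4.5)))
    # ["malformed ISIN: 'XS1234'"]

Checks like these run on every feed update, flagging a bad value before it pollutes downstream systems rather than relying on someone spotting it on a terminal screen.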

Obviously, cost cutting remains an important driver this year (66% of respondents, up from last year’s 58%), as does meeting risk management requirements (53%, down five percentage points from last year’s 58%) and compliance with regulatory requirements (38%, unchanged from last year). Risk management may have scored higher last year, but error reduction is also about taking risk out of the equation and better meeting regulatory reporting requirements by having more faith in the underlying data. Perhaps this is the easiest way to prove the ROI of a data management project in such cost-conscious times?

It seems that such an argument may be proving successful, given that 49% of respondents indicated that they would be investing in the automation of their static data. Furthermore, for the second year in a row, the area of corporate actions has been highlighted as important, with 37% of respondents planning to invest in automating this particularly thorny area of reference data.

In terms of the priority breakdowns overall for reference data, the survey indicates: “The management of basic data (71%), the management of price data (65%) as well as the opening of instruments (59%) were named to be the major objectives of reference data management, followed by the management of corporate actions (56%) and the reconciliation of several data sources (49%).”

Unsurprisingly, the main challenge related to reference data management, cited by the majority of respondents, was high cost. This has risen in importance over the last couple of years – just compare this year’s 56% of respondents citing it as a problem to last year’s 49%. This is also borne out by the lobbying activity across the industry to bring down the cost of data vendor feeds. Other issues cited included poor data coverage (44%), delays in data delivery and missing standards.

On the subject of standardisation, very little seems to have changed from last year. The penetration of ISO 15022 in the market remains disappointing, with only 17% of respondents using the messaging format. However, there has been some take-up of its successor, ISO 20022, with 8% of respondents indicating that they are using the standard, even though it is still early days.
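The difference between the two standards is easiest to see in their syntax: ISO 15022 uses tagged, block-structured MT messages, while ISO 20022 is XML-based and so parses with standard tooling. A minimal sketch follows; the namespace and element names are simplified placeholders in the spirit of an ISO 20022 corporate action notification, not a real schema:

    import xml.etree.ElementTree as ET

    # Hypothetical, simplified fragment in the spirit of an ISO 20022
    # corporate action notification. The namespace and element names are
    # placeholders, not a real ISO 20022 schema.
    SAMPLE = """
    <Document xmlns="urn:example:iso20022:simplified">
      <CorpActnNtfctn>
        <EvtTp>DVCA</EvtTp>
        <FinInstrmId><ISIN>XX0000000000</ISIN></FinInstrmId>
      </CorpActnNtfctn>
    </Document>
    """

    NS = {"d": "urn:example:iso20022:simplified"}
    root = ET.fromstring(SAMPLE)
    event = root.findtext("d:CorpActnNtfctn/d:EvtTp", namespaces=NS)
    isin = root.findtext("d:CorpActnNtfctn/d:FinInstrmId/d:ISIN", namespaces=NS)
    print(event, isin)  # -> DVCA XX0000000000

The broadly equivalent ISO 15022 message, the MT 564 corporate action notification, is block-structured and generally needs a dedicated parser rather than an off-the-shelf XML library – one practical reason migration to ISO 20022 is anticipated, even with adoption at only 8% in this survey.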

The golden copy concept, however, has witnessed much more success in the market: “Whereas in 2007 only 38% of all respondents stated that they had a golden copy, in 2010 already 52% feed reference data into a centrally managed database. Despite the cost containment policy in many enterprises following the economic crisis, firms are aware of their need of high quality reference data to enhance operational efficiency and help support their growing risk management and compliance requirements. The use of central master files is especially high in North America (76%).” To put this in context, though, the global figure for golden copy implementations is up just one percentage point on 2009’s 51%. Things might be progressing, but only slowly.
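The mechanics behind those central master files are straightforward to sketch: the same instrument arrives from several vendor feeds, and a per-field precedence rule decides which value lands in the golden copy. The vendor names, feed contents and precedence order below are invented for illustration:

    # Golden copy sketch: merge per-field values from several vendor feeds
    # into one master record. Vendors, values and precedence are invented.

    PRECEDENCE = ["vendor_a", "vendor_b", "vendor_c"]  # hypothetical feeds

    feeds = {
        "vendor_a": {"isin": "XX0000000000", "coupon": 4.25, "currency": None},
        "vendor_b": {"isin": "XX0000000000", "coupon": 4.25, "currency": "EUR"},
        "vendor_c": {"isin": "XX0000000000", "coupon": 4.30, "currency": "EUR"},
    }

    def golden_copy(feeds):
        """For each field, keep the first non-null value in precedence order."""
        fields = {f for record in feeds.values() for f in record}
        master = {}
        for field in sorted(fields):
            for vendor in PRECEDENCE:
                value = feeds.get(vendor, {}).get(field)
                if value is not None:
                    master[field] = value
                    break
        return master

    print(golden_copy(feeds))
    # {'coupon': 4.25, 'currency': 'EUR', 'isin': 'XX0000000000'}

Real implementations add audit trails, exception queues for conflicting values and per-asset-class precedence, but the principle is the same: one agreed record fed from many sources, which 52% of respondents now report having.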

Last year, the regulatory driver du jour was Basel II (with 49% of the vote), closely followed by MiFID (42%). This year, MiFID has overtaken Basel (50% versus 46%) as the regulation on reference data managers’ minds. This is likely down to a combination of the press surrounding transaction reporting fines, the advent of the MiFID review and the spectre of a sequel on the horizon.
