

All change


There can be no doubt in anyone’s mind that the data management business is undergoing dramatic change. As indicated by the preliminary results of our reader poll this month, the majority of budgets for data management projects are facing cuts, whether significant or otherwise.

Surprisingly, a minority of you have witnessed an increase in these budgets, most likely targeted at areas such as reducing counterparty or operational risk, if recent surveys are anything to go by. What these budgets have not done, however, is remain the same. But how are firms coping with this pressure to do more with less? Where are they spending their remaining pennies to stave off market risk in a volatile environment?

According to the speakers at this month’s CorpActions 2009 Europe conference, spending on corporate actions projects is continuing, albeit at a slower pace. The risks introduced by errors in the manual processing of corporate actions are finally understood by senior management, after years of work to raise their profile, said JPMorgan’s David Kane. As at any corporate actions conference over the last few years, the problem of a lack of standardisation of practices among issuers was raised ad nauseam during the event. “All roads lead back to the issuer,” said Kane, who described issuers as the bêtes noires of the corporate actions world. The need for greater consensus across the market on the adoption of standards was also mentioned frequently by panellists and audience members alike (no surprises there, then).

What was new to this year’s event, however, was the discussion of the impact of heightened market volatility on the corporate actions process. Kane highlighted this “shark’s tooth” volatility as a key contributor to risk within the sector and stressed that it must be taken seriously by firms’ senior management. After all, the risks posed are significant in both operational and reputational terms.

It seems that no matter where you turn in the data management market, the issue of risk is being bandied about with enthusiasm. The recent KPMG survey into the profile of risk within banks, however, gives a rather different perspective on the matter. The report indicates that, despite the improvement in its profile, risk management is still struggling to gain influence at a strategic level within these institutions. The vast majority of respondents to the survey, 76%, said that regardless of its raised profile, risk management is still stigmatised as a support function. This seems to contradict the idea that the desire for better risk management is compelling these institutions to spend on data management. KPMG puts this down to problems of communication, which could also be said to lie at the root of the financial crisis itself.

Perhaps KPMG’s report on the subject next year will tell a different story? After all, the issue of risk is frequently in the headlines and on the lips of the regulatory community, so how can banks choose to ignore risk management at a strategic level? The Bank for International Settlements (BIS), for example, has recently published its recommendations on risk supervision and stress testing, which highlight what it sees as significant inadequacies in current market practices. The report underlines the weaknesses in stress testing revealed by the current financial turmoil in four broad areas: the use of stress testing and its integration in risk governance; stress testing methodologies; scenario selection; and the stress testing of specific risks and products.

Risk management in the area of valuations has also been a hot topic within the regulatory community, as indicated by the President’s Working Group’s recent best practices documents. The finalised sets of best practices call for more robust valuation procedures, including segregation of responsibilities, thorough written policies, oversight, and a specific focus on hard-to-value assets.
