About A-Team Marketing Services
The knowledge platform for the financial technology industry

A-Team Insight Blogs

All change


There can be no doubt in anyone’s mind that the data management business is undergoing dramatic change. As the preliminary results of our reader poll this month indicate, the majority of budgets for data management projects are facing cuts, be they significant or otherwise.

Surprisingly, a minority of you have witnessed an increase in these budgets – likely targeted at areas such as reducing counterparty or operational risk, if recent surveys are anything to go by. What these budgets have not done, however, is remain the same. But how are firms coping with this pressure to do more with less? Where are they spending their remaining pennies to stave off market risk in a volatile environment?

According to the speakers at this month’s CorpActions 2009 Europe conference, spending on corporate actions projects is continuing, albeit at a slower pace. The risks introduced by errors in the manual processing of corporate actions are finally understood by senior management, after years of work to raise their profile, said JPMorgan’s David Kane. As at any corporate actions conference over the last few years, the problem of a lack of standardisation of practices among issuers was raised ad nauseam during the event. “All roads lead back to the issuer,” said Kane, who described issuers as the bêtes noires of the corporate actions world. The need for more consensus across the market on the adoption of standards was also mentioned frequently by panellists and audience members alike (no surprises there).

What was new to this year’s event, however, was the discussion of the impact of heightened market volatility on the corporate actions process. Kane highlighted this “shark’s tooth” volatility as a key contributor to risk within the sector and stressed that it must be taken seriously by firms’ senior management. After all, the operational and reputational risks it poses are significant.

It seems that no matter where you turn in the data management market, the issue of risk is being bandied about with enthusiasm. The recent KPMG survey into the profile of risk within banks, however, gives a rather different perspective on the matter.
The report indicates that despite the improvement in its profile, risk management is still struggling to gain influence at a strategic level within these institutions. The vast majority of respondents to the survey – 76% – said that regardless of its raised profile, risk management is still stigmatised as a support function. This seems to contradict the idea that the desire for better risk management is compelling these institutions to spend on data management. KPMG puts this down to problems of communication – which could be said to lie at the root of the financial crisis itself. Perhaps KPMG’s report on the subject next year will tell a different story? After all, risk is frequently in the headlines and on the lips of the regulatory community, so how can banks afford to ignore risk management at a strategic level?

The Bank for International Settlements (BIS), for example, has recently published its recommendations on risk supervision and stress testing, which highlight what it sees as significant inadequacies in current market practices. The report underlines the weaknesses in stress testing revealed by the current financial turmoil in four broad areas: the use of stress testing and its integration in risk governance; stress testing methodologies; scenario selection; and stress testing of specific risks and products.

Risk management in the area of valuations has also been a hot topic within the regulatory community, as indicated by the President’s Working Group’s recent best practices documents. The finalised sets of best practices call for more robust valuation procedures, including segregation of responsibilities, thorough written policies, oversight and a specific focus on hard-to-value assets.

