
A-Team Insight Blogs

All change


There can be no doubt in anyone’s mind that the data management business is undergoing dramatic change. As the preliminary results of our reader poll this month indicate, the majority of data management project budgets are facing cuts, be they significant or otherwise.

Surprisingly, a minority of you have witnessed an increase in these budgets – likely targeted at areas such as reducing counterparty or operational risk, if recent surveys are anything to go by. What these budgets have not done, however, is remain the same. But how are firms coping with this pressure to do more with less? Where are they spending their remaining pennies to stave off market risk in a volatile environment?

According to the speakers at this month’s CorpActions 2009 Europe conference, spending on corporate actions projects is continuing, albeit at a slower pace. The risks introduced by errors in the manual processing of corporate actions are finally understood by senior management, after years of effort to raise their profile, said JPMorgan’s David Kane. As with any corporate actions conference over the last few years, the problem of a lack of standardisation of practices among issuers was raised ad nauseam during the event. “All roads lead back to the issuer,” said Kane, who described issuers as the bêtes noires of the corporate actions world. The need for more consensus across the market on the adoption of standards was also mentioned frequently by panellists and audience members alike (no surprises there then).

What was new to this year’s event, however, was the discussion surrounding the impact of heightened market volatility on the corporate actions process. Kane highlighted this “shark’s tooth” volatility as a key contributor to risk within the sector and stressed that it must be taken seriously by firms’ senior management. The risks posed are significant in both operational and reputational terms, after all.

It seems that no matter where you turn in the data management market, the issue of risk is being bandied about with enthusiasm. The recent KPMG survey into the profile of risk within banks, however, gives a rather different perspective on the matter. The report indicates that despite the improvement in its profile, risk management is still struggling to gain influence at a strategic level within these institutions. The vast majority of respondents to the survey – 76% – said that regardless of its raised profile, risk management is still stigmatised as a support function. This seems contradictory to the idea that the desire for better risk management is compelling these institutions to spend on data management. KPMG puts this down to problems of communication – which can be said to be at the root of the financial crisis itself. Perhaps KPMG’s report on the subject next year will tell a different story?

After all, the issue of risk is frequently in the headlines and on the lips of the regulatory community, so how can banks ignore risk management at a strategic level? The Bank for International Settlements (BIS), for example, has recently published its recommendations on risk supervision and stress testing, highlighting what it sees as significant inadequacies in current market practices. The report groups the weaknesses in stress testing revealed by the current financial turmoil into four broad areas: use of stress testing and integration in risk governance; stress testing methodologies; scenario selection; and stress testing of specific risks and products.

Risk management in the area of valuations has also been a hot topic within the regulatory community, as indicated by the President’s Working Group’s recent best practices documents. The finalised sets of best practices call for more robust valuation procedures, including segregation of responsibilities, thorough written policies, oversight and a specific focus on hard-to-value assets.

