About A-Team Marketing Services
The knowledge platform for the financial technology industry

A-Team Insight Blogs

All change

Subscribe to our newsletter

There can be no doubt in anyone’s mind that the data management business is undergoing dramatic change. As indicated by the preliminary results from our reader poll this month, the majority of budgets for data management projects are facing cuts, be they significant or otherwise.

Surprisingly, a minority of you have witnessed an increase in these budgets – likely targeted at areas such as reducing counterparty or operational risk, if recent surveys are anything to go by. What these budgets have not done, however, is remain the same.

But how are firms coping with this pressure to do more with less? Where are they spending their remaining pennies to stave off market risk in a volatile environment? According to the speakers at this month’s CorpActions 2009 Europe conference, spending on corporate actions projects is continuing, albeit at a slower pace.

The risks introduced by errors in the manual processing of corporate actions are finally understood by senior management, after years of work to raise their profile, said JPMorgan’s David Kane. As at any corporate actions conference over the last few years, the problem of a lack of standardisation of practices among issuers was raised ad nauseam during the event. “All roads lead back to the issuer,” said Kane, who described issuers as the bêtes noires of the corporate actions world. The need for greater consensus across the market on the adoption of standards was also mentioned frequently by panellists and audience members alike (no surprises there, then).

However, what was new to this year’s event was the discussion surrounding the impact of heightened market volatility on the corporate actions process. Kane highlighted this “shark’s tooth” volatility as a key contributor to risk within the sector and stressed that it must be taken seriously by firms’ senior management. After all, the risks posed are significant in both operational and reputational terms.

It seems that no matter where you turn in the data management market, the issue of risk is being bandied about with enthusiasm. The recent KPMG survey into the profile of risk within banks, however, gives a rather different perspective on the matter.
The report indicates that despite the improvement in its profile, risk management is still struggling to gain influence at a strategic level within these institutions. The vast majority of respondents to the survey – 76% – said that regardless of its raised profile, risk management is still stigmatised as a support function. This seems to contradict the idea that the desire for better risk management is compelling these institutions to spend on data management. KPMG puts this down to problems of communication – which could be said to be at the root of the financial crisis itself. Perhaps KPMG’s report on the subject next year will tell a different story?

After all, the issue of risk is frequently in the headlines and on the lips of the regulatory community, so how can banks afford to ignore risk management at a strategic level? The Bank for International Settlements (BIS), for example, has recently published its recommendations on risk supervision and stress testing, which highlight what it sees as significant inadequacies in current market practices. The report underlines the weaknesses in stress testing revealed by the current financial turmoil in four broad areas: the use of stress testing and its integration in risk governance; stress testing methodologies; scenario selection; and stress testing of specific risks and products.

Risk management in the area of valuations has also been a hot topic within the regulatory community, as indicated by the President’s Working Group’s recent best practices documents. The finalised sets of best practices call for more robust valuation procedures, including a segregation of responsibilities, thorough written policies, oversight and a specific focus on hard-to-value assets.


Related content

WEBINAR

Recorded Webinar: How to maximise the use of data standards and identifiers beyond compliance and in the interests of the business

Data standards and identifiers have become common currency in regulatory compliance, bringing with them improved transparency, efficiency and data quality in reporting. They also contribute to automation. But their value does not end here, with data standards and identifiers being used increasingly for the benefit of the business. This webinar will survey the landscape of...

BLOG

Better Data, Better Business: Combat Identity-Related Fraud with the LEI

By Clare Rowley, Head of Business Operations at the Global Legal Entity Identifier Foundation (GLEIF). The global economy is wrestling with never-before-seen levels of identity-related fraud. Cybercrime costs in the US reached an estimated $320 billion as of 2023, according to Statista. Between 2017 and 2023, this figure has seen a significant increase of over...

EVENT

RegTech Summit London

Now in its 8th year, the RegTech Summit in London will bring together the RegTech ecosystem to explore how Europe’s capital markets industry can leverage technology to drive innovation, cut costs and support regulatory change.

GUIDE

Corporate Actions USA 2010

The US corporate actions market has long been characterised as paper-based and manually intensive, but it seems that much progress is being made of late to tackle the lack of automation due to the introduction of four little letters: XBRL. According to a survey by the American Institute of Certified Public Accountants (AICPA) and standards...