About A-Team Marketing Services
The knowledge platform for the financial technology industry

A-Team Insight Blogs

All change


There can be no doubt in anyone’s mind that the data management business is undergoing dramatic change. As indicated by the preliminary results from our reader poll this month, the majority of budgets for data management projects are facing cuts, be they significant or otherwise.

Surprisingly, a minority of you have witnessed an increase in these budgets – likely targeted at areas such as reducing counterparty or operational risk, if recent surveys are anything to go by. What these budgets have not done, however, is remain the same. But how are firms coping with this pressure to do more with less? Where are they spending their remaining pennies to stave off market risk in a volatile environment?

According to the speakers at this month’s CorpActions 2009 Europe conference, spending on corporate actions projects is continuing, albeit at a slower pace. The risks introduced by errors in the manual processing of corporate actions are finally understood by senior management, after years of work to raise their profile, said JPMorgan’s David Kane. As at any corporate actions conference over the last few years, the problem of a lack of standardisation of practices among issuers was raised ad nauseam during the event. “All roads lead back to the issuer,” said Kane, who described issuers as the bêtes noires of the corporate actions world. The need for more consensus across the market on the adoption of standards was also mentioned frequently by panellists and audience members alike (no surprises there, then).

However, what was new to this year’s event was the discussion surrounding the impact of heightened market volatility on the corporate actions process. Kane highlighted this “shark’s tooth” volatility as a key contributor to risk within the sector and stressed that it must be taken seriously by firms’ senior management. The risks posed are significant in both operational and reputational terms, after all.

It seems that no matter where you turn in the data management market, the issue of risk is being bandied about with enthusiasm. The recent KPMG survey into the profile of risk within banks, however, gives a rather different perspective on the matter.
The report indicates that, despite the improvement in its profile, risk management is still struggling to gain influence at a strategic level within these institutions. The vast majority of respondents to the survey – 76% – said that regardless of its raised profile, risk management is still stigmatised as a support function. This seems to contradict the idea that the desire for better risk management is compelling these institutions to spend on data management. KPMG puts this down to issues surrounding communication – which could be said to lie at the root of the financial crisis itself. Perhaps KPMG’s report on the subject next year will tell a different story? After all, the issue of risk is frequently in the headlines and on the lips of the regulatory community, so how can banks choose to ignore risk management at a strategic level?

The Bank for International Settlements (BIS), for example, has recently published its recommendations on risk supervision and stress testing, which highlight what it sees as significant inadequacies in current market practices. According to the report, the weaknesses in stress testing revealed by the current financial turmoil fall into four broad areas: the use of stress testing and its integration into risk governance; stress testing methodologies; scenario selection; and the stress testing of specific risks and products.

Risk management in the area of valuations has also been a hot topic within the regulatory community, as indicated by the President’s Working Group’s recent best practices documents. The finalised sets of best practices call for more robust valuation procedures, including a segregation of responsibilities, thorough written policies, oversight and a specific focus on hard-to-value assets.

