No matter where you turn these days, you’re bombarded with solutions that promise better risk management capabilities, and nowhere more so than in the data management community. The sheer volume of reports published so far this year linking risk management with better data management is indicative of this market trend. After all, data is an essential part of the information workflow that enables the effective management of risk.
Last month, I noted the issues raised by the KPMG report, but this month a plethora of other studies on the subject paints a rather different picture. KPMG highlighted that although risk management is becoming more important to institutions, senior management at these firms is still not taking the risk function seriously enough. Other recent studies, however, suggest that this is changing for the better: data management projects are receiving funding as a direct result of the need to meet risk management and compliance requirements. Moreover, as the vendor community feels pressure from its client base to do more for less, it has to offer clients compelling reasons to invest in solutions, and risk is the obvious low-hanging fruit.

A recent report by A-Team Group (publishers of Reference Data Review), for example, highlights the industry perception that data practices must change in order to better support risk management. According to 89% of the survey’s respondents, it is chief risk officers who are driving forward change in the area of data management. Not an easy thing to do, one would assume, if they held no real power. This suggests that the next iteration of the KPMG report, which was based on research conducted in 2008, is likely to paint a very different picture. Furthermore, 66% of the A-Team Group survey’s respondents indicated they were imminently changing their approach to data management to support risk management. Good news for the vendors on the risk bandwagon waiting in the wings. Respondents also highlighted the need for more consistency and timeliness in data management systems. This theme of better data quality and the move towards a more real-time view of data also cropped up at the recent Marcus Evans conference on reference data, held in London.
Speakers including Vinay Pampathi, executive director of the technology division of Daiwa Securities SMBC Europe, explained to delegates that institutions have turned their focus to tackling a root cause of risk: poor data quality. “Regulatory restrictions and risk management are forcing financial institutions to change their data management practices and the once forgotten back office is now a major area of concern and investment,” Pampathi elaborated. The closing panel agreed that regulatory requirements such as Basel II, MiFID and know your customer (KYC) rules, in combination with the credit crisis, have all helped change the attitude of financial institutions towards data management. However, there is still a “constant battle” to secure funding for projects in the current financial climate, Pampathi added. Data managers cannot take their eyes off the ball when it comes to proving the benefits of these projects to senior management, even if that management has now realised the dangers posed by getting it wrong.

The spectre of risk is also forcing changes to valuations and pricing practices, according to a study by Asset Control. Financial institutions are no longer able to take valuations at face value; in order to judge their risk exposure accurately, they must manage the underlying data in a more structured and efficient way. Data availability and the consistency afforded by a centralised approach are crucial to handling volume growth, the vendor suggests.

The key to survival in the current environment is therefore imposing structure and order on processes that, in some institutions, have been left to their own devices for long periods. This is not just a structural change but a cultural one, and it is likely to take some time before the end results are visible.