Industry think tank JWG has just completed a research project into how prepared the industry is for the risk management challenges posed by the incoming tsunami of regulatory requirements; its findings indicate that much remains to be done to achieve the right level of risk data consistency and accuracy. Conducted over the summer months with 107 industry professionals, the research highlights the pain of aggregating risk data across risk types and business lines to meet the requirements contained in the 30,000 pages of risk regulations, explains PJ Di Giammarino, CEO of JWG.
“The business proposition for better risk management remains unclear and there is still a lot of box ticking going on within the industry rather than an alignment of business, risk and regulatory compliance incentives,” he says. “Another key part of the equation is getting the risk information right and our research highlighted the many aches and pains of trying to produce workable risk data sets.”
Di Giammarino highlights the need for what he calls a “know your exposure framework” in which to structure this data, in order to understand “what good risk exposure management looks like” (a subject he has focused on for some time). Although the industry is beginning to realise the need for a more holistic approach to risk and the data involved, it has not yet made significant moves in this direction, according to the JWG research. Di Giammarino predicts the sprint for the best capital ratios, and all the work involved therein, will begin in earnest in the latter half of next year.
The ultimate driver will be firms’ fear of greater penalties for inadequate risk infrastructures as the 2013 deadline approaches, but the conversations about how to reach compliance with the many incoming risk requirements are beginning now. “Today’s aggregation and funds transfer pricing requirements will guide the development of tomorrow’s architectures,” he explains.
Data and reporting requirements are one of the five main pillars of the JWG know your exposure capability model, which also includes: strategy and financial resources; model requirements; governance and assessment; and supervisory review. The data requirements are tied into the multiple qualitative and quantitative reports that are being demanded of financial institutions as a result of regulations around areas such as Basel reporting and liquidity risk. Di Giammarino points to the increased frequency and granularity of these reports as a significant driver for investment in technology for the near future.
“The bigger the financial institution, the harder it will be to aggregate this risk data. However, even smaller firms will struggle due to the integrated nature of the reports that must be produced from siloed architectures,” he elaborates. It is an aggregation problem for which the solution must also consider all of the different constituents that may need access to this data, who range from corporate treasury to CEOs and accounting teams. “Very few oversight functions within a financial institution are unaffected,” says Di Giammarino.
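The aggregation problem described here can be sketched in miniature. Purely as an illustration (the silo names, field names and figures below are hypothetical, not drawn from the JWG research), normalising two siloed feeds onto a common counterparty key and summing exposures might look like:

```python
from collections import defaultdict

# Hypothetical extracts from two siloed systems, each with its own
# field names for the same underlying counterparty.
credit_silo = [
    {"cpty": "ACME-01", "exposure_usd": 1_200_000},
    {"cpty": "GLOBEX-07", "exposure_usd": 450_000},
]
trading_silo = [
    {"counterparty_id": "ACME-01", "mtm_usd": 300_000},
    {"counterparty_id": "GLOBEX-07", "mtm_usd": -50_000},
]

def aggregate_exposure(credit_rows, trading_rows):
    """Normalise both feeds to a common key and sum exposures per counterparty."""
    totals = defaultdict(float)
    for row in credit_rows:
        totals[row["cpty"]] += row["exposure_usd"]
    for row in trading_rows:
        totals[row["counterparty_id"]] += row["mtm_usd"]
    return dict(totals)

print(aggregate_exposure(credit_silo, trading_silo))
```

In practice, of course, the hard part is the key mapping itself: the same counterparty rarely carries the same identifier in every silo, which is exactly why the data consistency issues the research flags run so deep.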
The focus is not just on meeting compliance requirements; the future profitability of a firm’s business lines is also predicated on meeting these challenges. According to the JWG research, the priorities for the business and compliance are therefore similar: the need for accurate funds transfer pricing; credit and debt valuation adjustment; a firm-wide risk dashboard; compensation modelling; and more accurate pricing and valuations. Di Giammarino contends that firms should make an effort to understand where business and compliance initiatives overlap, and to determine where they stand relative to their peers.
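To give a flavour of why accurate funds transfer pricing tops that shared priority list, a matched-maturity FTP charge can be reduced to a very simple calculation. This is an illustrative sketch only; the curve values and function below are hypothetical and far simpler than any production FTP model:

```python
# Hypothetical internal funding curve: tenor in years -> annual rate.
FTP_CURVE = {1: 0.010, 3: 0.015, 5: 0.020}

def ftp_charge(notional, tenor_years):
    """Annual internal funding charge for a position, matched to the
    nearest tenor point on the firm's funding curve."""
    nearest_tenor = min(FTP_CURVE, key=lambda t: abs(t - tenor_years))
    return notional * FTP_CURVE[nearest_tenor]

# A 3-year position of 1m attracts the 3-year funding rate.
print(ftp_charge(1_000_000, 3))
```

The point of the exercise is that even this toy version depends on consistent position data (notional, tenor, business line) flowing in from every silo, which is where the aggregation pain described above bites.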
As noted by many risk and data practitioners before, the technical challenges of ensuring the quality and accuracy of this data run deep, and the aggregation challenge across silos is far from trivial. The downstream uses of the risk data must also be taken into account in the design of any kind of enterprise data management system in order to determine what data is needed where. “Firms need to think proactively about these data requirements and this will pose a significant cultural challenge,” says Di Giammarino. “This is why governance and the development of an overall risk data strategy are important.”
Among the top data management issues identified by the JWG research, one of the main challenges going forward will be the ability to manipulate, view and report the data in many different ways for different users. Firms will also face the processing challenges resulting from increased data volumes and scale, as well as the task of reconciling multiple versions of the “truth”. Metrics for data quality measurement will be key, as will ensuring technology gaps do not cause the loss of information from front to back office systems.
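Two of the simplest data quality metrics of the kind the research alludes to are completeness (is a required field populated?) and cross-system consistency (do two systems agree on the same field?). As an illustrative sketch only, with hypothetical trade records standing in for front and back office feeds:

```python
def completeness(records, field):
    """Share of records where a required field is populated."""
    if not records:
        return 0.0
    filled = sum(1 for r in records if r.get(field) not in (None, ""))
    return filled / len(records)

def consistency(records_a, records_b, key, field):
    """Share of keys present in both sources where the field values agree."""
    a = {r[key]: r[field] for r in records_a}
    b = {r[key]: r[field] for r in records_b}
    shared = a.keys() & b.keys()
    if not shared:
        return 0.0
    return sum(1 for k in shared if a[k] == b[k]) / len(shared)

# Hypothetical feeds: the back office holds a stale notional for T2.
front_office = [
    {"trade_id": "T1", "notional": 1_000_000},
    {"trade_id": "T2", "notional": 500_000},
]
back_office = [
    {"trade_id": "T1", "notional": 1_000_000},
    {"trade_id": "T2", "notional": 450_000},
]

print(completeness(front_office, "notional"))
print(consistency(front_office, back_office, "trade_id", "notional"))
```

Tracking scores like these over time is one way to make the front-to-back information loss the research warns about visible before it reaches a regulatory report.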
Di Giammarino points to reference data golden copy as a key issue going forward; it seems risk management will remain a significant driver of investment to this end for some time to come.