Uncertainty in the industry, driven by the ongoing debate within the regulatory community over new risk management reporting requirements, has proved both a blessing and a curse for those in the data business. Panellists at this month’s FS Club agreed that the industry is being forced to take data more seriously, but warned that significant gaps remain around risk related regulation that may prove to be pitfalls in the near future.
PJ Di Giammarino, CEO of think tank JWG and chair of the debate, explained that the industry faces “an awful lot of unknowns” with regard to risk reporting under Basel III. He pointed to 10 documents on the subject of risk regulation that are currently open for comment in the market, including six papers from the Committee of European Banking Supervisors (CEBS), as proof of the “regulatory tsunami” to which the industry is being forced to respond.
To illustrate the challenges further, Di Giammarino discussed the data related requirements of the Basel Committee on Banking Supervision’s (BCBS) recent liquidity risk paper and its liquidity coverage ratio (LCR). He highlighted the requirement for firms to provide regulators with contractual mismatch data in order to satisfy the LCR. “This requirement includes providing the regulator with raw data with no assumptions included,” he said.
Panellists noted the danger of being forced to provide “raw data” to the regulators, given the possibility of misinterpretation. Di Giammarino compared it to handing the regulators a “loaded gun”. Julia Sutton, global head of reference data at the Royal Bank of Canada (RBC), however, felt the regulators would be hard pushed to make use of raw data, given that firms have to cleanse it first to make it of any use.
The panellists touched on numerous other requirements, including the need to gather counterparty data in order to produce concentration of funding reports for the regulator. Di Giammarino summed up the various requirements as adding up to one overwhelming fact: “the data supply chain needs to change”.
The regulatory community in the US seems keen to step in to deal with the data details, he added, as evidenced by the recent finance bill and its recommendations for the establishment of an Office of Financial Research. The new body would house a data centre charged with collecting, validating and maintaining all the relevant industry data required to assess systemic risk, which it would then provide back to the industry.
However, this challenge is complicated by a series of “first mile” issues, agreed panellists. These include the lack of standardisation of data formats across the industry, the difficulty of measuring the quality and timeliness of the data, and the old problem of “garbage in, gospel out”, noted Di Giammarino.
Bill Rickard, head of strategy and policy at Royal Bank of Scotland (RBS) Group Treasury, noted that regulators would be asking for data formats that suit their own requirements rather than those that suit the industry. Standard reporting formats across the globe would of course be welcomed by those operating across markets, but regulators getting too involved in the data formats used for business purposes could be dangerous.
Sutton noted that overall, this focus on data from the top has proved beneficial in the effort to get data management project funding but stressed that much more needs to be done. “Some firms have grasped the nettle of data management but a lot of effort needs to go into getting it right. After all, the industry will be in a much better place to respond to new regulatory requirements if its data foundations are solid,” she concluded.