Financial services firms are facing a huge number of ad hoc, short-notice reporting requirements from the regulatory community, and this is putting significant pressure on their data infrastructures, said Kris Bhattacharjee, global head of regulatory policy and reporting at Barclays Capital. He explained to attendees at Thomson Reuters’ roundtable earlier this week that BarCap has been compelled to invest in its data infrastructure to meet these pressures because the business and regulators are both looking for more granular data.
Firms need to be able to allocate capital “very carefully” in the current climate, where liquidity is much scarcer than before the crisis, Bhattacharjee explained, and this means they need reliable and granular data in order to make these assessments. Regulatory compliance has also placed onerous requirements on firms, and these are set to become even more demanding: “The bar is seemingly being raised every six months for financial reporting and there is a whole host of new regulatory requirements on its way.”
The business in the front office and the regulatory reporting functions are therefore combining to push firms to invest in data management. Bhattacharjee identified a number of key metrics against which these data management systems are being evaluated.
“The timeliness and frequency of data is essential when deciding actions to take with regards to front office positions. It has also come to the fore with regards to regulatory reporting,” he explained. For example, quarterly regulatory reporting around capital adequacy is now subject to much tighter deadlines; 10 working days for the UK Financial Services Authority, in fact. Moreover, firms are in some instances expected to produce ad hoc regulatory reports in as near to real time as possible, said Bhattacharjee.
Another key metric is accuracy or completeness of data, he continued: “The data provided to end users carries a premium on accuracy and completeness. There is less tolerance from regulators for inaccuracy.” Transparency requirements have also made completeness of data an issue: more data is needed, especially around areas such as pricing.
The industry’s move towards a more holistic approach to risk management has meant the granularity of data is much more important. This includes capital usage at position level and assessments of counterparty risk on a portfolio level, he elaborated. Capital calculations under Basel II also require a much more granular approach to risk data, he added.
“These rules are still evolving and firms need to be flexible in their approach to data management in order to be able to meet these new requirements. You may need to slice and dice the data in a different way. There is therefore some degree of future proofing required in data management,” Bhattacharjee said.
In the near term, such changing requirements will likely centre on stress testing and new living wills regulation, he contended. These represent a demand for information that was not previously required, or not dealt with in the same manner, such as assessing intercompany exposures.
Bhattacharjee’s recommendation for the future was for firms to be proactive and tackle their data management challenges before they are faced with a regulatory mandate to do so. He concluded by suggesting that best practices should perhaps come from outside the financial services industry, and recommended recruiting individuals from industries in which data management has been tackled to a greater extent, such as pharmaceuticals or manufacturing.