
A-Team Insight Blogs

Huge Number of Ad Hoc Regulatory Requests is Adding to the Data Challenge, Says BarCap’s Bhattacharjee


Financial services firms are facing a huge number of ad hoc reporting requests from the regulatory community, and this is putting significant pressure on their data infrastructures, said Kris Bhattacharjee, global head of regulatory policy and reporting at Barclays Capital. Speaking at a Thomson Reuters roundtable earlier this week, he explained that BarCap has been compelled to invest in its data infrastructure to meet these pressures, as both the business and regulators are looking for more granular data.

Firms need to be able to allocate capital “very carefully” in the current climate, where liquidity is much scarcer than before the crisis, Bhattacharjee explained, and this means they need reliable, granular data to make these assessments. Regulatory compliance has also placed onerous requirements on firms, and these are set to become more stringent: “The bar is seemingly being raised every six months for financial reporting and there is a whole host of new regulatory requirements on its way.”

The front office business and the regulatory reporting function are therefore combining to push firms to invest in data management. Bhattacharjee identified a number of key metrics against which these data management systems are being evaluated.

“The timeliness and frequency of data is essential when deciding actions to take with regards to front office positions. It has also come to the fore with regards to regulatory reporting,” he explained. For example, quarterly capital adequacy reports must now be produced much more quickly; within 10 working days in the case of the UK Financial Services Authority. Moreover, in some instances firms are expected to produce ad hoc regulatory reports in as near to real time as possible, said Bhattacharjee.

Another key metric is the accuracy and completeness of data, he continued: “The data provided to end users has a premium for accuracy and completeness. There is less tolerance from regulators about inaccuracy.” Transparency requirements have also made completeness of data an issue: more data is needed, especially in areas such as pricing.

The industry’s move towards a more holistic approach to risk management has made the granularity of data much more important. This includes capital usage at position level and assessments of counterparty risk at portfolio level, he elaborated. Capital calculations under Basel II also require a much more granular approach to risk data, he added.

“These rules are still evolving and firms need to be flexible in their approach to data management in order to be able to meet these new requirements. You may need to slice and dice the data in a different way. There is therefore some degree of future proofing required in data management,” Bhattacharjee said.

In the near term, such changing requirements will likely centre on stress testing and new living wills regulation, he contended. These represent a demand for information that was not previously required, or not dealt with in the same manner, such as assessments of intercompany exposures.

Bhattacharjee’s recommendation was for firms to be proactive and tackle their data management challenges before they are faced with a regulatory mandate to do so. He concluded by suggesting that best practices should perhaps come from outside the financial services industry, recommending the recruitment of individuals from industries in which data management has been tackled to a greater extent, such as pharmaceuticals or manufacturing.

