The knowledge platform for the financial technology industry

A-Team Insight Blogs

Huge Number of Ad Hoc Regulatory Requests is Adding to the Data Challenge, Says BarCap’s Bhattacharjee


Financial services firms are faced with a huge number of on-the-hoof, ad hoc reporting requests from the regulatory community, and this is putting significant pressure on their data infrastructures, said Kris Bhattacharjee, global head of regulatory policy and reporting at Barclays Capital. He explained to attendees at a Thomson Reuters roundtable earlier this week that BarCap has been compelled to invest in its data infrastructure to meet these pressures because the business and regulators are both looking for more granular data.

Firms need to be able to allocate capital “very carefully” in the current climate, where liquidity is much scarcer than before the crisis, Bhattacharjee explained, and this means they need reliable and granular data in order to make these assessments. Regulatory compliance has also placed onerous requirements on firms, and these are set to become even more stringent: “The bar is seemingly being raised every six months for financial reporting and there is a whole host of new regulatory requirements on its way.”

Front office business demands and regulatory reporting functions are therefore combining to push firms to invest in data management. Bhattacharjee identified a number of key metrics against which these data management systems are being evaluated.

“The timeliness and frequency of data is essential when deciding actions to take with regards to front office positions. It has also come to the fore with regards to regulatory reporting,” he explained. For example, quarterly regulatory reporting around capital adequacy is now subject to much tighter turnaround: 10 working days for the UK Financial Services Authority, in fact. Moreover, in some instances firms are expected to produce the ad hoc regulatory reports in as near to real time as possible, said Bhattacharjee.

Another key metric is accuracy or completeness of data, he continued: “The data provided to end users has a premium for accuracy and completeness. There is less tolerance from regulators about inaccuracy.” Transparency requirements have also meant that completeness of data is an issue: more data is needed, especially around areas such as pricing.

The industry’s move towards a more holistic approach to risk management has meant the granularity of data is much more important. This includes capital usage at position level and assessments of counterparty risk on a portfolio level, he elaborated. Capital calculations under Basel II also require a much more granular approach to risk data, he added.

“These rules are still evolving and firms need to be flexible in their approach to data management in order to be able to meet these new requirements. You may need to slice and dice the data in a different way. There is therefore some degree of future proofing required in data management,” Bhattacharjee said.
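The flexibility Bhattacharjee describes — rolling the same granular position-level data up along whichever dimension a new request demands — can be sketched as follows. This is a minimal illustration only: the position records, field names and figures are assumptions for the example, not BarCap's actual data model.

```python
from collections import defaultdict

# Hypothetical position-level records of the granular kind described above.
# Fields and figures (capital usage in thousands) are purely illustrative.
positions = [
    {"desk": "rates",  "counterparty": "Bank A", "capital_usage": 120},
    {"desk": "rates",  "counterparty": "Bank B", "capital_usage": 80},
    {"desk": "credit", "counterparty": "Bank A", "capital_usage": 250},
]

def aggregate(records, dimension, measure):
    """Roll granular records up along an arbitrary dimension, so the
    same data can answer a differently-sliced ad hoc request."""
    totals = defaultdict(float)
    for rec in records:
        totals[rec[dimension]] += rec[measure]
    return dict(totals)

# Portfolio-level counterparty exposure, built from position-level data:
by_counterparty = aggregate(positions, "counterparty", "capital_usage")
# The same data sliced by desk instead, with no change to the store:
by_desk = aggregate(positions, "desk", "capital_usage")
```

Because the aggregation dimension is a parameter rather than being baked into the schema, a new regulatory request only changes the query, not the underlying data infrastructure — which is the kind of future proofing the quote above argues for.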

In the near term, changing requirements such as these will likely centre on stress testing and new living wills regulation, he contended. These represent demands for information that was not previously required, or not dealt with in the same manner, such as assessments of intercompany exposures.

Bhattacharjee’s recommendation for the future was for firms to be proactive and tackle their data management challenges before they are faced with a regulatory mandate to do so. He concluded by suggesting that best practices should perhaps come from outside the financial services industry, and recommended recruiting individuals from industries in which data management has been tackled to a greater extent, such as pharmaceuticals or manufacturing.

