Huge Number of Ad Hoc Regulatory Requests is Adding to the Data Challenge, Says BarCap’s Bhattacharjee

Financial services firms face a huge number of ad hoc, short-notice reporting requests from the regulatory community, and this is putting significant pressure on their data infrastructures, said Kris Bhattacharjee, global head of regulatory policy and reporting at Barclays Capital. He told attendees at Thomson Reuters’ roundtable earlier this week that BarCap has been compelled to invest in its data infrastructure to meet these pressures because the business and regulators are both looking for more granular data.

Firms need to be able to allocate capital “very carefully” in the current climate, where liquidity is much scarcer than before the crisis, Bhattacharjee explained, and this means they need reliable and granular data in order to make these assessments. Regulatory compliance has also placed rather onerous requirements on firms, and these are set to get even stronger: “The bar is seemingly being raised every six months for financial reporting and there is a whole host of new regulatory requirements on its way.”

Demands from the business in the front office and from regulatory reporting functions are therefore combining to push firms to invest in data management. Bhattacharjee identified a number of key metrics against which these data management systems are being evaluated.

“The timeliness and frequency of data is essential when deciding actions to take with regards to front office positions. It has also come to the fore with regards to regulatory reporting,” he explained. For example, quarterly reporting for regulatory requirements around capital adequacy has become much more frequent: every 10 working days in the case of the UK Financial Services Authority, in fact. Moreover, in some instances firms are expected to produce the ad hoc regulatory reports in as near to real time as possible, said Bhattacharjee.

Another key metric is accuracy or completeness of data, he continued: “The data provided to end users has a premium for accuracy and completeness. There is less tolerance from regulators about inaccuracy.” Transparency requirements have also meant that completeness of data is an issue: more data is needed, especially around areas such as pricing.

The industry’s move towards a more holistic approach to risk management has meant the granularity of data is much more important. This includes capital usage at position level and assessments of counterparty risk on a portfolio level, he elaborated. Capital calculations under Basel II also require a much more granular approach to risk data, he added.

“These rules are still evolving and firms need to be flexible in their approach to data management in order to be able to meet these new requirements. You may need to slice and dice the data in a different way. There is therefore some degree of future proofing required in data management,” Bhattacharjee said.
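
This flexibility point lends itself to a brief illustration. The sketch below is not BarCap’s approach; it is a minimal, hypothetical Python example (the field names, such as counterparty, legal_entity and capital_usage, and all figures are invented) of why holding data at the most granular, position level makes it cheap to re-aggregate along whatever dimension a new ad hoc request asks for.

    from collections import defaultdict

    # Hypothetical position-level records. Holding data at the most granular
    # level keeps the aggregation choice open: counterparty, legal entity,
    # asset class, or whatever dimension the next ad hoc request calls for.
    positions = [
        {"counterparty": "CPTY_A", "legal_entity": "UK_BANK",   "asset_class": "rates",
         "exposure": 1_200_000, "capital_usage": 96_000},
        {"counterparty": "CPTY_A", "legal_entity": "US_BROKER", "asset_class": "credit",
         "exposure": 800_000, "capital_usage": 64_000},
        {"counterparty": "CPTY_B", "legal_entity": "UK_BANK",   "asset_class": "rates",
         "exposure": 500_000, "capital_usage": 40_000},
    ]

    def aggregate(records, dimensions, measure):
        """Sum one measure over any combination of dimensions ('slice and dice')."""
        totals = defaultdict(float)
        for record in records:
            key = tuple(record[d] for d in dimensions)
            totals[key] += record[measure]
        return dict(totals)

    # Portfolio-style counterparty risk view built from position-level data.
    print(aggregate(positions, ["counterparty"], "exposure"))

    # The same records re-cut by legal entity and asset class when a new
    # request asks for capital usage on a different breakdown.
    print(aggregate(positions, ["legal_entity", "asset_class"], "capital_usage"))

The point of the sketch is simply that the aggregation is parameterised by dimension rather than hard-coded, so a new regulatory cut of the data becomes a query change rather than an infrastructure change.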

In the near term, changing requirements such as these will likely centre on stress testing and new living wills regulation, he contended. These represent demands for information that was not previously required, or was not dealt with in the same manner, such as assessments of intercompany exposures.

Bhattacharjee’s recommendation for the future was for firms to be proactive and tackle their data management challenges before they are faced with a regulatory mandate to do so. He concluded by suggesting that best practices should perhaps come from outside the financial services industry, and recommended recruiting an individual from an industry in which data management has been tackled to a greater extent, such as pharmaceuticals or manufacturing.
