

Huge Number of Ad Hoc Regulatory Requests is Adding to the Data Challenge, Says BarCap’s Bhattacharjee


Financial services firms are facing a huge number of ad hoc reporting requirements from the regulatory community, and this is putting significant pressure on their data infrastructures, said Kris Bhattacharjee, global head of regulatory policy and reporting at Barclays Capital. Speaking to attendees at Thomson Reuters’ roundtable earlier this week, he explained that BarCap has been compelled to invest in its data infrastructure to meet these pressures, because both the business and regulators are looking for more granular data.

Firms need to be able to allocate capital “very carefully” in the current climate, where liquidity is much scarcer than before the crisis, Bhattacharjee explained, and this means they need reliable, granular data on which to base those allocation decisions. Regulatory compliance has also placed onerous requirements on firms, and these are set to get stronger still: “The bar is seemingly being raised every six months for financial reporting and there is a whole host of new regulatory requirements on its way.”

The front office business and the regulatory reporting function are therefore both pushing firms to invest in data management. Bhattacharjee identified a number of key metrics against which these data management systems are being evaluated.

“The timeliness and frequency of data is essential when deciding actions to take with regards to front office positions. It has also come to the fore with regards to regulatory reporting,” he explained. For example, reporting against capital adequacy requirements, once a quarterly exercise, has become much more frequent: 10 working days in the case of the UK Financial Services Authority. Moreover, in some instances firms are expected to produce these ad hoc regulatory reports in as near to real time as possible, said Bhattacharjee.

Another key metric is the accuracy and completeness of data, he continued: “The data provided to end users has a premium for accuracy and completeness. There is less tolerance from regulators about inaccuracy.” Transparency requirements have also made completeness of data an issue: more data is needed, especially in areas such as pricing.

The industry’s move towards a more holistic approach to risk management has made the granularity of data much more important. This includes capital usage at the position level and assessments of counterparty risk at the portfolio level, he elaborated. Capital calculations under Basel II also require a much more granular approach to risk data, he added.

“These rules are still evolving and firms need to be flexible in their approach to data management in order to be able to meet these new requirements. You may need to slice and dice the data in a different way. There is therefore some degree of future-proofing required in data management,” Bhattacharjee said.

In the near term, he contended, such changing requirements will likely centre on stress testing and new living wills regulation. This represents a demand for information that was not previously required, or not previously dealt with in the same manner, such as the assessment of intercompany exposures.

Bhattacharjee’s recommendation for the future was for firms to be proactive and tackle their data management challenges before they are faced with a regulatory mandate to do so. He concluded by suggesting that best practices should perhaps come from outside the financial services industry, and recommended the recruitment of an individual from an industry in which data management has been tackled to a greater extent, such as pharmaceuticals or manufacturing.
