Regulators Double Down on BCBS 239 Data Quality Focus

The European Central Bank (ECB) and other financial services regulators will increase their focus on improving the quality of data they receive from financial institutions over the next 24 months. Speakers at the Vermeg Annual Regulatory Reporting Conference, held in London in late June, said regulators are analysing the quality of the Basel III data reports they receive more deeply, and are seeking to engage with the industry on BCBS 239 compliance. Given ongoing concerns about data quality, increased enforcement is expected to become a tool that regulators use very soon.

At the moment, financial services firms are responsible for the quality of the data they send to fulfil Basel III reporting requirements – regulators expect them to send in files that are 100% correct and on time. However, it’s no secret that supervisors have been disappointed by the quality of the data they are receiving. There is a sense that banks are not adequately reviewing regulatory data reports before they are sent to the supervisors. For example, an attendee at the conference noted that one bank had to resubmit a set of regulatory data 74 times before it was correct. Issues include empty fields, fields filled in with data that is obviously wrong, and unexplained data revisions by firms.
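
Many of these basic errors are exactly the kind that automated pre-submission checks can catch. As a minimal sketch, assuming a report held as a list of records with hypothetical field names and plausibility thresholds (not taken from any actual Basel III template), such checks might look like this:

```python
# Illustrative pre-submission checks on a regulatory data report.
# Field names and plausibility rules are hypothetical examples only.

def validate_report(rows):
    """Return a list of data quality issues found before submission."""
    issues = []
    for i, row in enumerate(rows):
        # Empty fields: values that were never filled in.
        for field, value in row.items():
            if value in (None, ""):
                issues.append(f"row {i}: field '{field}' is empty")
        # Obviously wrong values: simple plausibility checks.
        exposure = row.get("exposure_value")
        if isinstance(exposure, (int, float)) and exposure < 0:
            issues.append(f"row {i}: negative exposure_value")
        risk_weight = row.get("risk_weight")
        if isinstance(risk_weight, (int, float)) and not 0 <= risk_weight <= 12.5:
            issues.append(f"row {i}: implausible risk_weight {risk_weight}")
    return issues

report = [
    {"exposure_value": 1_000_000, "risk_weight": 1.0},
    {"exposure_value": -50_000, "risk_weight": ""},
]
for issue in validate_report(report):
    print(issue)
```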

This concern over data quality is borne out by statistics. The European Central Bank posts information about the quality of data it receives from banks on its website, and this makes for eye-opening reading. Some 59% of submitting firms turned in regulatory reports with at least one failing validation rule in the fourth quarter of 2018. For the same period, almost 7% of data points were missing from these reports. This is in spite of the ECB already giving the issue of data quality considerable focus.

In February 2017, the regulator communicated its expectations for both internal modelling and data quality at firms. It followed this up with on-site inspections, in which it found most firms’ approach to data quality to be disappointing – practices at some firms were so poor that they received significant regulatory capital add-ons. In September 2018, the ECB published a consultation that strengthened requirements for governance, systems and processes around data quality.

Attendees at the event clearly expect the ECB to raise its game in forthcoming inspections of BCBS 239 regulatory reporting capabilities. To begin with, the ECB has a framework that it is actively using to assess firms’ approach to data quality. Supervisors will review firms’ approach with an eye to punctuality, accuracy, completeness, stability, plausibility, and reliability. The first three of these are considered ‘hard’ checks, in that they can be defined by metrics; the latter three are ‘soft’ checks, which require more regulatory judgement and discretion.
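
The three ‘hard’ dimensions are the ones that reduce naturally to metrics. As a minimal sketch, assuming deliberately simplified definitions (illustrative assumptions, not the ECB’s published scoring methodology), they might be measured along these lines:

```python
from datetime import datetime

# Simplified metrics for the three 'hard' checks: punctuality, accuracy
# and completeness. Definitions are assumptions for illustration only,
# not the ECB's actual assessment framework.

def punctuality(submitted_at, deadline):
    """Was the report submitted on or before its remittance deadline?"""
    return submitted_at <= deadline

def completeness(data_points):
    """Share of expected data points that were actually reported."""
    if not data_points:
        return 0.0
    return sum(1 for v in data_points if v is not None) / len(data_points)

def accuracy(validation_results):
    """Share of validation rules that passed."""
    if not validation_results:
        return 0.0
    return sum(validation_results) / len(validation_results)

# Example: a late report with one missing data point and one failing rule.
print(punctuality(datetime(2019, 1, 12), datetime(2019, 1, 11)))  # False
print(completeness([3.2, None, 7.1, 0.0]))                        # 0.75
print(accuracy([True, True, False, True]))                        # 0.75
```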

Going forward, it will be up to firms to demonstrate to the ECB that they are taking reasonable steps to produce accurate regulatory reporting. Firms that fall short can expect to enter into more intensive engagement with the ECB on the matter – essentially, a five-step escalation process. Regulators will hold discussions with firms that are failing to meet the required standard, and give them deadlines to resolve outstanding issues. If that fails to produce results, a letter will be sent to either the chief financial officer or the chief risk officer. The letter, again, will have a clear deadline by which the identified issues are expected to be resolved. If the issues are not fixed, a similar letter will go to the CEO or chair of the audit committee. If this too fails to elicit the right responses, the ECB will move to impose sanctions, including the possibility of both fines and the naming and shaming of institutions.

Attendees at the event suggested the ECB could engage in a number of targeted enforcement actions over the next 12 months to encourage firms to get their BCBS 239 data quality houses in order.
