Tackling the TRIM Challenge – How Banks Can Get their Data Quality Processes up to Scratch

By Martijn Groot, Vice President of Product Management, Asset Control

The Targeted Review of Internal Models (TRIM) is underway and is significantly impacting banks across the eurozone. TRIM is an initiative of the European Central Bank (ECB), designed to assess whether the internal risk assessment models used by ECB-supervised banks comply with regulatory requirements and whether their results are reliable and comparable.

As part of the programme, the ECB is engaged in a process of reviewing the banks’ models, providing them with ‘homework’ to improve their processes, and then returning to inspect again. In carrying out this process, however, the ECB understands that detailed discussions with the banks about their risk assessment models will be of little value if they can’t trust the data that is fed into them.

Data Quality Principles

TRIM can be said to build on the results of the Basel Committee on Banking Supervision’s BCBS 239 document, published in 2013. While BCBS 239 laid out 14 principles for risk data aggregation and reporting for banks to abide by, it was quite generic in nature. TRIM is more specific, especially around data quality aspects and measurements.

In fact, TRIM provides a range of governance principles for developing a data quality framework that covers relevant data quality dimensions including completeness, timeliness, accuracy, consistency and traceability.
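To make those dimensions concrete, the sketch below shows one way such checks might be expressed in code. It is a minimal illustration only: the pandas-based function, the column names (instrument_id, price_date, price) and the tolerance values are assumptions for the example, not requirements taken from the TRIM guide.

```python
# Minimal sketch of per-dimension data quality metrics over an end-of-day
# price table. Column names and thresholds are illustrative assumptions.
import pandas as pd

def quality_metrics(prices: pd.DataFrame, as_of: pd.Timestamp) -> dict:
    metrics = {}

    # Completeness: share of rows with a populated price
    metrics["completeness"] = prices["price"].notna().mean()

    # Timeliness: share of instruments with a price on the as-of date
    latest = prices.groupby("instrument_id")["price_date"].max()
    metrics["timeliness"] = (latest == as_of).mean()

    # Accuracy (screen): share of prices within 5 standard deviations of
    # each instrument's own history -- a simple outlier check
    z = prices.groupby("instrument_id")["price"].transform(
        lambda s: (s - s.mean()) / s.std(ddof=0)
    )
    metrics["accuracy"] = (z.abs() <= 5).mean()

    # Consistency: share of instrument/date pairs where all contributing
    # sources agree within 1% of each other
    spread = prices.groupby(["instrument_id", "price_date"])["price"].agg(
        lambda s: (s.max() - s.min()) / s.mean() if s.mean() else 0.0
    )
    metrics["consistency"] = (spread <= 0.01).mean()

    return metrics
```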

To comply with TRIM, banks need to show that they can trace any price used historically in a model or in a financial instrument valuation back through the data supply chain to its original sources. They also need to know what processes have been carried out on the data: which checks have been conducted, what the sources are, what the original parameters and data quality rules were, and whether those have changed over time. Traceability is the term the TRIM document uses for this; data lineage, effectively the data lifecycle covering the data’s origins and where it moves over time, is the broader term more widely used in the data management arena.
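One way to capture this kind of lineage is to store, alongside every published value, a record of where it came from and which rules acted on it. The structure below is a hypothetical illustration of such a record; the field names are assumptions and not a format prescribed by TRIM.

```python
# Hypothetical lineage record attached to each published data point, capturing
# the source, the checks applied and the rule parameters in force at the time.
from dataclasses import dataclass
from datetime import date

@dataclass(frozen=True)
class QualityCheck:
    rule_name: str        # e.g. "stale_price" or "cross_source_tolerance"
    parameters: dict      # the rule parameters as configured on that date
    passed: bool

@dataclass(frozen=True)
class LineageRecord:
    instrument_id: str
    price_date: date
    value: float
    source: str                # original vendor feed or internal desk
    rule_set_version: str      # ties the value to the rule history in force
    checks: tuple = ()         # QualityCheck results applied to the value
    derived_from: tuple = ()   # identifiers of upstream records, if any
```

Keeping records like this for every value a model consumes is what makes it possible to answer, after the fact, which source, which rules and which parameters produced a given input.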

The TRIM document also contains important reporting guidelines – including that banks will need to report on how often they have proxied their market data inputs or risk calculations. This in turn requires a defined process for how a bank derives and validates each proxy: is it really a comparable instrument, and does it behave similarly to the original instrument?
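As an illustration of what such a validation could look like, the sketch below compares a proxy’s daily returns with those of the original instrument. The correlation and tracking error thresholds are purely illustrative assumptions, not values taken from TRIM.

```python
# Hypothetical proxy validation: does the proxy's return series track the
# original instrument closely enough over a lookback window? Thresholds are
# illustrative only.
import numpy as np
import pandas as pd

def validate_proxy(original: pd.Series, proxy: pd.Series,
                   min_corr: float = 0.90, max_te: float = 0.02) -> dict:
    """Both inputs are daily price series indexed by date."""
    returns = pd.concat([original, proxy], axis=1,
                        keys=["orig", "proxy"]).pct_change().dropna()
    corr = returns["orig"].corr(returns["proxy"])
    # annualised tracking error of the return differences
    tracking_error = (returns["orig"] - returns["proxy"]).std() * np.sqrt(252)
    return {
        "correlation": corr,
        "tracking_error": tracking_error,
        "acceptable": bool(corr >= min_corr and tracking_error <= max_te),
    }
```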

In other words, in line with the focus on data quality in TRIM, it is important that banks are regularly validating their proxies. Finally, and to ensure they have a better grasp of the quality of the market data they use in risk calculations, banks need to have a handle on how much data is stale per asset class.
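A staleness measure of that kind could be as simple as the sketch below, which flags an instrument as stale when its last few prices have not moved and then reports the share of stale instruments per asset class. The five-day threshold and the column names are assumptions for the example.

```python
# Simple sketch of a staleness report: an instrument counts as stale when its
# most recent prices have not changed, aggregated per asset class.
import pandas as pd

def staleness_by_asset_class(prices: pd.DataFrame,
                             unchanged_days: int = 5) -> pd.Series:
    """`prices` has columns: instrument_id, asset_class, price_date, price."""
    prices = prices.sort_values(["instrument_id", "price_date"])

    def is_stale(group: pd.DataFrame) -> bool:
        tail = group["price"].tail(unchanged_days)
        return len(tail) == unchanged_days and tail.nunique() == 1

    stale = prices.groupby("instrument_id").apply(is_stale)
    asset_class = prices.groupby("instrument_id")["asset_class"].first()
    # fraction of stale instruments within each asset class
    return stale.groupby(asset_class).mean()
```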

Today, most banks would struggle to comply with many of the data quality guidelines that TRIM puts in place. Most have no group-wide data quality or control framework or, at best, assess quality in isolated silos. As such, they don’t have the ability to report daily on key data and metrics. They may have implemented checks and controls, but generally they have little real insight into data across the whole chain. Very few have a full audit trail in place that describes how data flows from sources through quality checks and workflows into the financial models, and that tracks not just data values, but also the rules and the rule parameters that have acted on the data.

Finding a Way Forward

So, how can banks effectively meet the TRIM guidelines? Banks first need to get the basic processes right. That means putting a robust data governance and data quality framework in place. To do that, they need to document their data management principles and policies. They also need to agree on a common data dictionary and understand exactly what they are measuring, including how they define financial products across the group and the control model for the whole lifecycle.
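By way of illustration, a shared data dictionary entry might capture an agreed definition, an owner, and the controls that apply over an attribute’s lifecycle, along the lines of the hypothetical sketch below. None of the field names or values come from TRIM; they simply show the kind of agreement such a dictionary records.

```python
# Hypothetical shape of a group-wide data dictionary entry: one agreed
# definition per attribute, plus the quality rules and lifecycle controls
# that govern it. All names and values are illustrative.
from dataclasses import dataclass

@dataclass(frozen=True)
class DictionaryEntry:
    attribute: str             # e.g. "clean_price"
    definition: str            # business definition agreed across the group
    data_type: str             # e.g. "decimal(18,6)"
    owner: str                 # accountable data owner
    quality_rules: tuple       # names of the validation rules applied
    lifecycle_controls: tuple  # e.g. approvals and periodic reviews

clean_price = DictionaryEntry(
    attribute="clean_price",
    definition="End-of-day clean price used for valuations and risk inputs",
    data_type="decimal(18,6)",
    owner="Market Data Operations",
    quality_rules=("stale_price", "cross_source_tolerance"),
    lifecycle_controls=("four-eyes approval", "annual definition review"),
)
```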

The next stage is putting in place the technology that enables banks to achieve this. Organisations first need a data management system that has the end-to-end capability to gather, integrate and master key data, derive risk factors and publish them to different groups. This should provide banks with a single funnel and consistent set of data and data quality metrics that support TRIM compliance.
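Conceptually, that end-to-end flow can be pictured as a short pipeline: gather raw data from sources, master it into a single consistent record, derive the risk factors and publish the result to downstream groups. The sketch below is a schematic illustration of that flow; the function names, columns and the simple priority-based mastering rule are assumptions for the example, not a description of any particular vendor system.

```python
# Schematic sketch of an end-to-end flow: gather, master, derive, publish.
# Function names, columns and the mastering rule are illustrative assumptions.
import numpy as np
import pandas as pd

def gather(source_files: list) -> pd.DataFrame:
    """Collect raw end-of-day price files from each configured source."""
    return pd.concat([pd.read_csv(path) for path in source_files],
                     ignore_index=True)

def master(raw: pd.DataFrame) -> pd.DataFrame:
    """Resolve multiple sources to one golden price per instrument and date,
    here by simply preferring the highest-priority source."""
    return (raw.sort_values("source_priority")
               .drop_duplicates(subset=["instrument_id", "price_date"],
                                keep="first"))

def derive_risk_factors(golden: pd.DataFrame) -> pd.DataFrame:
    """Derive a simple risk factor, e.g. daily log returns per instrument."""
    golden = golden.sort_values(["instrument_id", "price_date"]).copy()
    golden["log_return"] = golden.groupby("instrument_id")["price"].transform(
        lambda s: np.log(s).diff()
    )
    return golden

def publish(golden: pd.DataFrame, consumers: list) -> None:
    """Hand the single, consistent data set to each downstream group."""
    for deliver in consumers:
        deliver(golden)
```

The value of funnelling everything through one such flow is that every downstream model sees the same data and the same quality metrics, which is exactly the consistency TRIM probes for.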

It is worth highlighting, too, that there are benefits on offer for banks that go beyond simply complying with TRIM – important though that is. Some of the remediation they will have to do to comply will also be required for other key regulations, including the Fundamental Review of the Trading Book (FRTB). However, for many, TRIM is the current focus, and with the programme expected to run only until 2020, banks know there is still much work to do to meet its guidelines.
