A-Team Insight Blogs

Tackling the TRIM Challenge – How Banks Can Get their Data Quality Processes up to Scratch

By Martijn Groot, Vice President of Product Management, Asset Control

The Targeted Review of Internal Models (TRIM) is underway and is significantly impacting banks across the eurozone. TRIM is an initiative of the European Central Bank (ECB), designed to assess whether the internal risk assessment models used by banks under ECB supervision comply with regulatory requirements, and whether their results are reliable and comparable.

As part of the programme, the ECB reviews the banks’ models, gives the banks ‘homework’ to improve their processes, and then returns to inspect again. In carrying out this process, however, the ECB understands that detailed discussions with the banks about their risk assessment models will be of little value if the data fed into those models cannot be trusted.

Data Quality Principles

TRIM can be seen as building on the Basel Committee on Banking Supervision’s BCBS 239 paper, published in 2013. While BCBS 239 laid out 14 principles for risk data aggregation and risk reporting for banks to abide by, it was quite generic in nature. TRIM is more specific, especially around data quality aspects and measurements.

In fact, TRIM provides a range of governance principles for developing a data quality framework covering the relevant data quality dimensions, including completeness, timeliness, accuracy, consistency and traceability.
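
To make these dimensions concrete, the sketch below scores a handful of price records against three of them. It is a minimal illustration: the record fields, the 19:00 UTC publication cutoff and the 2% cross-source tolerance are all assumptions made for the example, not values taken from the TRIM guide.

```python
from datetime import datetime, timezone

# Hypothetical end-of-day price records; field names are illustrative.
records = [
    {"instrument": "DE0001102341", "price": 101.42, "source": "vendor_a",
     "as_of": datetime(2019, 3, 4, 18, 2, tzinfo=timezone.utc)},
    {"instrument": "DE0001102341", "price": None, "source": "vendor_a",
     "as_of": datetime(2019, 3, 5, 18, 1, tzinfo=timezone.utc)},
    {"instrument": "XS1405766612", "price": 99.87, "source": "vendor_b",
     "as_of": datetime(2019, 3, 4, 23, 30, tzinfo=timezone.utc)},
]

CUTOFF_HOUR_UTC = 19  # assumed daily publication deadline

def completeness(recs):
    """Share of records with a populated price."""
    return sum(r["price"] is not None for r in recs) / len(recs)

def timeliness(recs):
    """Share of records that arrived before the daily cutoff."""
    return sum(r["as_of"].hour < CUTOFF_HOUR_UTC for r in recs) / len(recs)

def consistency(recs, second_source, tolerance=0.02):
    """Share of priced records within tolerance of a second source."""
    checked = [r for r in recs
               if r["price"] is not None and r["instrument"] in second_source]
    ok = sum(abs(r["price"] - second_source[r["instrument"]])
             / second_source[r["instrument"]] <= tolerance for r in checked)
    return ok / len(checked)

# Prices from an assumed second source, used for the consistency check.
alt = {"DE0001102341": 101.40, "XS1405766612": 99.10}

print(f"completeness: {completeness(records):.0%}")
print(f"timeliness:   {timeliness(records):.0%}")
print(f"consistency:  {consistency(records, alt):.0%}")
```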

To comply with TRIM, banks need to show that they can trace the price they have historically used for a model or a financial instrument valuation back through the data supply chain to its original sources. They also need to know what processes have been carried out on the data: which checks were conducted, what the sources are, what the original parameters and data quality rules were, and whether these have changed over time. Traceability is the term the TRIM document uses for this; data lineage, effectively the data lifecycle covering the data’s origins and where it moves over time, is the broader term more widely used in the data management arena.
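
As a sketch of what lineage capture might look like in practice, the example below models each step in a price’s journey as a record holding the source, the rule applied and its parameters. The structure and field names are illustrative assumptions, not a format prescribed by TRIM.

```python
from dataclasses import dataclass, field
from typing import Any, Dict, Optional

@dataclass(frozen=True)
class LineageEntry:
    """One step in a price's journey from source to model input."""
    step: str                          # e.g. "ingest", "spike_check"
    actor: str                         # feed or system that performed the step
    timestamp: str                     # when the step ran
    rule: Optional[str] = None         # data quality rule applied, if any
    rule_params: Dict[str, Any] = field(default_factory=dict)
    value_in: Optional[float] = None
    value_out: Optional[float] = None

# Reconstructing the history of one price used in a valuation:
history = [
    LineageEntry("ingest", "vendor_a", "2019-03-04T18:02Z", value_out=101.42),
    LineageEntry("spike_check", "dq_engine", "2019-03-04T18:05Z",
                 rule="max_daily_move", rule_params={"threshold_pct": 5.0},
                 value_in=101.42, value_out=101.42),
]

for entry in history:
    print(entry.timestamp, entry.step, entry.rule or "-", entry.value_out)
```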

The TRIM document also contains important reporting guidelines, including that banks will need to report on how often they have proxied their market data inputs or risk calculations. This in turn requires a defined process for how a bank has derived and validated each proxy. Is it really a comparable instrument? Does it behave similarly to the original instrument?
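
One simple way to test behavioural similarity is to correlate the daily returns of the proxy with those of the original instrument, as sketched below. The return series and the 0.9 acceptance threshold are hypothetical; a real validation process would use more than a single statistic.

```python
import statistics

# Illustrative daily returns; in practice these would come from the bank's
# market data store. Series and threshold are assumptions, not TRIM values.
original_returns = [0.0012, -0.0034, 0.0021, 0.0008, -0.0015, 0.0027]
proxy_returns    = [0.0010, -0.0030, 0.0024, 0.0006, -0.0013, 0.0025]

MIN_CORRELATION = 0.9  # hypothetical acceptance threshold

# Pearson correlation; statistics.correlation requires Python 3.10+.
corr = statistics.correlation(original_returns, proxy_returns)
print(f"return correlation: {corr:.3f}")
print("proxy accepted" if corr >= MIN_CORRELATION else "proxy flagged for review")
```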

In other words, in line with TRIM’s focus on data quality, it is important that banks regularly validate their proxies. Finally, to get a better grasp of the quality of the market data they use in risk calculations, banks need a handle on how much of that data is stale, per asset class.
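
A basic staleness metric can be computed by looking for runs of unchanged prices and aggregating per asset class, as in the sketch below. The three-observation threshold and the sample series are assumptions for illustration.

```python
from collections import defaultdict

# Hypothetical daily price series keyed by (asset_class, instrument).
series = {
    ("govt_bond", "DE0001102341"): [101.4, 101.4, 101.4, 101.4, 101.5],
    ("corp_bond", "XS1405766612"): [99.8, 99.8, 99.9, 99.9, 99.9],
    ("equity",    "DE0007164600"): [95.1, 95.4, 95.2, 95.6, 95.3],
}

STALE_AFTER = 3  # assumed: unchanged for 3+ consecutive observations = stale

def longest_run(prices):
    """Length of the longest run of identical consecutive prices."""
    best = run = 1
    for prev, cur in zip(prices, prices[1:]):
        run = run + 1 if cur == prev else 1
        best = max(best, run)
    return best

stale_counts = defaultdict(lambda: [0, 0])  # asset_class -> [stale, total]
for (asset_class, _), prices in series.items():
    stale_counts[asset_class][1] += 1
    if longest_run(prices) >= STALE_AFTER:
        stale_counts[asset_class][0] += 1

for asset_class, (stale, total) in stale_counts.items():
    print(f"{asset_class}: {stale}/{total} instruments stale")
```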

Today, most banks would struggle to comply with many of the data quality guidelines that TRIM puts in place. Most have no data quality or control framework in place or, at best, assess quality in isolated silos. As such, they cannot report daily on key data and metrics. They may have implemented checks and controls, but generally they have little real insight into data across the whole chain. Very few have a full audit trail that describes how data flows from sources through quality checks and workflows into the financial models, and that tracks not just data values but also the rules and rule parameters that have acted on the data.

Finding a Way Forward

So, how can banks effectively meet the TRIM guidelines? First, they need to get the basic processes right. That means putting a robust data governance and data quality framework in place. To do that, they need to document their data management principles and policies. They also need to agree a common data dictionary and understand more clearly exactly what they are measuring, including how financial products are defined across the group and how the control model covers the whole data lifecycle.
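
A data dictionary entry need not be elaborate to be useful. The sketch below shows one possible minimal structure; the fields and the example attribute are illustrative assumptions, not a standard schema.

```python
from dataclasses import dataclass, field
from typing import List

@dataclass
class DictionaryEntry:
    """A minimal data dictionary entry; field names are illustrative."""
    attribute: str
    definition: str
    owner: str                      # accountable business function
    domain: str                     # permissible values or format
    lifecycle_controls: List[str] = field(default_factory=list)

clean_price = DictionaryEntry(
    attribute="clean_price",
    definition="End-of-day price excluding accrued interest, in quote currency",
    owner="Market Data Operations",
    domain="decimal > 0, four decimal places",
    lifecycle_controls=["four-eyes approval on manual overrides",
                        "source and timestamp retained for audit"],
)
print(f"{clean_price.attribute}: {clean_price.definition} "
      f"(owner: {clean_price.owner})")
```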

The next stage is putting in place the technology that enables this. Organisations first need a data management system with the end-to-end capability to gather, integrate and master key data, derive risk factors and publish them to different groups. This should provide banks with a single funnel and a consistent set of data and data quality metrics that support TRIM compliance.
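
The sketch below traces that gather, master, derive and publish flow in miniature. The function names, the first-source-wins mastering rule and the bid/ask mid derivation are all simplifying assumptions, not any particular vendor’s design.

```python
def gather(feeds):
    """Collect raw quotes from all configured sources."""
    return [quote for feed in feeds for quote in feed]

def master(raw_quotes):
    """Build one 'golden' record per instrument (here: first source wins)."""
    golden = {}
    for q in raw_quotes:
        golden.setdefault(q["instrument"], q)
    return golden

def derive_risk_factors(golden):
    """Derive model inputs; here a trivial mid price from bid/ask."""
    return {ins: (q["bid"] + q["ask"]) / 2 for ins, q in golden.items()}

def publish(risk_factors, consumers):
    """Push the derived factors to each downstream consumer."""
    for consumer in consumers:
        consumer(risk_factors)

# Two assumed feeds with one overlapping instrument:
feeds = [
    [{"instrument": "DE0001102341", "bid": 101.40, "ask": 101.44}],
    [{"instrument": "DE0001102341", "bid": 101.38, "ask": 101.46},
     {"instrument": "XS1405766612", "bid": 99.80, "ask": 99.90}],
]
publish(derive_risk_factors(master(gather(feeds))), [print])
```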

It is worth highlighting, too, that the benefits on offer go beyond simply complying with TRIM, important though that is. Some of the remediation banks must undertake for TRIM will also be required for other key regulations, including the Fundamental Review of the Trading Book (FRTB). For many, however, TRIM is the current focus, and with the programme expected to run only until 2020, banks know there is still much work to do to meet its guidelines.
