
A-Team Insight Blogs

Citigroup Fine Shows Importance of Having Robust Data Setup


The US$136 million fine meted out to Citigroup for data irregularities dating back to 2020 should serve as a warning to all financial institutions that robust data management is essential to avoid sanctions amid tougher regulatory regimes.

The Federal Reserve and Office of the Comptroller of the Currency (OCC) jointly imposed the penalty on the international banking group after finding it had insufficient data management risk controls in place. The group was also told to carry out quarterly checks to ensure it has adequate safeguards.

The action has been seen as a warning that regulators will take a tough stance against data management failings that could have a detrimental impact on banks’ clients and their business. Charlie Browne, head of market data, quant and risk solutions at enterprise data management services provider GoldenSource, said the fine shows that there can be no hiding bad practices.

Greater Scrutiny

“Citigroup’s fine should be a warning to other banks and institutions who may have believed their insufficient data and risk controls could fly under the radar,” Browne told Data Management Insight. “It’s time to adapt, or be forced to pay up.”

Financial institutions’ data management structures are likely to come under greater regulatory scrutiny to protect customers as more of their activities are digitalised, artificial intelligence is incorporated into their technology systems and crypto finance gains wider acceptance.

As well as data privacy protection measures, organisations will be expected to tighten controls on many other data domains including trading information and ESG reporting. The fallout from the collapse of Silicon Valley Bank last year will also put pressure on lenders’ solvency requirements and crisis management, processes that are heavily data-dependent.

Data Care

Browne said the penalty imposed on Citigroup showed that institutions had to take greater care with their data and control models because regulators are very aware of how important digital information is to the efficient running of all parts of an enterprise’s operations.

“This fining of Citigroup demonstrates the very real costs associated with banks not being on top of their risk controls and data management,” he said.

“It’s a bold statement from the US rule makers that banks showing complacency about their data issues will be met with regulatory action. Regulators globally are now coming to the understanding that it’s fundamental that financial institutions have effective data management strategies.”

While breaches of Europe’s General Data Protection Regulation (GDPR) and anti-money laundering rules have already been at the root of fines imposed on banks and financial services firms, penalties related to operational use of data are expected to grow.

For example, institutions interviewed by A-Team Group have regularly said they are closely examining the data privacy and IP implications of using outputs from generative AI applications. Their concern is that the generated content will be in breach of copyright in material on which the model has been trained.

Non-Negotiable

Browne’s comments were echoed by the founder and chief executive of Monte Carlo Data, Barr Moses, who said that as data needs become central to firms’ operations, “data quality becomes non-negotiable”.

“In 2024 data quality isn’t open for discussion — it’s a clear and present risk and it needs our attention,” Moses wrote on LinkedIn.

Browne said that ensuring compliance will require strenuous efforts by organisations to go deep into their data capabilities and processes.

“Data quality and accessibility are, rightly, front of mind, however, it’s also vital that banks consider concepts like data governance and data lineage when assessing the efficiency of their systems and adequately managing their risk. Being able to track data back to source is an important tool that rule makers are increasingly looking to demand of banks, visible in regulations like the ECB’s Risk Data Aggregation and Risk Reporting (RDARR) measures.”
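To make the lineage concept concrete, the sketch below shows one highly simplified way a firm might record where each dataset in a reporting pipeline came from and how it was transformed, so that a figure in a risk report can be traced back to its source system. It is an illustration only, assuming a Python environment; the names LineageRecord, LineageGraph and trace_to_source are hypothetical and are not drawn from any product or regulation mentioned in this article.

# Minimal, illustrative data lineage sketch -- not any vendor's actual implementation.
from dataclasses import dataclass, field
from datetime import datetime, timezone


@dataclass
class LineageRecord:
    """One hop in a dataset's journey from source system to report."""
    dataset: str          # e.g. "eod_positions"
    source: str           # upstream dataset or system it was derived from
    transformation: str   # what was done at this step
    recorded_at: str = field(
        default_factory=lambda: datetime.now(timezone.utc).isoformat()
    )


class LineageGraph:
    """Stores lineage hops and walks them back to the original source."""

    def __init__(self):
        # Maps a dataset name to the record of the hop that produced it.
        self._records = {}

    def record(self, rec: LineageRecord) -> None:
        self._records[rec.dataset] = rec

    def trace_to_source(self, dataset: str) -> list:
        """Return the chain of hops from `dataset` back to its root source."""
        chain = []
        while dataset in self._records:
            rec = self._records[dataset]
            chain.append(rec)
            dataset = rec.source
        return chain


if __name__ == "__main__":
    graph = LineageGraph()
    graph.record(LineageRecord("raw_trades", "trading_system_feed", "ingested as-is"))
    graph.record(LineageRecord("eod_positions", "raw_trades", "aggregated by book and instrument"))
    graph.record(LineageRecord("risk_report", "eod_positions", "risk figures calculated and formatted for the regulator"))

    # Walk the risk report back to its source, hop by hop.
    for hop in graph.trace_to_source("risk_report"):
        print(f"{hop.dataset} <- {hop.source}: {hop.transformation}")

In practice, the point of regimes such as RDARR is that this kind of traceability is maintained systematically across an institution's data estate rather than in an ad hoc script, but the underlying question is the same one the sketch answers: where did this number come from, and what happened to it along the way?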

