
A-Team Insight Blogs

Citigroup Fine Shows Importance of Having Robust Data Setup


The US$136 million fine meted out to Citigroup for data irregularities dating back to 2020 should serve as a warning to all financial institutions that robust data management is essential to avoid sanctions amid tougher regulatory regimes.

The Federal Reserve and Office of the Comptroller of the Currency (OCC) jointly imposed the penalty on the international banking group after it was found to have put in place insufficient data management risk controls. Further, the group was told to hold quarterly checks to ensure it has safeguards in place.

The action has been seen as a warning that regulators will take a tough stance against data management failings that could have a detrimental impact on banks’ clients and their business. Charlie Browne, head of market data, quant and risk solutions at enterprise data management services provider GoldenSource, said the fine shows that there can be no hiding bad practices.

Greater Scrutiny

“Citigroup’s fine should be a warning to other banks and institutions who may have believed their insufficient data and risk controls could fly under the radar,” Browne told Data Management Insight. “It’s time to adapt, or be forced to pay up.”

Financial institutions’ data management structures are likely to come under greater regulatory scrutiny to protect customers as more of their activities are digitalised, as artificial intelligence is incorporated into tech systems, and amid growing acceptance of crypto finance.

As well as data privacy protection measures, organisations will be expected to tighten controls on many other data domains including trading information and ESG reporting. The fallout from the collapse of Silicon Valley Bank last year will also put pressure on lenders’ solvency requirements and crisis management, processes that are heavily data-dependent.

Data Care

Browne said the penalty imposed on Citigroup showed that institutions had to take greater care with their data and controls models because regulators are very aware of how important digital information is to the efficient running of all parts of an enterprise’s operations.

“This fining of Citigroup demonstrates the very real costs associated with banks not being on top of their risk controls and data management,” he said.

“It’s a bold statement from the US rule makers that banks showing complacency about their data issues will be met with regulatory action. Regulators globally are now coming to the understanding that it’s fundamental that financial institutions have effective data management strategies.”

While breaches of Europe’s General Data Protection Regulation (GDPR) and anti-money laundering rules have already been at the root of fines imposed on banks and financial services firms, penalties related to operational use of data are expected to grow.

For example, institutions interviewed by A-Team Group have regularly said they are closely examining the data privacy and IP implications of using outputs from generative AI applications. Their concern is that the content generated will be in breach of copyrighted material on which the model has been trained.

Non-Negotiable

Browne’s comments were echoed by the founder and chief executive of Monte Carlo Data, Barr Moses, who said that as data becomes central to firms’ operations, “data quality becomes non-negotiable”.

“In 2024 data quality isn’t open for discussion — it’s a clear and present risk and it needs our attention,” Moses wrote on LinkedIn.

Browne said that ensuring compliance will require organisations to examine their data capabilities and processes in depth.

“Data quality and accessibility are, rightly, front of mind. However, it’s also vital that banks consider concepts like data governance and data lineage when assessing the efficiency of their systems and adequately managing their risk. Being able to track data back to source is a capability that rule makers are increasingly looking to demand of banks, visible in regulations like the ECB’s Risk Data Aggregation and Risk Reporting (RDARR) measures.”
