The US$136 million fine meted out to Citigroup for data irregularities dating back to 2020 should serve as a warning to all financial institutions that robust data management is essential to avoid sanctions amid tougher regulatory regimes.
The Federal Reserve and the Office of the Comptroller of the Currency (OCC) jointly imposed the penalty on the international banking group after it was found to have insufficient data management and risk controls in place. The group was also told to conduct quarterly checks to ensure it has adequate safeguards.
The action has been seen as a warning that regulators will take a tough stance against data management failings that could have a detrimental impact on banks’ clients and their business. Charlie Browne, head of market data, quant and risk solutions at enterprise data management services provider GoldenSource, said the fine shows that there can be no hiding bad practices.
Greater Scrutiny
“Citigroup’s fine should be a warning to other banks and institutions who may have believed their insufficient data and risk controls could fly under the radar,” Browne told Data Management Insight. “It’s time to adapt, or be forced to pay up.”
Financial institutions’ data management structures are likely to come under greater regulatory scrutiny to protect customers as more of their activities are digitalised, artificial intelligence is incorporated into their technology systems and crypto finance gains wider acceptance.
As well as data privacy protection measures, organisations will be expected to tighten controls on many other data domains including trading information and ESG reporting. The fallout from the collapse of Silicon Valley Bank last year will also put pressure on lenders’ solvency requirements and crisis management, processes that are heavily data-dependent.
Data Care
Browne said the penalty imposed on Citigroup showed that institutions must take greater care with their data and control models because regulators are acutely aware of how important digital information is to the efficient running of every part of an enterprise’s operations.
“This fining of Citigroup demonstrates the very real costs associated with banks not being on top of their risk controls and data management,” he said.
“It’s a bold statement from the US rule makers that banks showing complacency about their data issues will be met with regulatory action. Regulators globally are now coming to the understanding that it’s fundamental that financial institutions have effective data management strategies.”
While breaches of Europe’s General Data Protection Regulation (GDPR) and anti-money laundering rules have already been at the root of fines imposed on banks and financial services firms, penalties related to operational use of data are expected to grow.
For example, institutions interviewed by A-Team Group have regularly said they are closely examining the data privacy and IP implications of using outputs from generative AI applications. Their concern is that the generated content will be in breach of copyright on material on which the model has been trained.
Non-Negotiable
Browne’s comments were echoed by the founder and chief executive of Monte Carlo Data, Barr Moses, who said that as data needs become central to firms’ operations, “data quality becomes non-negotiable”.
“In 2024 data quality isn’t open for discussion — it’s a clear and present risk and it needs our attention,” Moses wrote on LinkedIn.
Browne said that ensuring compliance will require strenuous efforts by organisations to go deep into their data capabilities and processes.
“Data quality and accessibility are, rightly, front of mind, however, it’s also vital that banks consider concepts like data governance and data lineage when assessing the efficiency of their systems and adequately managing their risk. Being able to track data back to source is an important tool that rule makers are increasingly looking to demand of banks, visible in regulations like the ECB’s Risk Data Aggregation and Risk Reporting (RDARR) measures.”