
US Regulatory Reporting: The Data Management Response


Given the rapid pace of change in the US regulatory reporting landscape in recent years, the banking sector is increasingly having to focus on its data management capabilities in order to meet regulators’ requirements effectively. At A-Team Group’s Data Management Summit USA virtual conference last month, Kenneth Lamar, Principal Partner at Lamar Associates, moderated a panel of leading practitioners in this space to discuss the evolving nature of the data management challenge and to offer some practical solutions for addressing the growing regulatory reporting burden. Sharing their insights on the panel were: Malavika Solanki, Member of the Management Team, Derivatives Service Bureau (DSB); Ling Yin, Managing Director, Regulatory Reporting Policy & Implementation at JP Morgan Chase; Joshua Beaton, Head of Americas Trade & Transaction Reporting at Morgan Stanley; Scott Preiss, Global Head and Managing Director, CUSIP Global Services; and Matt Helmrath, Data Governance Consulting Manager at OneTrust.

Opening the discussion, one panellist outlined what they believe are the four key challenges and opportunities for data management. The first and foremost, in their opinion, is the substantial increase in the scope, complexity and number of reports since the 2008 financial crisis. To highlight the scale of these changes, they explained that prior to 2008, routine banking reports were estimated to take the industry about 500,000 hours a year to complete. Since then, a number of new reports have been added, including the CCAR stress testing report, the GSIB report and the FR 2052a.

“All these reports have increased the reporting burden to the banking industry by about two million hours annually,” they said, adding that this estimate was made by the Federal Reserve and is actually a significant underestimate of the hours involved. “The reason for the underestimate is that it did not really take into consideration the size and complexity of the banks, while based on our conversations with the industry, the actual number of hours involved is around five to ten times higher.”
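For illustration only, the short calculation below puts the approximate figures cited above side by side. The five-to-ten-times multiplier is applied here to the Federal Reserve’s added-hours estimate, which is one possible reading of the panellist’s comment rather than a figure stated on the panel.

```python
# Illustrative arithmetic only, using the approximate figures cited on the panel.
baseline_hours = 500_000      # estimated annual industry hours for routine reports pre-2008
added_hours_fed = 2_000_000   # Federal Reserve estimate of hours added by newer reports

print(f"Fed-based annual total: {baseline_hours + added_hours_fed:,} hours")

# One possible reading of "five to ten times higher": scale the added-hours estimate.
for multiplier in (5, 10):
    implied_total = baseline_hours + added_hours_fed * multiplier
    print(f"Implied annual total at {multiplier}x: {implied_total:,} hours")
```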

The second key challenge highlighted was the frequency with which existing rules and regulations change, while the third was that many of the newer reports are very granular in nature, which means the data management process also needs to operate at that granular level to ensure the data is fit for purpose. Last but not least, they explained that there is a challenge, but also an opportunity, in the higher expectations around data quality.

Asked by Lamar to list the top one or two regulatory requirements or expectations that they considered to be a priority in terms of impact in 2021, one panellist said they weren’t sure there were any more impactful regulatory expectations than in the area of climate-specific risks and ESG. “Many of course see ESG moving away from a ‘do good’ activity to a key driver of risk, value and opportunity,” they said. “The disruptions that affected all of us in 2020 and 2021 will forever reshape the financial industry and today, financial services regulators have the jurisdictional authority needed to set forth supervisory expectations for addressing financial climate-related risks and also ESG risks more generally.”

Lamar noted questions coming in from attendees asking the panel about the best way of mapping data back to its sources, covering the complete end-to-end process, and how best to achieve that in a complex organisation. One of the leading experts answered that they are seeing a lot of organisations now starting to think about a data acquisition and data disposition framework relevant to their business. They added that organisations are also ingesting a continued volume and variance of information that they are not used to. This means institutions need to find an automated way of constantly classifying and attesting to what data exists in those environments against the specific control frameworks used to guide the business.

“It’s not a single effort anymore, it’s not siloed, there are several different people that need to be involved,” the panellist said. “Ultimately, the customers that we work with tend to see using artificial intelligence and language processing technologies as a way to take manual assessments and manual efforts out of the equation and start to effectively answer data-driven questions with data-driven technology.” In addition, banks need to find a way to do this that reduces the strain on their business and creates an effective path towards quality data lineage.
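As a rough illustration of what such automated classification and attestation might look like in practice, the Python sketch below maps ingested field names against a hypothetical control framework and flags anything it cannot classify for human review. The framework categories, patterns and field names are invented for the example and are not drawn from any panellist’s firm or product.

```python
import re

# Hypothetical control framework: category -> patterns that suggest a field belongs to it.
CONTROL_FRAMEWORK = {
    "client_identifier": [r"\blei\b", r"client[_ ]id", r"account[_ ]number"],
    "instrument_identifier": [r"\bisin\b", r"\bcusip\b", r"\bupi\b"],
    "transaction_detail": [r"trade[_ ]date", r"notional", r"price", r"quantity"],
}

def classify_field(field_name: str) -> list[str]:
    """Return the control categories whose patterns match the field name."""
    name = field_name.lower()
    return [cat for cat, patterns in CONTROL_FRAMEWORK.items()
            if any(re.search(p, name) for p in patterns)]

def attest(fields: list[str]) -> dict[str, list[str]]:
    """Build a simple attestation record: every ingested field and the categories it maps to.
    Unmapped fields are flagged for human review rather than silently passed."""
    return {f: (classify_field(f) or ["UNMAPPED - review"]) for f in fields}

if __name__ == "__main__":
    ingested = ["client_id", "ISIN", "trade_date", "free_text_comment"]
    for field, categories in attest(ingested).items():
        print(f"{field:20s} -> {categories}")
```

In a real environment the keyword rules would be replaced or supplemented by the kind of language-processing models the panellist describes, but the attestation output, a record of what data exists and which controls it maps to, is the part that removes the manual effort.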

Lamar said that he would also go one step further and mention the need for cultural change at the end of the process. Another panellist agreed, adding that they liked the audience question a lot because, in their view, the questioner was really asking why, once the infrastructure and framework have been developed, firms are still constantly having problems when the regulators look at the reports they produce. The panellist warned that if a company wants that end-to-end data quality, it also has to consider its investment in training and quality assurance. “The reason is because data is only as good as the person who enters it into the system,” they added.

Turning to the results of the audience poll, Lamar noted that when attendees were asked what they considered to be the biggest data management challenge, legacy systems and processes were by far the biggest concern. A further panellist said they were not surprised by that result, adding that the undercurrent to all these data management discussions is how to find a new, or at least a revised, way of doing things. The legacy systems and processes are no longer fit for purpose, they said, which also leads to more manual processes, another challenge that ranked highly in the poll results.

The industry expert noted that it was interesting that poor data quality was voted a key challenge by only 8% of attendees, but said they expected this to rise over time as the unknown unknowns of poor data quality come into focus. “As systems get renovated, new processes built and new controls are specifically created around those data elements which we now need to report or will need to report, we will become increasingly aware of just how good or how poor our data quality is. And then we’ll need to take steps to rectify that,” they said.

Turning to the issue of how to ensure data quality is maintained consistently across regulatory reporting functions, another panellist said the notion of data quality at source is particularly interesting, for two key reasons. The first consideration, they explained, is that once a business is plugged into a data source, it needs to be able to keep pace with regulatory requirements or evolving market needs as that source evolves. They added that businesses are increasingly likely to also be plugged into the communication and information coming from that source, which could be significant given the current environment. The second consideration is consistency itself, the panellist said. “It is about making sure that you’re thinking about data quality both at source but also in terms of data alignment,” they added.
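To make those two ideas concrete, the sketch below shows one minimal way a firm might check quality at source (completeness and basic format) and alignment of the same attributes between a trade-capture system and a reporting feed. The field names, the structural ISIN check and the sample records are illustrative assumptions, not anything described by the panel.

```python
import re

# Structural ISIN shape only (2 letters, 9 alphanumerics, 1 check digit); no checksum validation.
ISIN_PATTERN = re.compile(r"^[A-Z]{2}[A-Z0-9]{9}[0-9]$")

def quality_at_source(record: dict) -> list[str]:
    """Flag missing or malformed values as close to the point of capture as possible."""
    issues = []
    for required in ("isin", "notional", "trade_date"):
        if not record.get(required):
            issues.append(f"missing {required}")
    isin = record.get("isin", "")
    if isin and not ISIN_PATTERN.match(isin):
        issues.append(f"malformed ISIN: {isin}")
    return issues

def alignment(trade_system: dict, reporting_feed: dict, keys=("isin", "notional")) -> list[str]:
    """Check that the same attributes agree between the source system and the reporting feed."""
    return [f"{k} mismatch: {trade_system.get(k)!r} vs {reporting_feed.get(k)!r}"
            for k in keys if trade_system.get(k) != reporting_feed.get(k)]

if __name__ == "__main__":
    source = {"isin": "US0378331005", "notional": 1_000_000, "trade_date": "2021-09-23"}
    feed = {"isin": "US0378331005", "notional": 990_000}
    print("quality at source:", quality_at_source(source) or "OK")
    print("alignment:", alignment(source, feed) or "OK")
```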

Going back to the poll results, the panellist warned that many firms were attempting to do their regulatory reporting from the bottom end of their workflows. Yet because reporting requirements increasingly call for data elements that are normally only captured at the point of trade, firms that have evolved their systems to capture data at the pre-trade point of contact are much better positioned, they said.

To round out the panel, one of the industry experts noted that the key question should really be: how can all these data management issues be addressed? There is a simple answer, they said, but it will not prove popular, because the answer will always ultimately be an investment in both time and money. “These are not simple questions,” they added. “These are cultural issues, these are complex, systemic problems within firms that are really, really, really challenging to get your head around. And I certainly feel that if you don’t spend the money now to get it right, you’re going to spend the money later.” In addition, they pointed to the recent high-profile run of regulatory fines handed out by the CFTC to leading institutions for reporting failings, many of them involving very significant sums. “Our regulators are very active in this space and it’s a worthwhile, but painful, investment upfront so that you don’t suffer those consequences later,” they concluded.

