About A-Team Marketing Services
The knowledge platform for the financial technology industry

A-Team Insight Blogs

A-Team Group Webinar Discusses Regulation and Risk as Data Management Drivers


Increasing regulation of financial markets, including current regulations such as Dodd-Frank, EMIR and FATCA, and forthcoming regulations such as Solvency II and BCBS 239, is forcing many firms to change their data management processes in order to achieve compliance. Offering information and advice to data management practitioners tackling the demands of regulation, this week’s A-Team Group webinar, Regulation and Risk as Data Management Drivers, considered the requirements of a number of regulations and how firms might organise data management infrastructure to comply with them.

Sarah Underwood, A-Team Group editor, moderated the webinar and set the scene, noting the ongoing data management demands of current regulations and the more extensive risk aggregation and reporting requirements of upcoming regulations. With a lack of clarity in some forthcoming rules and a growing need for guidance in this area, she questioned whether firms will be ready to comply with some of these regulations as they take effect.

Tim Lind, head of Pricing and Reference Services at Thomson Reuters, described the increasing difficulty of handling the sheer volume of rules and regulations that are in the market, especially given the frequent lack of clarity in regulatory documentation. He said: “The challenge we have in the data management profession right now is to cut through the jargon and platitudes. There are a million new acronyms and a million new rules that cause the re-evaluation of securities records and the onboarding of more data. With each disclosure rule, the securities in a client record grow in terms of fields and attributes. These are some of the high-level tasks we are facing now, drilling into the details, the definitions and the rules.”

After assessing the scale of regulation, Underwood turned attention to the specific challenges associated with Basel III and BCBS 239. Marty Williams, vice president of Reference Data Product Development at Interactive Data, explained that regulations like these provide a blueprint for firms to follow and compare with internal processes to ensure they are on the right track. He added: “BCBS 239 in particular puts senior management in the game by saying it needs to have oversight of how data aggregation and IT systems are set up and operated, and that it must be able to ensure it can provide the information regulators require.”

Concurring with Williams, Cristiano Zazzara, senior director – global head of Portfolio Risk Solutions and EMEA head of Application Specialists at S&P Capital IQ, said more guidance along the lines of Basel Committee documents would be welcome in future to support regulatory compliance programmes. He also noted that the set of principles for effective risk data management set down in BCBS 239 makes a great template for what a risk framework should look like and sets out what regulators are expecting firms to deliver.

Expanding on how to manage data for BCBS 239, Lind explained that centralised oversight and deciding who is responsible for data quality are going to be essential to ensure effective data governance. In order to ensure that these risk management efforts are effective for both management and regulators, he added that firms will need to implement an iterative process in which reporting can be tested for accuracy, completeness, timeliness and overall usefulness. He said: “This is an evolution in the fundamental strategy that has been around in the data management profession for about 20 years now. Core to the data governance function is supervisory oversight, and that defines who owns the data, how it is remediated and ultimately how different business functions and regions will cooperate to break down silos so that data can be aggregated. Of course, it is easier said than done, it is a huge challenge.”

Returning to regulations that are already in effect, Underwood asked the webinar participants how these regulations will affect data management in the long term. While the participants agreed that the interplay of various regulations adds another layer of complexity and can increase related data technology costs, Zazzara stressed that these regulations should be seen as improving business infrastructure rather than as an obstacle to overcome. He explained: “The reference data system is essential for all companies, not only for regulatory requirements, but also for business practices. Complying with regulations will advance internal business processes and make them more profitable. This is a very relevant point, complying with regulation needs to be seen as value creation and not just as a regulatory burden.”

Considering sector-specific regulations, particularly AIFMD and Solvency II, and how firms can tackle their data management challenges, the participants agreed that understanding the terminology and looking for common features across the regulations is essential to obtaining the data required to make classifications and disclosures in line with the regulations’ requirements. Williams commented: “While these are specific regulations, they can and should come back to a firm’s overarching principles and operating framework. There are common threads in a lot of these regulations, like how we will identify an entity or security, and how we will link that information across an entire database so that we can report on a macro level to regulators.”

Concluding the webinar with a discussion on the best steps to take moving forward, Underwood asked participants what advice they would give to companies looking to improve their data management operations. The participants said regulation will continue to drive demand for a variety of risk, compliance and data management solutions, and that firms struggling to meet regulatory obligations might want to consider outsourcing to manage costs.

Williams said firms need to have a keen understanding of what regulators are trying to achieve and get a clear understanding of their infrastructure and exposures rather than getting lost in thousands of pages of new regulations and their associated burden.

Lind concluded: “There is always a common framework and process to address new rules. For instance, there will always be a supervisory review and remedial actions, IT infrastructure and governance decisions, a reporting practice, and risk data aggregation that requires data to be accurate, complete and timely. Looking at these four components, regardless of new regulation, this is where you need to be making your decisions.”


Related content

WEBINAR

Upcoming Webinar: Best Practices for Managing Trade Surveillance

1 July 2025 | 10:00am ET | 3:00pm London | 4:00pm CET | Duration: 50 minutes. The surge in trading volumes combined with the emergence of new digital financial assets and geopolitical events have added layers of complexity to market activities. Traditional surveillance methods often struggle to keep pace with these changes, leading to difficulties in detecting...

BLOG

Scalability the Keyword Behind S&P Global’s Enriched iLEVEL

S&P Global Market Intelligence’s update to its iLEVEL private markets data tool has been designed to enable firms to scale their engagements in private markets. The financial data company is betting that financial institutions’ growing engagement in these markets is such that they will need the sort of data provisions associated with public markets. S&P...

EVENT

RegTech Summit New York

Now in its 9th year, the RegTech Summit in New York will bring together the RegTech ecosystem to explore how the North American capital markets industry can leverage technology to drive innovation, cut costs and support regulatory change.

GUIDE

AI in Capital Markets: Practical Insight for a Transforming Industry – Free Handbook

AI is no longer on the horizon – it’s embedded in the infrastructure of modern capital markets. But separating real impact from inflated promises requires a grounded, practical understanding. The AI in Capital Markets Handbook 2025 provides exactly that. Designed for data-driven professionals across the trade life-cycle, compliance, infrastructure, and strategy, this handbook goes beyond...