The leading knowledge platform for the financial technology industry

A-Team Insight Blogs

A-Team Group Webinar Discusses Regulation and Risk as Data Management Drivers

Increasing regulation of financial markets, including current regulations such as Dodd-Frank, EMIR and FATCA, and forthcoming regulations such as Solvency II and BCBS 239, is forcing many firms to change their data management processes in order to achieve compliance. Offering information and advice to data management practitioners tackling the demands of regulation, this week’s A-Team Group webinar, Regulation and Risk as Data Management Drivers, considered the requirements of a number of regulations and how firms might organise data management infrastructure to comply with them.

Sarah Underwood, A-Team Group editor, moderated the webinar and set the scene, noting the ongoing data management demands of current regulations and the more extensive risk aggregation and reporting requirements of upcoming regulations. With a lack of clarity in some forthcoming rules and the growing need for guidance in this area, she questioned how firms will be ready to comply with some of these regulations as they take effect.

Tim Lind, head of Pricing and Reference Services at Thomson Reuters, described the increasing difficulty of handling the sheer volume of rules and regulations that are in the market, especially given the frequent lack of clarity in regulatory documentation. He said: “The challenge we have in the data management profession right now is to cut through the jargon and platitudes. There are a million new acronyms and a million new rules that cause the re-evaluation of securities records and the onboarding of more data. With each disclosure rule, the securities in a client record grow in terms of fields and attributes. These are some of the high-level tasks we are facing now, drilling into the details, the definitions and the rules.”

After assessing the scale of regulation, Underwood turned attention to the specific challenges associated with Basel III and BCBS 239. Marty Williams, vice president of Reference Data Product Development at Interactive Data, explained that regulations like these provide a blueprint for firms to follow and compare with internal processes to ensure they are on the right track. He added: “BCBS 239 in particular puts senior management in the game by saying it needs to have oversight of how data aggregation and IT systems are set up and operated, and that it must be able to ensure it can provide the information regulators require.”

Concurring with Williams, Cristiano Zazzara, senior director – global head of Portfolio Risk Solutions and EMEA head of Application Specialists at S&P Capital IQ, said more guidance along the lines of Basel Committee documents would be welcome in future to support regulatory compliance programmes. He also noted that the set of principles for effective risk data management laid down in BCBS 239 provides a strong template for what a risk framework should look like and makes clear what regulators expect firms to deliver.

Expanding on how to manage data for BCBS 239, Lind explained that centralised oversight and deciding who is responsible for data quality are going to be essential to ensure effective data governance. In order to ensure that these risk management efforts are effective for both management and regulators, he added that firms will need to implement an iterative process in which reporting can be tested for accuracy, completeness, timeliness and overall usefulness. He said: “This is an evolution in the fundamental strategy that has been around in the data management profession for about 20 years now. Core to the data governance function is supervisory oversight, and that defines who owns the data, how it is remediated and ultimately how different business functions and regions will cooperate to break down silos so that data can be aggregated. Of course, it is easier said than done, it is a huge challenge.”

Returning to regulations that are already in effect, Underwood asked the webinar participants how these regulations will affect data management in the long term. While the participants agreed that the interplay of various regulations adds another layer of complexity and can increase related data technology costs, Zazzara stressed that these regulations should be seen as improving business infrastructure rather than as an obstacle to overcome. He explained: “The reference data system is essential for all companies, not only for regulatory requirements, but also for business practices. Complying with regulations will advance internal business processes and make them more profitable. This is a very relevant point, complying with regulation needs to be seen as value creation and not just as a regulatory burden.”

Considering sector specific regulations, particularly AIFMD and Solvency II, and how firms can tackle their data management challenges, the participants agreed that understanding the terminology and looking for common features across the regulations is essential to obtaining the data required to make classifications and disclosures in line with the regulations’ requirements. Williams commented: “While these are specific regulations, they can and should come back to a firm’s overarching principles and operating framework. There are common threads in a lot of these regulations, like how we will identify an entity or security, and how we will link that information across an entire database so that we can report on a macro level to regulators.”

Concluding the webinar with a discussion on the best steps to take moving forward, Underwood asked participants what advice they would give to companies looking to improve their data management operations. The participants said regulation will continue to drive demand for a variety of risk, compliance and data management solutions, and that firms struggling to meet regulatory obligations might want to consider outsourcing to manage costs.

Williams said firms need to have a keen understanding of what regulators are trying to achieve and get a clear understanding of their infrastructure and exposures rather than getting lost in thousands of pages of new regulations and their associated burden.

Lind concluded: “There is always a common framework and process to address new rules. For instance, there will always be a supervisory review and remedial actions, IT infrastructure and governance decisions, a reporting practice, and risk data aggregation that requires data to be accurate, complete and timely. Looking at these four components, regardless of new regulation, this is where you need to be making your decisions.”
