About A-Team Marketing Services
The knowledge platform for the financial technology industry


Words of Wisdom


Last week’s JWG event on what should be on data managers’ radars for next year in terms of regulatory change highlighted the significant correlation between a firm’s overall data quality and its risk management function (a subject I have focused on for some time). Although the discussion was held under the Chatham House Rule (and hence comments will not be attributed), a lot of interesting points were made, including the importance of the trend towards simultaneous stress testing and its impact on latency within data management systems.

Previously, risk management was a function that largely happened after the fact, but speakers noted that it is now a much more sophisticated function encompassing new data sets and models. Developments such as Basel II shifted risk management from the traditional asset view of risk to a liability view, which has made the related data story much more challenging.

Panellists agreed that the risk management industry has changed significantly over the last 10 years: whereas before much of the data for value at risk (VaR) calculations could be sourced from front office systems, data is now needed from multiple different systems in order to carry out complex scenario tests. For example, risk systems may need data on turnover from accounting systems.

Moreover, firms can no longer rely on a basic set of risk models; they must now take into account advanced methods such as scenario testing and Monte Carlo simulations in order to meet the requirements of regulators and the business. This, in turn, has increased the importance of the firm’s underlying reference data, and of its granularity and accuracy. Data validation processes have therefore risen in importance, as has the requirement for a solid data infrastructure and the computational resources to carry out all this number crunching.
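To illustrate the kind of number crunching involved, here is a minimal Monte Carlo VaR sketch. The portfolio parameters, function name and normal-returns assumption are all hypothetical simplifications; production models draw on far richer reference and market data.

```python
import random

def monte_carlo_var(portfolio_value, mu, sigma,
                    confidence=0.99, n_sims=100_000, seed=42):
    """Estimate one-day value at risk via Monte Carlo simulation.

    Daily returns are modelled as normally distributed (a deliberate
    simplification); VaR is read off the simulated P&L distribution.
    """
    rng = random.Random(seed)
    # Simulate portfolio P&L under each random return draw, then sort.
    pnl = sorted(portfolio_value * rng.gauss(mu, sigma)
                 for _ in range(n_sims))
    # VaR is the loss at the (1 - confidence) quantile of the P&L tail.
    return -pnl[int((1 - confidence) * n_sims)]

# Hypothetical book: 100m portfolio, zero drift, 2% daily volatility.
var_99 = monte_carlo_var(100_000_000, mu=0.0, sigma=0.02)
print(f"99% one-day VaR: {var_99:,.0f}")
```

Even this toy version makes the data dependency clear: the result is only as good as the position values and volatility inputs fed into it.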

Technology is not the answer to a data manager’s woes, however, and there are no quick fixes for data management, agreed panellists, as there are multiple users of the data across an organisation. Hence a ‘one size fits all’ approach to data standards or technology (as some regulators seem keen to introduce) is not desirable. Some of the challenges within the data management function are also related to market activity such as M&A, where legacy infrastructures are layered on top of each other, and these are hard to erase (a theme that every Reference Data Review reader is no doubt well apprised of).

However, changes are afoot, agreed panellists. Historically, data has been seen as an IT problem, but it has now risen to the executive level. Chief data officers (CDOs) are there to take responsibility for delivering a high level of data quality (even if they are mere scapegoats at some institutions at the moment). In order to make a difference, these CDOs need to take stock of who is using data for what purpose and lead the cultural change required for the business to take ownership of and responsibility for that data, agreed speakers. Data is therefore not a technical challenge; it is largely a management one.

Although firms need to be wary of regulatory scrutiny going forward, there is likely no desire within the regulatory community to replicate what is going on within firms’ own risk functions. Regulators will therefore not carry out risk management modelling to the same level as firms do internally, but will instead seek to drill down into individual reports to the transaction level in order to stress test that raw data when there is a potential issue. The intention is likely to conduct independent scenario testing that firms may not want to run themselves, for example by running worst case scenarios.

This will all have a big impact on the middle and back office because firms will need to have in place the infrastructure to support the stress testing of different types of risk at the same time. Simultaneous stress testing is therefore likely to represent a true game changer for the risk function and it will increase the importance of timeliness in the data management context. Real-time back office data could be a reality in future (after all, it has been a common theme at data management conferences throughout the course of this year).
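The infrastructure point above can be sketched in a few lines: running several stress scenarios at the same time rather than sequentially. The scenario names and shock sizes below are invented for illustration; a real implementation would revalue full portfolios against a consistent reference data snapshot.

```python
from concurrent.futures import ThreadPoolExecutor

# Hypothetical stress scenarios: proportional shocks to a position's value.
SCENARIOS = {
    "equity_crash": -0.30,
    "rates_up_200bp": -0.08,
    "fx_shock": -0.12,
}

def apply_scenario(position_value, shock):
    """Revalue a single position under one stress shock."""
    return position_value * (1 + shock)

def run_simultaneous_stress(position_value):
    """Run all scenarios in parallel, as a sketch of simultaneous
    stress testing."""
    with ThreadPoolExecutor() as pool:
        futures = {name: pool.submit(apply_scenario, position_value, shock)
                   for name, shock in SCENARIOS.items()}
        return {name: f.result() for name, f in futures.items()}

results = run_simultaneous_stress(1_000_000)
print(results)
```

The hard part in practice is not the parallelism itself but ensuring every concurrent run sees the same, timely, accurate underlying data, which is exactly why latency moves up the agenda.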

This means the risk management and data challenge will not just be about increased frequency of reporting; it is also about supporting simultaneous risk modelling and providing higher levels of data granularity, all of which brings the issues of latency and accuracy to the fore.

We intend to discuss these issues and many more at our Data Management for Risk and Valuations events in New York and London next year. At the moment, I’m pulling together our programme topics, and any feedback is most welcome.

