
Words of Wisdom


Last week’s JWG event on what should be on data managers’ radars for next year in terms of regulatory change highlighted the significant correlation between a firm’s overall data quality and its risk management function (a subject I have been focused on for some time). Although the discussion was held under the Chatham House Rule (and hence comments will not be attributed), a lot of interesting points were made, including the importance of the trend towards simultaneous stress testing and its impact on latency within data management systems.

Previously, risk management was a function that largely happened after the fact, but speakers noted that it has since become much more sophisticated, encompassing new data sets and models. Developments such as Basel II have shifted risk management from its traditional asset view of risk to a liability view, which has made the related data story much more challenging.

Panellists agreed that the risk management industry has changed significantly over the last 10 years: whereas before a lot of the data for value at risk (VaR) calculations could be sourced from front office systems, data is now needed from multiple different systems in order to carry out complex scenario tests. For example, risk systems may need data on turnover from accounting systems.
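
To put the contrast in concrete terms, a basic historical VaR figure needs little more than a time series of daily P&L, exactly the sort of data a single front office system holds. The minimal Python sketch below (with simulated P&L standing in for real position data, so all figures are illustrative assumptions) shows how self-contained that older style of calculation is compared with today’s multi-system scenario tests:

```python
import numpy as np

def historical_var(pnl_history, confidence=0.99):
    """One-day historical VaR: the loss at the chosen percentile of
    the observed daily P&L distribution."""
    losses = -np.asarray(pnl_history)   # flip sign so losses are positive
    return np.percentile(losses, confidence * 100)

# Illustrative only: simulated daily P&L stands in for a real front office feed.
rng = np.random.default_rng(42)
pnl = rng.normal(loc=0.0, scale=1_000_000, size=500)
print(f"99% one-day VaR: {historical_var(pnl):,.0f}")
```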

Moreover, firms can no longer rely on a basic set of risk models: they must now take into account advanced methods such as scenario testing and Monte Carlo simulations in order to meet the requirements of regulators and the business. This, in turn, has increased the importance of the firm’s underlying reference data, and of its granularity and accuracy. Data validation processes have therefore risen in importance, as has the requirement for a solid data infrastructure and the computational resources to carry out all this number crunching.
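
As a rough illustration of why these methods multiply the compute and data burden, the sketch below shows a deliberately simplified single-asset Monte Carlo VaR in Python (the drift, volatility and horizon are assumed parameters, not figures from the event). A production system would instead revalue an entire portfolio across many thousands of correlated risk factor paths, which is exactly what drives the demand for granular, accurate reference data and serious computational resources:

```python
import numpy as np

def monte_carlo_var(spot, mu, sigma, horizon_days,
                    n_paths=100_000, confidence=0.99, seed=7):
    """Monte Carlo VaR for a single position: simulate terminal prices
    under geometric Brownian motion, then read the loss percentile."""
    rng = np.random.default_rng(seed)
    dt = horizon_days / 252.0                     # horizon as a year fraction
    z = rng.standard_normal(n_paths)              # one random draw per path
    terminal = spot * np.exp((mu - 0.5 * sigma**2) * dt
                             + sigma * np.sqrt(dt) * z)
    losses = spot - terminal                      # positive values are losses
    return np.percentile(losses, confidence * 100)

# Assumed parameters for illustration: spot 100, 5% drift, 20% vol, 10-day horizon.
print(f"99% 10-day VaR per unit: {monte_carlo_var(100.0, 0.05, 0.20, 10):.2f}")
```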

Technology is not the answer to a data manager’s woes, however, and there are no quick fixes for data management, agreed panellists, because data has multiple users across an organisation. Hence a ‘one size fits all’ approach to data standards or technology (as some regulators seem keen to introduce) is not desirable. Some of the challenges within the data management function also stem from market activity such as M&A, where legacy infrastructures are layered on top of each other and are hard to erase (a theme every Reference Data Review reader is no doubt well aware of).

However, changes are afoot, agreed panellists. Historically, data has been seen as an IT problem, but responsibility for it has now risen to executive level. Chief data officers (CDOs) are there to take responsibility for ensuring a high level of data quality (even if they are mere scapegoats at some institutions at the moment). To make a difference, these CDOs need to take stock of who is using which data for what purpose and lead the cultural change required for the business to take ownership of and responsibility for that data, agreed speakers. Data is therefore not so much a technical challenge as a management one.

Although firms need to be wary of regulatory scrutiny going forward, there is likely no desire within the regulatory community to replicate what is going on within firms’ own risk functions. Regulators will therefore not be carrying out risk management modelling to the same level as firms do internally, but will instead seek to drill down into individual reports to the transaction level in order to stress test the raw data when there is a potential issue. The intention is likely to be to conduct independent scenario testing that firms may not want to run themselves, for example by running worst case scenarios.

This will all have a big impact on the middle and back office because firms will need to have in place the infrastructure to support the stress testing of different types of risk at the same time. Simultaneous stress testing is therefore likely to represent a true game changer for the risk function and it will increase the importance of timeliness in the data management context. Real-time back office data could be a reality in future (after all, it has been a common theme at data management conferences throughout the course of this year).
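
One way to picture that infrastructure requirement is a book being revalued under several stress scenarios at once rather than one after another. The toy Python sketch below fans hypothetical scenarios out across worker processes (the scenario names, shocks and exposure figures are all invented for illustration); the hard part panellists pointed to is not the fan-out itself but feeding each revaluation with timely, consistent middle and back office data:

```python
from concurrent.futures import ProcessPoolExecutor

# Hypothetical scenarios: every name and shock below is invented for illustration.
SCENARIOS = {
    "equity_crash": {"equity": -0.30, "rates_bp": 0},
    "rates_spike":  {"equity": -0.05, "rates_bp": 200},
    "combined":     {"equity": -0.20, "rates_bp": 150},
}

def revalue(item):
    """Stand-in for a full portfolio revaluation under one scenario.
    In practice this step pulls positions and reference data on demand,
    which is where latency in the data layer starts to bite."""
    name, shocks = item
    base_equity_exposure = 50_000_000   # assumed equity exposure
    rates_dv01 = 25_000                 # assumed P&L per basis point rate move
    pnl = (base_equity_exposure * shocks["equity"]
           - rates_dv01 * shocks["rates_bp"])
    return name, pnl

if __name__ == "__main__":
    # Fan the scenarios out across worker processes so they run at the same time.
    with ProcessPoolExecutor() as pool:
        for name, pnl in pool.map(revalue, SCENARIOS.items()):
            print(f"{name:>13}: {pnl:,.0f}")
```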

This means the risk management and data challenge will not just be about increased frequency of reporting; it is also about supporting simultaneous risk modelling and providing higher levels of data granularity, all of which brings the issues of latency and accuracy to the fore.

We intend to discuss these issues and many more at our Data Management for Risk and Valuations events in New York and London next year. At the moment, I’m pulling together our programme topics and any feedback is most welcome.
