
Words of Wisdom

Last week’s JWG event on what should be on data managers’ radars for regulatory change next year highlighted the strong correlation between a firm’s overall data quality and its risk management function (a subject I have been focused on for some time). Although the discussion was held under the Chatham House Rule (and hence comments will not be attributed), a lot of interesting points were made, including the importance of the trend towards simultaneous stress testing and its impact on latency within data management systems.

Previously, risk management was a function that largely happened after the fact, but speakers noted that it is now a much more sophisticated function encompassing new data sets and models. Developments such as Basel II have shifted risk management from a traditional asset view of risk towards a liability view, which has made the related data story much more challenging.

Panellists agreed that the risk management industry has changed significantly over the last 10 years: whereas much of the data for value at risk (VaR) calculations could previously be sourced from front office systems, data is now needed from multiple systems in order to carry out complex scenario tests. For example, risk systems may need data on turnover from accounting systems.

Moreover, firms can no longer rely on a basic set of risk models; they must now employ advanced methods such as scenario testing and Monte Carlo simulation in order to meet the requirements of regulators and the business. This, in turn, has increased the importance of the firm’s underlying reference data, and of its granularity and accuracy. Data validation processes have therefore risen in importance, as has the requirement for a solid data infrastructure and the computational resources to carry out all this number crunching.
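
To make the number-crunching point concrete, here is a minimal Monte Carlo VaR sketch in Python. The portfolio, return and covariance figures are invented for illustration, and the multivariate normal return model is a deliberate simplification rather than anything a production risk engine would settle for:

```python
import numpy as np

def monte_carlo_var(position_values, mean_returns, cov_matrix,
                    confidence=0.99, horizon_days=1, n_sims=100_000, seed=0):
    """Estimate portfolio value at risk (VaR) by Monte Carlo simulation.

    Assumes daily returns follow a multivariate normal distribution --
    a simplifying assumption made purely for illustration.
    """
    rng = np.random.default_rng(seed)
    # Draw correlated returns for each instrument over the horizon
    simulated = rng.multivariate_normal(mean_returns * horizon_days,
                                        cov_matrix * horizon_days,
                                        size=n_sims)
    # Portfolio P&L in each simulated scenario
    pnl = simulated @ position_values
    # VaR is the loss at the chosen confidence level
    return -np.percentile(pnl, 100.0 * (1.0 - confidence))

# Hypothetical three-instrument portfolio; all numbers are invented
positions = np.array([1_000_000.0, 500_000.0, 250_000.0])  # market values
means = np.array([0.0002, 0.0001, 0.0003])                 # daily mean returns
cov = np.array([[1.0e-4, 2.0e-5, 1.0e-5],
                [2.0e-5, 4.0e-5, 5.0e-6],
                [1.0e-5, 5.0e-6, 9.0e-5]])                 # daily covariance

print(f"1-day 99% VaR: {monte_carlo_var(positions, means, cov):,.0f}")
```

Even a toy like this makes the data dependencies visible: the result is only as good as the position values and covariance inputs, which is precisely where reference data quality bites.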

Technology is not the answer to a data manager’s woes, however, and there are no quick fixes for data management, panellists agreed, as there are multiple users of the data across an organisation. A ‘one size fits all’ approach to data standards or technology (which some regulators seem keen to introduce) is therefore not desirable. Some of the challenges within the data management function also stem from market activity such as M&A, where legacy infrastructures are layered on top of each other and are hard to strip out (a theme every Reference Data Review reader is no doubt well apprised of).

However, changes are afoot, panellists agreed. Historically, data has been seen as an IT problem, but it has now risen to executive level. Chief data officers (CDOs) are there to take responsibility for delivering a high level of data quality (even if they are mere scapegoats at some institutions at the moment). In order to make a difference, these CDOs need to take stock of who is using which data for what purpose, and lead the cultural change required for the business to take ownership of and responsibility for that data, speakers agreed. Data is therefore not primarily a technical challenge; it is largely a management one.

Although firms need to be wary of regulatory scrutiny going forward, there is likely no desire within the regulatory community to replicate what goes on within firms’ own risk functions. Regulators will therefore not carry out risk management modelling to the same level as firms do internally, but will instead seek to drill down into individual reports to the transaction level in order to stress test that raw data when there is a potential issue. The intention is likely to be to conduct independent scenario testing that firms may not want to run themselves, for example worst case scenarios.

This will all have a big impact on the middle and back office because firms will need to have in place the infrastructure to support the stress testing of different types of risk at the same time. Simultaneous stress testing is therefore likely to represent a true game changer for the risk function and it will increase the importance of timeliness in the data management context. Real-time back office data could be a reality in future (after all, it has been a common theme at data management conferences throughout the course of this year).
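
As a toy illustration of what ‘simultaneous’ means in practice, the sketch below fans several stress scenarios out in parallel against a single position snapshot. The scenario names, shocks and linear revaluation are all hypothetical; the point is that every scenario demands a consistent, timely view of the same underlying data:

```python
from concurrent.futures import ProcessPoolExecutor

# Hypothetical scenarios: shocks applied per risk factor (illustrative only)
SCENARIOS = {
    "equity_crash":      {"equity": -0.30, "credit_spread": 0.010, "fx": 0.00},
    "rates_shock":       {"equity": -0.05, "credit_spread": 0.005, "fx": 0.02},
    "liquidity_squeeze": {"equity": -0.15, "credit_spread": 0.020, "fx": 0.05},
}

def run_scenario(name, shocks, exposures):
    """Revalue a toy portfolio under one stress scenario.

    Uses simplistic linear revaluation (exposure x shock per risk factor);
    a real engine would fully reprice against a consistent data snapshot.
    """
    pnl = sum(exposures.get(factor, 0.0) * shock
              for factor, shock in shocks.items())
    return name, pnl

def run_all(exposures):
    """Run every scenario in parallel against the same position snapshot --
    the 'simultaneous' part that puts pressure on latency and consistency."""
    with ProcessPoolExecutor() as pool:
        futures = [pool.submit(run_scenario, name, shocks, exposures)
                   for name, shocks in SCENARIOS.items()]
        return dict(f.result() for f in futures)

if __name__ == "__main__":
    # Illustrative exposures by risk factor, in market value terms
    snapshot = {"equity": 2_000_000, "credit_spread": -5_000_000, "fx": 750_000}
    for name, pnl in run_all(snapshot).items():
        print(f"{name:>18}: {pnl:+,.0f}")
```

The hard part in the middle and back office is not the arithmetic but delivering that single, accurate snapshot to every scenario at the same moment, which is why latency becomes a first-order concern.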

This means the risk management and data challenge will not just be about increased frequency of reporting; it is also about supporting simultaneous risk modelling and providing higher levels of data granularity, all of which brings the issues of latency and accuracy to the fore.

We intend to discuss these issues and many more at our Data Management for Risk and Valuations events in New York and London next year. I’m currently pulling together the programme topics, and any feedback is most welcome.
