
Data Quality Takes Priority Over Speed in the Risk Management Challenge, Says Mizuho’s Tweddle

Speaking at this week’s A-Team Insight Exchange conference in London, Mizuho International’s risk management chief operating officer Simon Tweddle explained to delegates that although timeliness of data is becoming much more important from a risk management perspective, this must not come at the cost of data quality. “When talking about this area, one must recognise that on demand does not mean real-time and having the right information is much more important than speed of delivery,” he said.

There needs to be a great deal of transparency around the data on which risk management decisions are made, so that firms can articulate to regulators and other parties the limits of their risk calculations and conduct the relevant stress tests, Tweddle explained. It is not just about regulatory drivers, however; firms also want to be able to identify where their business is underperforming and in which areas they are making money, he added.

“Historically, a lot of investment has gone into technology for market risk calculations, but now counterparty risk, issuer risk and collateral management need to be higher up the agenda,” contended Tweddle, who has previously stressed the importance of data quality in the risk management endeavour.

Fellow panellist Amir Halfon, senior director of technology at Oracle Financial Services, added: “Enterprise data management is the dirty little secret at the heart of risk management. A lot of focus has previously been put on the compute function, which is essentially the heavy machinery for risk calculations, but integration of data silos is the new focus in the market.”

There is a question of latency in this endeavour, said Halfon, pointing to the need for much faster access than before to the data that sits within a firm's storage infrastructure. He suggested that this could be achieved by bringing traditionally front office focused, real-time technology into the space, such as flash storage and stacking of the application layer.

Indeed, this is not a new concept, and some firms have already begun to take advantage of front office technology in a back and middle office data context. See, for example, UBS' use of Celoxica's hardware appliance-based data feed handlers to process its market and reference data. By using technology traditionally deployed to power algorithmic trading, the bank was able to reduce the number of PCs on which reference data is processed from 50 to one, resulting in some rather attractive cost savings.

At the A-Team Insight Exchange event there was a great deal of discussion about the potential of this technology in this context. Halfon in particular was a keen proponent of the innovative use of what he called "architectural best practices" in the data management space. He suggested that loose coupling is a desirable practice, and accordingly advocated a move from relational databases to distributed data grids so that position and pricing data can be kept in memory.

Tweddle agreed that the ability to keep this data in memory would be desirable, as it would reduce the latency of dealing with this data from a risk management perspective. Firms would therefore be able to reduce the need for duplicative processes during risk calculations, saving time and effort that could make all the difference in a business context.
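To make the pattern concrete, the sketch below shows the keyed get/put style of access a distributed data grid typically exposes, with position and pricing data held in memory so a risk calculation never re-queries a relational database or re-runs a pricing job. This is a minimal illustration under assumed names; the class, keys and figures are hypothetical, not any vendor's API.

```python
from dataclasses import dataclass


@dataclass
class Position:
    instrument: str
    quantity: float


class InMemoryGrid:
    """Stand-in for a distributed data grid's keyed get/put interface."""

    def __init__(self) -> None:
        self._store = {}

    def put(self, key, value) -> None:
        self._store[key] = value

    def get(self, key):
        return self._store[key]


def exposure(grid: InMemoryGrid, instrument: str) -> float:
    # Both inputs are read from memory, so the calculation avoids a
    # duplicative database query or pricing run for each risk measure.
    position = grid.get(f"position:{instrument}")
    price = grid.get(f"price:{instrument}")
    return position.quantity * price


grid = InMemoryGrid()
grid.put("position:XYZ", Position("XYZ", 1_000.0))
grid.put("price:XYZ", 101.25)
print(exposure(grid, "XYZ"))  # 101250.0
```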
