Data Quality Takes Priority Over Speed in the Risk Management Challenge, Says Mizuho’s Tweddle

Speaking at this week’s A-Team Insight Exchange conference in London, Mizuho International’s risk management chief operating officer Simon Tweddle explained to delegates that although timeliness of data is becoming much more important from a risk management perspective, this must not come at the cost of data quality. “When talking about this area, one must recognise that on demand does not mean real-time and having the right information is much more important than speed of delivery,” he said.

There needs to be a great deal of transparency around the data on which risk management decisions are made, so that firms can articulate to regulators and other parties the limits of their risk calculations and conduct the relevant stress tests, Tweddle explained. It is not just about regulatory drivers, however; firms also want to be able to identify where their business is underperforming and in which areas they are making money, he added.

“Historically, a lot of investment has gone into technology for market risk calculations, but now counterparty risk, issuer risk and collateral management need to be higher up the agenda,” contended Tweddle, who has previously stressed the importance of data quality in the risk management endeavour.

Fellow panellist Amir Halfon, senior director of technology at Oracle Financial Services, added: “Enterprise data management is the dirty little secret at the heart of risk management. A lot of focus has previously been put on the compute function, which is essentially the heavy machinery for risk calculations, but integration of data silos is the new focus in the market.”

There is a question of latency in this endeavour, said Halfon, pointing to the need for much faster access than before to the data that sits within a firm’s data storage infrastructure. He suggested that faster access to this data could be provided by introducing traditionally front office focused, real-time technology into the space, such as flash storage and stacking of the application layer.

Indeed, this is not a new concept, and some firms have already begun to take advantage of front office technology in a back and middle office data context. See, for example, UBS’ use of Celoxica’s hardware appliance-based data feed handlers to process its market and reference data. By using technology traditionally deployed to power algorithmic trading, the bank was able to reduce the number of PCs on which its reference data is processed from 50 down to one, resulting in some rather attractive cost savings.

At the A-Team Insight Exchange event there was a great deal of discussion about the potential of this technology in this context. Halfon in particular was a keen proponent of the innovative use of what he called “architectural best practices” in the data management space. He suggested that loose coupling would be a desirable technology practice, along with a move from relational databases to distributed data grids in order to keep position and pricing data in memory.

Tweddle agreed that having the ability to keep this data in memory was desirable, as it would reduce the latency of dealing with it from a risk management perspective. Firms would therefore be able to reduce the need for duplicative processes during risk calculations, saving time and effort that could potentially make all the difference in a business context.
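By way of illustration only, the sketch below shows the access pattern the panellists describe: position and pricing data is loaded into memory once and then shared by several risk calculations, rather than each calculation re-querying a relational store. The data model, field names and calculations are hypothetical stand-ins, and a real deployment would sit on a distributed data grid product rather than a local Python dictionary.

```python
# Illustrative sketch of the in-memory approach described above.
# The data model and calculations are hypothetical; a production system
# would use a distributed data grid rather than a local dictionary.
from dataclasses import dataclass


@dataclass
class Position:
    instrument: str
    counterparty: str
    quantity: float


# Load positions and prices into memory once (in practice, from the firm's
# data stores), then let every risk calculation reuse the same snapshot
# instead of re-querying a relational database each time.
positions = [
    Position("BOND_A", "CPTY_1", 1_000),
    Position("BOND_B", "CPTY_2", -500),
    Position("BOND_A", "CPTY_2", 250),
]
prices = {"BOND_A": 101.5, "BOND_B": 98.2}  # keyed by instrument, held in memory


def gross_exposure() -> float:
    """Total absolute market value across all positions."""
    return sum(abs(p.quantity) * prices[p.instrument] for p in positions)


def counterparty_exposure() -> dict[str, float]:
    """Net market value by counterparty, reusing the same in-memory snapshot."""
    out: dict[str, float] = {}
    for p in positions:
        out[p.counterparty] = out.get(p.counterparty, 0.0) + p.quantity * prices[p.instrument]
    return out


if __name__ == "__main__":
    print(f"Gross exposure: {gross_exposure():,.0f}")
    print(f"By counterparty: {counterparty_exposure()}")
```

The point is the data access pattern rather than the numbers: once the snapshot sits in memory, adding a further risk measure costs only the calculation itself, with no duplicated trips back to the underlying database.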
