

Data Quality Takes Priority Over Speed in the Risk Management Challenge, Says Mizuho’s Tweddle

Speaking at this week’s A-Team Insight Exchange conference in London, Mizuho International’s risk management chief operating officer Simon Tweddle explained to delegates that although timeliness of data is becoming much more important from a risk management perspective, this must not come at the cost of data quality. “When talking about this area, one must recognise that on demand does not mean real-time and having the right information is much more important than speed of delivery,” he said.

There needs to be a great deal of transparency around the data on which risk management decisions are made, so that firms can articulate to regulators and other parties the limits of their risk calculations and conduct the relevant stress tests, Tweddle explained. It is not just about regulatory drivers, however: firms also want to be able to identify where their business is underperforming and in which areas they are making money, he added.

“Historically, a lot of investment has gone into technology for market risk calculations, but now counterparty risk, issuer risk and collateral management need to be higher up the agenda,” contended Tweddle, who has previously stressed the importance of data quality in the risk management endeavour.

Fellow panellist Amir Halfon, senior director of technology at Oracle Financial Services, added: “Enterprise data management is the dirty little secret at the heart of risk management. A lot of focus has previously been put on the compute function, which is essentially the heavy machinery for risk calculations, but integration of data silos is the new focus in the market.”

There is a question of latency in this endeavour, said Halfon, pointing to the need for much faster access than before to the data that sits within a firm’s data storage infrastructure. He suggested that this faster access could be achieved by bringing traditionally front office focused, real-time technology into the space, such as flash storage and stacking of the application layer.

Indeed, this is not a new concept and some firms have already begun to take advantage of front office technology in a back and middle office data context. See, for example, UBS’ use of Celoxica’s hardware appliance-based data feed handlers to process its market and reference data. By using technology that traditionally powers algorithmic trading, the bank was able to reduce the number of PCs on which its reference data is processed from 50 down to one, resulting in some rather attractive cost savings.

At the A-Team Insight Exchange event there was a great deal of discussion about the potential of this technology in this context. Halfon in particular was a keen proponent of the innovative use of what he called “architectural best practices” in the data management space. He suggested that loose coupling is a desirable practice and, accordingly, recommended a move from relational databases to distributed data grids in order to keep position and pricing data in memory.
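To make the idea concrete, the following is a minimal, self-contained Java sketch of the kind of loose coupling Halfon described: the risk code depends on a small interface rather than on any particular database or grid vendor, so the storage layer can be swapped without touching the calculation logic. The names here (PriceStore, InMemoryPriceStore, the instrument identifier) are invented for illustration, and a ConcurrentHashMap stands in for a genuine distributed data grid such as Oracle Coherence or Hazelcast.

```java
import java.util.Map;
import java.util.concurrent.ConcurrentHashMap;

// Loose coupling: risk code depends on this small interface rather than
// on any particular database vendor or grid product.
interface PriceStore {
    void put(String instrumentId, double price);
    Double get(String instrumentId);
}

// Stand-in for a distributed data grid (e.g. Oracle Coherence or
// Hazelcast); a ConcurrentHashMap keeps the sketch self-contained.
class InMemoryPriceStore implements PriceStore {
    private final Map<String, Double> prices = new ConcurrentHashMap<>();

    @Override
    public void put(String instrumentId, double price) {
        prices.put(instrumentId, price);
    }

    @Override
    public Double get(String instrumentId) {
        return prices.get(instrumentId);
    }
}

public class GridSketch {
    public static void main(String[] args) {
        PriceStore store = new InMemoryPriceStore();
        store.put("XS0000000001", 101.25); // hypothetical instrument and price
        System.out.println("In-memory price: " + store.get("XS0000000001"));
    }
}
```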

Tweddle agreed that having the ability to keep this data in memory was desirable, as it would reduce the latency of dealing with it from a risk management perspective. Firms would therefore be able to reduce the need for duplicative processes during risk calculations, saving time and effort that could make all the difference in a business context.
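As one hedged illustration of the duplicative work Tweddle alluded to, the short Java sketch below caches each price in memory so that successive risk calculations reuse it rather than repeating an expensive lookup. The class and method names (RiskCalcSketch, priceInstrument, getPrice) are hypothetical, and the pricing call is a placeholder for a database round trip or model run.

```java
import java.util.Map;
import java.util.concurrent.ConcurrentHashMap;

public class RiskCalcSketch {
    // Shared in-memory cache so each instrument is priced at most once
    // per run, however many risk calculations ask for it.
    private final Map<String, Double> priceCache = new ConcurrentHashMap<>();

    // Stand-in for an expensive operation: a database round trip or a
    // pricing model run.
    private double priceInstrument(String instrumentId) {
        System.out.println("Pricing " + instrumentId + " (expensive call)");
        return 100.0; // placeholder value for the sketch
    }

    private double getPrice(String instrumentId) {
        // computeIfAbsent only invokes the expensive call on a cache miss.
        return priceCache.computeIfAbsent(instrumentId, this::priceInstrument);
    }

    public static void main(String[] args) {
        RiskCalcSketch calc = new RiskCalcSketch();
        // Market risk and counterparty risk both need the same price, but
        // only the first request triggers the expensive call.
        double marketRiskInput = calc.getPrice("XS0000000001");
        double counterpartyRiskInput = calc.getPrice("XS0000000001");
        System.out.println(marketRiskInput + " / " + counterpartyRiskInput);
    }
}
```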
