
A-Team Insight Blogs

JWG’s Di Giammarino Talks up Data Practicalities of Risk and Regulatory Onslaught


The long list of incoming regulations, especially those that will require the reinvention of the risk management wheel, has thrown into the spotlight the need for firms to get a better handle on their operational risk, said PJ Di Giammarino, CEO of think tank JWG, at the recent Thomson Reuters pricing event in London. Reiterating his mantra about the need to understand “what good looks like” in terms of data standards, Di Giammarino elaborated on the requirement for a common framework for exposure management.

“A data quality measure is needed in order to be able to improve risk models over time and to prove to regulators that assessments submitted are correct,” he told attendees at the event last week.
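Di Giammarino did not prescribe a particular measure, but a minimal sketch of the kind of quality score that could be tracked per reporting period might look like the following. The completeness and accuracy metrics, the field names (instrument_id, price) and the sample values are illustrative assumptions, not a description of JWG’s framework.

```python
from dataclasses import dataclass

@dataclass
class QualityScore:
    completeness: float  # share of records with all mandatory fields populated
    accuracy: float      # share of records whose price matches a reference ("golden") source
    period: str          # reporting period the score covers

def score_period(records, mandatory, reference, period):
    """Compute simple completeness and accuracy ratios for one reporting period."""
    if not records:
        return QualityScore(0.0, 0.0, period)
    complete = sum(all(r.get(f) not in (None, "") for f in mandatory) for r in records)
    accurate = sum(r.get("price") == reference.get(r.get("instrument_id")) for r in records)
    return QualityScore(complete / len(records), accurate / len(records), period)

# Comparing successive periods gives an evidence trail that the inputs to
# risk models are (or are not) improving over time.
reference = {"XS0123456789": 101.25}
q1 = score_period([{"instrument_id": "XS0123456789", "price": 101.20}],
                  ["instrument_id", "price"], reference, "2010-Q1")
q2 = score_period([{"instrument_id": "XS0123456789", "price": 101.25}],
                  ["instrument_id", "price"], reference, "2010-Q2")
print(q1)
print(q2)
```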

With such scrutiny on the cards and in the wake of the credit crunch, firms are beginning to invest in tackling some of their underlying data transparency issues. According to an interactive poll at the event, 56% of attendees said they had seen moderate steps forward within their firms with regard to data transparency, and a further 16% had seen significant progress. An unlucky 17% indicated that they do not believe regulation will prove to be a catalyst, while 11% were still waiting for their projects to get underway.

Di Giammarino’s fellow panellists at the Thomson Reuters event indicated that they believe risk management is having even more of an impact on budgets than regulatory change. The desire to reduce operational risk overall is driving firms to invest in getting a better handle on their data in order to be able to actively set their strategies for managing risk and risk tolerances, they agreed.

Previously, firms were held back by issues of cost, but senior management buy-in is now much more forthcoming due to the regulatory and client focus on the quantification of risk management. As noted by Mizuho International’s risk management chief operating officer Simon Tweddle, firms need to produce numbers to prove they have been engaged in practices such as stress testing. This means silos must be broken down in order to gather together the multiple sources of relevant data from across the business.

Simon Trewin, an independent consultant, added that the desire for internal transparency is one of the biggest drivers for data management investment in many banks. The shock of Lehman has left many firms aware of their internal risk management shortcomings, and even the front office is taking an active interest in risk management practices. “The front office wants more visibility around the management of capital and counterparty risk exposure in order to make smarter decisions,” said Trewin.

Mizuho’s Tweddle added that the tension between timeliness and accuracy of data is another challenge facing financial institutions. He recommended setting tolerances and metrics to monitor overall data quality, along with a more detailed management information system (MIS) with established data attributes.
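Tweddle gave no implementation detail, but a minimal sketch of such tolerance-based monitoring could look like the example below, where each data attribute has a staleness tolerance (timeliness) and an error-rate tolerance (accuracy), and any breaches are surfaced for MIS reporting. The attribute names and thresholds are hypothetical assumptions, not Mizuho’s actual settings.

```python
from datetime import datetime, timedelta

# Illustrative tolerances per data attribute; names and thresholds are assumptions.
TOLERANCES = {
    "counterparty_rating": {"max_staleness": timedelta(hours=24), "max_error_rate": 0.01},
    "market_price": {"max_staleness": timedelta(minutes=15), "max_error_rate": 0.001},
}

def check_attribute(name, last_updated, error_rate, now):
    """Return any tolerance breaches for one attribute, for inclusion in an MIS report."""
    tol = TOLERANCES[name]
    breaches = []
    staleness = now - last_updated
    if staleness > tol["max_staleness"]:
        breaches.append(f"{name}: data is {staleness} old, tolerance is {tol['max_staleness']}")
    if error_rate > tol["max_error_rate"]:
        breaches.append(f"{name}: error rate {error_rate:.3%} exceeds {tol['max_error_rate']:.3%}")
    return breaches

# Example: a price feed that is two hours stale but within its error-rate tolerance.
now = datetime(2010, 5, 20, 12, 0)
print(check_attribute("market_price", now - timedelta(hours=2), 0.0005, now))
```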

Of course, this desire for more data on the part of the front office means that data management within downstream systems is more important than ever before. Consistent use of data standards across an organisation is therefore integral to the smooth operation of risk management systems.

However, the lack of standardisation across the industry is a key sticking point in this endeavour, said Di Giammarino. “The ISO process is a starting point and it is open but the industry needs to figure out the standards that are fit for purpose first,” he explained.

JWG has been speaking to the Committee of European Securities Regulators (CESR) in order to begin the standardisation process, he continued. The group has spent a month looking at address formats and is now examining naming conventions.

Despite this regulatory involvement, Tweddle warned that firms should not wait for a policy statement before they take action to deal with data quality, especially given the regulatory community’s recent bent towards using the stick.
