
JWG’s Di Giammarino Talks up Data Practicalities of Risk and Regulatory Onslaught

The long list of incoming regulations, especially those that will require the reinvention of the risk management wheel, has thrown into the spotlight the need for firms to get a better handle on their operational risk, said PJ Di Giammarino, CEO of think tank JWG, at the recent Thomson Reuters pricing event in London. Reiterating his mantra about the need to understand “what good looks like” in terms of data standards, Di Giammarino elaborated on the requirement for a common framework for exposure management.

“A data quality measure is needed in order to be able to improve risk models over time and to prove to regulators that assessments submitted are correct,” he told attendees at the event last week.

With such scrutiny on the cards and in the wake of the credit crunch, firms are beginning to invest in tackling some of their underlying data transparency issues. According to an interactive poll at the event, 56% of attendees said they had seen moderate steps forward being made within their firms with regard to data transparency, and a further 16% had seen significant progress. An unlucky 17% indicated that they believe regulation will not prove to be a catalyst, while 11% were still waiting for their projects to get underway.

Di Giammarino’s fellow panellists at the Thomson Reuters event indicated that they believe risk management is having even more of an impact on budgets than regulatory change. The desire to reduce operational risk overall is driving firms to invest in getting a better handle on their data in order to be able to actively set their strategies for managing risk and risk tolerances, they agreed.

Previously, firms have been held back by issues of cost, but senior management buy-in is now much more forthcoming due to the regulatory and client focus on the quantification of risk management. As noted by Mizuho International’s risk management chief operating officer Simon Tweddle, firms need to produce numbers to prove they have been engaged in practices such as stress testing. This means silos must be broken down in order to gather together the multiple sources of relevant data from across the business.

Simon Trewin, independent consultant, added that the desire for internal transparency is one of the biggest drivers for data management investment in many banks. The shock of Lehman has left many firms aware of their internal risk management shortcomings, and even the front office is taking an active interest in risk management practices. “The front office wants more visibility around the management of capital and counterparty risk exposure in order to make smarter decisions,” said Trewin.

Mizuho’s Tweddle added that the tension between timeliness and accuracy of data is another challenge facing financial institutions. He recommended setting tolerances and metrics to monitor overall data quality, along with a more detailed management information system (MIS) built on established data attributes, as sketched below.
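To make the idea concrete, here is a minimal, hypothetical sketch of how tolerances and metrics might be applied to flag data quality breaches for MIS reporting. The dimension names, threshold values and snapshot figures are invented for illustration and are not drawn from the panel discussion.

```python
# Hypothetical tolerances for three common data quality dimensions.
TOLERANCES = {
    "completeness": 0.98,  # share of records with no missing attributes
    "timeliness": 0.95,    # share of records updated within the agreed window
    "accuracy": 0.99,      # share of records matching the golden source
}

def breached_dimensions(metrics: dict[str, float]) -> list[str]:
    """Return the data quality dimensions that fall below their tolerance."""
    return [dim for dim, floor in TOLERANCES.items()
            if metrics.get(dim, 0.0) < floor]

# Illustrative MIS snapshot for one data set (numbers are invented).
snapshot = {"completeness": 0.97, "timeliness": 0.96, "accuracy": 0.995}
print("Tolerance breaches:", breached_dimensions(snapshot) or "none")
# -> Tolerance breaches: ['completeness']
```

In practice such checks would run against each attribute feeding the risk models, with breaches escalated through the MIS rather than printed to a console.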

Of course, this desire for more data on the part of the front office means that data management within downstream systems is more important than ever before. Consistent use of data standards across an organisation is therefore integral to the smooth operation of risk management systems.

However, the lack of standardisation across the industry is a key sticking point in this endeavour, said Di Giammarino. “The ISO process is a starting point and it is open but the industry needs to figure out the standards that are fit for purpose first,” he explained.

JWG has been speaking to the Committee of European Securities Regulators (CESR) in order to begin the standardisation process, he continued. The group has spent a month looking at address formats and is now examining naming conventions.
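As a rough illustration of what a naming convention buys, the hypothetical sketch below normalises counterparty name variants to a single form. The suffix mappings and rules are invented; the actual conventions being examined by JWG and CESR are not detailed in this article.

```python
import re

# Hypothetical, simplified suffix mappings for one naming convention.
SUFFIX_MAP = {
    "ltd.": "Limited",
    "ltd": "Limited",
    "plc": "PLC",
    "a.g.": "AG",
}

def normalise_entity_name(raw: str) -> str:
    """Collapse whitespace and map common suffix variants to one convention."""
    name = re.sub(r"\s+", " ", raw.strip())
    parts = name.split(" ")
    if parts and parts[-1].lower() in SUFFIX_MAP:
        parts[-1] = SUFFIX_MAP[parts[-1].lower()]
    return " ".join(parts)

print(normalise_entity_name("  Acme   Holdings ltd. "))
# -> Acme Holdings Limited
```

Without an agreed convention of this kind, the same counterparty can appear under several spellings across silos, which is precisely the aggregation problem the panellists described.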

Despite this regulatory involvement, Tweddle warned that firms should not wait for a policy statement before taking action on data quality, especially given the regulatory community’s recent bent towards using the stick.
