
JWG’s Di Giammarino Talks up Data Practicalities of Risk and Regulatory Onslaught


The long list of incoming regulations, especially those that will require the reinvention of the risk management wheel, has thrown into the spotlight the need for firms to get a better handle on their operational risk, said PJ Di Giammarino, CEO of think tank JWG, at the recent Thomson Reuters pricing event in London. Reiterating his mantra about the need to understand “what good looks like” in terms of data standards, Di Giammarino elaborated on the requirement for a common framework for exposure management.

“A data quality measure is needed in order to be able to improve risk models over time and to prove to regulators that assessments submitted are correct,” he told attendees at the event last week.

With such scrutiny on the cards and in the wake of the credit crunch, firms are beginning to invest in tackling some of their underlying data transparency issues. According to an interactive poll at the event, 56% of attendees said they had seen moderate steps forward within their firms with regard to data transparency, and a further 16% had seen significant progress. However, 17% indicated that they believe regulation will not prove to be a catalyst, while 11% were still waiting for their projects to get under way.

Di Giammarino’s fellow panellists at the Thomson Reuters event indicated that they believe risk management is having even more of an impact on budgets than regulatory change. The desire to reduce operational risk overall is driving firms to invest in getting a better handle on their data in order to be able to actively set their strategies for managing risk and risk tolerances, they agreed.

Previously, firms were held back by cost concerns, but senior management buy-in is now much more forthcoming due to the regulatory and client focus on the quantification of risk management. As noted by Mizuho International’s risk management chief operating officer Simon Tweddle, firms need to produce numbers to prove they have been engaged in practices such as stress testing. This means silos must be broken down in order to gather the multiple sources of relevant data from across the business.

Simon Trewin, an independent consultant, added that the desire for internal transparency is one of the biggest drivers for data management investment in many banks. The shock of Lehman’s collapse has left many firms aware of their internal risk management shortcomings, and even the front office is taking an active interest in risk management practices. “The front office wants more visibility around the management of capital and counterparty risk exposure in order to make smarter decisions,” said Trewin.

Mizuho’s Tweddle added that the tension between timeliness and accuracy of data is another challenge facing financial institutions. He recommended setting tolerances and metrics to monitor overall data quality, and building a more detailed management information system (MIS) with established data attributes.
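By way of illustration only, the sketch below shows one way such tolerances might be expressed: a per-attribute quality metric checked against a minimum threshold and surfaced for MIS reporting. The attribute names, thresholds and the choice of completeness as the metric are assumptions made for the example, not details given by the panel.

# Hypothetical sketch: monitoring data quality against tolerances.
# Attribute names, thresholds and the completeness metric are illustrative.

from dataclasses import dataclass

@dataclass
class Tolerance:
    attribute: str           # data attribute being monitored, e.g. counterparty ID
    min_completeness: float  # minimum acceptable share of populated values (0-1)

def completeness(values):
    """Share of records where the attribute is populated."""
    if not values:
        return 0.0
    populated = sum(1 for v in values if v not in (None, ""))
    return populated / len(values)

def check_quality(records, tolerances):
    """Return attributes breaching their tolerance, for MIS reporting."""
    breaches = {}
    for tol in tolerances:
        score = completeness([r.get(tol.attribute) for r in records])
        if score < tol.min_completeness:
            breaches[tol.attribute] = score
    return breaches

# Example usage with made-up trade records
records = [
    {"counterparty_id": "CP001", "rating": "A"},
    {"counterparty_id": None, "rating": "BBB"},
    {"counterparty_id": "CP003", "rating": ""},
]
tolerances = [Tolerance("counterparty_id", 0.95), Tolerance("rating", 0.90)]
print(check_quality(records, tolerances))  # both attributes breach at ~0.67

In practice firms would track further dimensions such as accuracy and timeliness, but the principle of an agreed threshold per attribute, monitored over time, is the same.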

Of course, this desire for more data on the part of the front office means that data management within downstream systems is more important than ever before. Consistent use of data standards across an organisation is therefore integral to the smooth operation of risk management systems.

However, the lack of standardisation across the industry is a key sticking point in this endeavour, said Di Giammarino. “The ISO process is a starting point and it is open but the industry needs to figure out the standards that are fit for purpose first,” he explained.

JWG has been speaking to the Committee of European Securities Regulators (CESR) in order to begin the standardisation process, he continued. The group has spent a month looking at address formats and is now examining naming conventions.

Despite this regulatory involvement, Tweddle warned that firms should not wait for a policy statement before taking action on data quality, especially given the regulatory community’s recent bent towards using the stick.
