About a-team Marketing Services
The knowledge platform for the financial technology industry

A-Team Insight Blogs

JWG’s Di Giammarino Talks up Data Practicalities of Risk and Regulatory Onslaught


The long list of incoming regulations, especially those that will require the reinvention of the risk management wheel, has thrown into the spotlight the need for firms to get a better handle on their operational risk, said PJ Di Giammarino, CEO of think tank JWG, at the recent Thomson Reuters pricing event in London. Reiterating his mantra about the need to understand “what good looks like” in terms of data standards, Di Giammarino elaborated upon the requirement for a common framework for exposure management.

“A data quality measure is needed in order to be able to improve risk models over time and to prove to regulators that the assessments submitted are correct,” he told attendees at the event last week.

With such scrutiny on the cards and in the wake of the credit crunch, firms are beginning to invest in tackling some of their underlying data transparency issues. According to an interactive poll at the event, 56% of attendees said they had seen moderate steps forward within their firms with regard to data transparency, and a further 16% had seen significant progress. An unlucky 17% indicated that they believe regulation will not prove to be a catalyst, while 11% were still waiting for their projects to get underway.

Di Giammarino’s fellow panellists at the Thomson Reuters event indicated that they believe risk management is having even more of an impact on budgets than regulatory change. The desire to reduce operational risk overall is driving firms to invest in getting a better handle on their data in order to be able to actively set their strategies for managing risk and risk tolerances, they agreed.

Previously, firms have been held back by issues of cost, but senior management buy-in is now much more forthcoming due to the regulatory and client focus on the quantification of risk management. As noted by Mizuho International’s risk management chief operating officer Simon Tweddle, firms need to produce numbers to prove they have been engaged in practices such as stress testing. This means silos must be broken down in order to gather together the multiple sources of relevant data from across the business.

Simon Trewin, independent consultant, added that the desire for internal transparency is one of the biggest drivers for data management investment in many banks. The shock of Lehman’s collapse has left many firms aware of their internal risk management shortcomings, and even the front office is taking an active interest in risk management practices. “The front office wants more visibility around the management of capital and counterparty risk exposure in order to make smarter decisions,” said Trewin.

Mizuho’s Tweddle added that the tension between timeliness and accuracy of data is another challenge facing financial institutions. He recommended setting tolerances and metrics to monitor overall data quality, along with a more detailed management information system (MIS) with established data attributes.

Of course, this desire for more data on the part of the front office means that data management within downstream systems is more important than ever before. Consistent use of data standards across an organisation is therefore integral to the smooth operation of risk management systems.

However, the lack of standardisation across the industry is a key sticking point in this endeavour, said Di Giammarino. “The ISO process is a starting point and it is open but the industry needs to figure out the standards that are fit for purpose first,” he explained.

JWG has been speaking to the Committee of European Securities Regulators (CESR) in order to begin the standardisation process, he continued. The group has spent a month looking at address formats and is now examining naming conventions.

Despite this regulatory involvement, Tweddle warned that firms should not wait for a policy statement before they take action to deal with data quality, especially given the regulatory community’s recent bent towards using the stick.
