

JWG’s Di Giammarino Talks up Data Practicalities of Risk and Regulatory Onslaught


The long list of incoming regulations, especially those that will require the reinvention of the risk management wheel, has thrown into the spotlight the need for firms to get a better handle on their operational risk, said PJ Di Giammarino, CEO of think tank JWG, at the recent Thomson Reuters pricing event in London. Reiterating his mantra about the need to understand “what good looks like” in terms of data standards, Di Giammarino elaborated upon the requirement for a common framework for exposure management.

“A data quality measure is needed in order to be able to improve risk models over time and to prove to regulators that assessments submitted are correct,” he told attendees at the event last week.

With such scrutiny on the cards and in the wake of the credit crunch, firms are beginning to invest in tackling some of their underlying data transparency issues. According to an interactive poll at the event, 56% of attendees said they had seen moderate steps forward within their firms with regard to data transparency, and a further 16% had seen significant progress. An unlucky 17% indicated that they believe regulation will not prove to be a catalyst, while 11% were still waiting for their projects to get underway.

Di Giammarino’s fellow panellists at the Thomson Reuters event indicated that they believe risk management is having even more of an impact on budgets than regulatory change. The desire to reduce operational risk overall is driving firms to invest in getting a better handle on their data in order to be able to actively set their strategies for managing risk and risk tolerances, they agreed.

Previously, firms were held back by issues of cost, but senior management buy-in is now much more forthcoming due to the regulatory and client focus on the quantification of risk management. As noted by Mizuho International’s risk management chief operating officer Simon Tweddle, firms need to produce numbers to prove they have been engaged in practices such as stress testing. This means silos must be broken down in order to gather together the multiple sources of relevant data from across the business.

Independent consultant Simon Trewin added that the desire for internal transparency is one of the biggest drivers for data management investment in many banks. The shock of Lehman has left many firms aware of their internal risk management shortcomings, and even the front office is taking an active interest in risk management practices. “The front office wants more visibility around the management of capital and counterparty risk exposure in order to make smarter decisions,” said Trewin.

Mizuho’s Tweddle added that the tension between timeliness and accuracy of data is another challenge facing financial institutions. He recommended setting tolerances and metrics to monitor overall data quality, along with a more detailed management information system (MIS) built on established data attributes.
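To illustrate the kind of tolerance-and-metric monitoring Tweddle describes, here is a minimal sketch in Python. The field names, metrics and thresholds (completeness across a few mandatory counterparty attributes, and a 30-day freshness window) are hypothetical assumptions chosen for the example rather than any firm’s actual framework; the point is simply that each quality metric is computed per batch of records and compared against a stated tolerance so breaches can be flagged.

```python
from datetime import date

# Hypothetical tolerances: each data quality metric must stay at or above its threshold.
TOLERANCES = {
    "completeness": 0.98,  # share of records with all mandatory fields populated
    "timeliness": 0.95,    # share of records refreshed within the freshness window
}

# Assumed mandatory attributes and freshness window for the example.
MANDATORY_FIELDS = ("counterparty_id", "legal_name", "country")
MAX_AGE_DAYS = 30


def completeness(records):
    """Fraction of records in which every mandatory field is populated."""
    ok = sum(1 for r in records if all(r.get(f) for f in MANDATORY_FIELDS))
    return ok / len(records)


def timeliness(records, as_of):
    """Fraction of records refreshed within MAX_AGE_DAYS of the reporting date."""
    fresh = sum(1 for r in records if (as_of - r["last_updated"]).days <= MAX_AGE_DAYS)
    return fresh / len(records)


def quality_report(records, as_of):
    """Compare each metric against its tolerance and flag any breach."""
    metrics = {
        "completeness": completeness(records),
        "timeliness": timeliness(records, as_of),
    }
    return {
        name: {
            "value": round(value, 3),
            "tolerance": TOLERANCES[name],
            "breach": value < TOLERANCES[name],
        }
        for name, value in metrics.items()
    }


if __name__ == "__main__":
    # Tiny illustrative batch: the second record has a missing name and stale timestamp.
    sample = [
        {"counterparty_id": "CP001", "legal_name": "Acme Bank", "country": "GB",
         "last_updated": date(2010, 5, 1)},
        {"counterparty_id": "CP002", "legal_name": "", "country": "JP",
         "last_updated": date(2010, 2, 14)},
    ]
    for metric, result in quality_report(sample, as_of=date(2010, 5, 20)).items():
        print(metric, result)
```

In a production setting the metrics and tolerances would be defined per data attribute in the MIS and tracked over time, so that deterioration can be escalated before it feeds through to risk models.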

Of course, this desire for more data on the part of the front office means that data management within downstream systems is more important than ever before. Consistent use of data standards across an organisation is therefore integral to the smooth operation of risk management systems.

However, the lack of standardisation across the industry is a key sticking point in this endeavour, said Di Giammarino. “The ISO process is a starting point and it is open but the industry needs to figure out the standards that are fit for purpose first,” he explained.

JWG has been speaking to the Committee of European Securities Regulators (CESR) in order to begin the standardisation process, he continued. The group has spent a month looking at address formats and is now examining naming conventions.

Despite this regulatory involvement, Tweddle warned that firms should not wait for a policy statement before taking action to deal with data quality, especially given the regulatory community’s recent bent towards using the stick.

