
A-Team Insight Blogs

Talking Reference Data with Andrew Delaney: Riding the Regulatory Wave


Some years ago, I debated with the president of a well-known enterprise data management platform provider whether the market’s obsession with risk represented the ‘killer app’ the EDM segment had long been searching for.

He wasn’t entirely convinced; I, of course, was. And although there are many good reasons to embark on an EDM project, I still believe that the market’s need for robust risk information – and the need to demonstrate that robustness to regulators – remains the single most compelling justification for funding data management projects in capital markets.

My case was bolstered this week by a conversation with Stephen Vinnicombe, partner at Capco, who describes the current scrutiny of our marketplace as “a uniquely intense point for banking.”

Vinnicombe believes data and data management will play an integral part in what he sees as major structural change within banking organisations over the next five years. He believes the current focus on regulatory compliance – citing the incoming Basel III, EMIR and FATCA regs, among others – is forcing banks to reassess their approach to risk reporting and the underlying data it requires.

Current bank thinking, Vinnicombe reckons, is characterised by a poor understanding of data, data sources and how data should be interpreted to fulfil internal and external requirements, both of which he regards as compelling. With respect to the internal requirement, Vinnicombe says the scarcity of capital is forcing firms to choose between trades where before they would have executed both. This places greater emphasis on decision-making, which in turn increases the importance of the data that supports it.

With respect to the external requirement – in essence, the regulatory reporting requirement – Vinnicombe believes the market will undergo massive standardisation in terms of the data it uses in order to meet regulators’ demands. There has been progress in this area. Vinnicombe points to the uptake of XBRL as an example, and of course there is the ongoing effort to establish a global legal entity identifier (LEI) and to adopt ISO 20022 for corporate actions messaging. But Vinnicombe expects regulators to become more prescriptive in terms of the data they want to see from industry participants.

But despite these imperatives, Vinnicombe describes the landscape for funding of risk- and data-related projects as “bleak”. Part of the issue is the sheer volume of regulatory work, which is pushing firms to act, but not necessarily in a strategic way. Too often, Vinnicombe says, firms are ticking boxes and meeting requirements on the cheap. The result is a missed opportunity to reengineer internal data processes for competitive advantage.

Vinnicombe says one major banking organisation is bucking that trend, earmarking some $1 billion a year over the next seven years to reengineer its risk management operation, and the data management infrastructure that underpins it. This project – at an undisclosed Capco client – is “a triumph of governance,” says Vinnicombe. The envisioned result will be an assortment of Holy Grails: real-time P&L, capital impact reports, T+0 reporting and so on.

For the most part, Vinnicombe reckons, firms are approaching the data management funding challenge through stealth – as flagged at our recent Data Management Summits in London. Practitioners are dressing up their data management projects as regulatory compliance projects. If those projects are joined up in their thinking and implementation, so much the better.

The regulatory wave may be breaking, but there will be more to do, not less, once it’s receded from the beach.

Read a related whitepaper on the subject: Getting a Grip on Fragmented Risk Data – A Holistic Approach to Risk Information.

