The leading knowledge platform for the financial technology industry

A-Team Insight Blogs

Talking Reference Data with Andrew Delaney: Riding the Regulatory Wave

Some years ago, I debated with the president of a well-known enterprise data management platform provider about whether the market’s obsession with risk represented the ‘killer app’ the EDM segment had long been searching for.

He wasn’t entirely convinced; I, of course, was. And although there are many good reasons to embark on an EDM project, I still believe that the market’s need for robust risk information – and the need to demonstrate that robustness to regulators – remains the single compelling justification for funding of data management projects in capital markets.

My case was bolstered this week by a conversation with Stephen Vinnicombe, partner at Capco, who describes the current scrutiny of our marketplace as “a uniquely intense point for banking.”

Vinnicombe believes data and data management will play an integral part in what he sees as major structural change within banking organisations over the next five years. He believes the current focus on regulatory compliance – citing the incoming Basel III, EMIR and FATCA regs, among others – is forcing banks to reassess their approach to risk reporting and the underlying data it requires.

Current bank thinking, Vinnicombe reckons, is characterised by a poor understanding of data, data sources and how data should be interpreted to fulfil the internal and external requirements, both of which he regards as compelling. With respect to the internal requirement, Vinnicombe says the scarcity of capital is forcing firms to choose between trades where before they would have executed both. This places greater emphasis on decision-making, which in turn increases the importance of the data supporting it.

With respect to the external requirement – in essence, the regulatory reporting requirement – Vinnicombe believes the market will undergo massive standardisation in terms of the data it uses in order to meet regulators’ demands. There has been progress in this area. Vinnicombe points to the uptake of XBRL as an example, and of course there is the ongoing effort to establish a global legal entity identifier (LEI) and to adopt ISO 20022 for corporate actions messaging. But Vinnicombe expects regulators to become more prescriptive in terms of the data they want to see from industry participants.

But despite these imperatives, Vinnicombe describes the landscape for funding of risk- and data-related projects as “bleak”. Part of the issue is the sheer volume of regulatory work, which is pushing firms to act, but not necessarily in a strategic way. Too often, Vinnicombe says, firms are ticking boxes and meeting requirements on the cheap. The result is a missed opportunity to reengineer internal data processes for competitive advantage.

Vinnicombe says one major banking organisation is bucking that trend, earmarking some $1 billion a year over the next seven years to reengineer its risk management operation, and the data management infrastructure that underpins it. This project – at an undisclosed Capco client – is “a triumph of governance,” says Vinnicombe. The envisioned result will be an assortment of Holy Grails: real-time P&L, capital impact reports, T+0 reporting and so on.

For the most part, Vinnicombe reckons, firms are approaching the data management funding challenge through stealth – as flagged at our recent Data Management Summits in London. Practitioners are dressing up their data management projects as regulatory compliance projects. If they are joined up in their thinking and implementation, that’s a benefit.

The regulatory wave may be breaking, but there will be more to do, not less, once it’s receded from the beach.

Read a related whitepaper on the subject of Getting a Grip on Fragmented Risk Data – A Holistic Approach to Risk Information here.
