The leading knowledge platform for the financial technology industry

A-Team Insight Blogs

Prioritise Your Projects in Light of the Regulations Coming Down the Pipe, Says Nomura International’s Bannocks

Firms need to get a good handle on the regulations coming down the pipe and how they will impact the data management function, and prioritise them in order of timeframe, said Chris Bannocks, managing director and global head of reference data, Operations, Nomura International. Speaking at this week’s JWG-organised event on next year’s regulatory data-related challenges, Bannocks noted that lead time is a key consideration in deciding when to launch individual regulatory driven data projects.

He explained: “Part of the challenge is in dealing with the timeframes involved in regulatory change – when will the regulation hit and how long is the lead time to prepare for the changes? Also, how quickly will the changes impact your firm once the regulation has been passed? You need to understand the changes that will occur across the industry landscape, prioritise those that will impact you the soonest and develop a strategy to respond to those changes.”

Bannocks described getting a firm’s fundamental reference data right as “a huge challenge” and said that firms need to prioritise their projects in light of the regulations coming down the pipe. As previously noted by A-Team, this indicates that a tactical approach to dealing with the veritable cartload of incoming regulatory requirements is likely to persist within the market. After all, firms have little time, and therefore little choice in the matter, when faced with a wall of regulation.

Fellow panellist Colin Rickard, EMEA managing director of vendor DataFlux, added that he doesn’t see any prospect of strategic approaches being adopted wholesale by the industry in the near future. Tactical implementations of data management solutions will persist due to pressures related to time and cost, although certain areas, such as the data underpinning credit risk infrastructures, will see an increase in funding, he suggested.

However, Rickard warned that the industry needs to move from the current state of “spreadsheet tyranny” to one in which it can rely on data being consistent and accurate across an organisation. “There has been much more attention directed at the idea of benchmarking around data quality,” he said. There is a common seam across regulations such as Basel III and Solvency II around proving that the underlying data is “accurate, appropriate and complete,” which helps the data management cause.
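Benchmarking data quality in the way Rickard describes typically starts with simple, measurable checks. As an illustration only (the field names and sample records below are hypothetical, not drawn from any firm's actual data), a completeness metric for entity records might be sketched like this:

```python
def completeness(records, required_fields):
    """Share of records with every required field populated -- one simple
    benchmark for the 'complete' leg of the accurate/appropriate/complete test."""
    if not records:
        return 0.0
    populated = sum(
        1 for r in records
        if all(r.get(f) not in (None, "") for f in required_fields)
    )
    return populated / len(records)

# Illustrative records; the LEI value is a made-up placeholder.
records = [
    {"lei": "5493001KJTIIGC8Y1R12", "name": "Acme Ltd", "country": "GB"},
    {"lei": "", "name": "Globex", "country": "US"},  # missing LEI
]
score = completeness(records, ["lei", "name", "country"])  # 0.5
```

Accuracy and appropriateness need richer checks (validation against a golden source, fitness-for-purpose rules), but a scorecard of metrics like this one is how firms typically make data quality measurable at all.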

Bannocks added: “We also need to recognise that the industry will not stand still during this time and we will need to adapt as we go along to react to changes such as M&A activity. This is why it is so important to embed policies and data governance strategies into these projects and your technology deployment. Getting your workflow right is one thing but if your governance practices don’t also exist in your IT processes (integration, and matching of similar counterparties for instance) then your data quality will suffer.”
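Bannocks' example of matching similar counterparties in IT processes can be made concrete with a minimal sketch. The normalisation rules and similarity threshold below are assumptions for illustration, not any firm's actual matching logic; production systems use far more sophisticated entity resolution:

```python
from difflib import SequenceMatcher

def normalise(name: str) -> str:
    """Crude normalisation before matching: lower-case and strip
    punctuation and common legal-form suffixes."""
    name = name.lower()
    for token in (",", ".", " ltd", " plc", " inc", " limited"):
        name = name.replace(token, "")
    return " ".join(name.split())

def similarity(a: str, b: str) -> float:
    """Similarity ratio in [0, 1] between two normalised counterparty names."""
    return SequenceMatcher(None, normalise(a), normalise(b)).ratio()

def match_counterparties(candidates, threshold=0.85):
    """Flag pairs of candidate records that likely refer to the same entity,
    so they can be reviewed and merged under the firm's governance policy."""
    flagged = []
    for i, a in enumerate(candidates):
        for b in candidates[i + 1:]:
            if similarity(a, b) >= threshold:
                flagged.append((a, b))
    return flagged

pairs = match_counterparties(["ACME Ltd.", "Acme Limited", "Globex Plc"])
# flags ("ACME Ltd.", "Acme Limited") as a likely duplicate entity
```

The governance point is that rules like the threshold and the merge decision should be policy-driven and auditable, not buried in ad hoc scripts.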

For firms tackling the issue of data quality, Bannocks had some advice: “In order to ensure data quality, you need to get as close to the source as possible and provide a high degree of transparency around your data processes to ensure you have traceability back to source as a minimum. The more layers you introduce to your process (in particular transformation and mapping) the less likely clear traceability will exist.”
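The traceability Bannocks describes is, in data management terms, lineage: each transformation or mapping layer should be recorded so a value can be traced back to its source. A minimal sketch of the idea (the class, step names and feed name are hypothetical, invented for illustration):

```python
from dataclasses import dataclass, field

@dataclass
class TracedValue:
    """A data point that carries its lineage: every transformation appends
    a named step, so the value stays traceable back to its source feed."""
    value: object
    source: str                          # originating feed or system
    lineage: list = field(default_factory=list)

    def apply(self, step_name: str, fn):
        """Apply a transformation and record it, preserving traceability."""
        self.lineage.append(step_name)
        self.value = fn(self.value)
        return self

# A raw quote from a (hypothetical) vendor feed, transformed in two recorded steps.
price = TracedValue(value="102.5", source="vendor_feed_A")
price.apply("parse_decimal", float).apply("scale_to_minor_units", lambda v: v * 100)
# price.value is 10250.0; price.lineage names both steps, back to vendor_feed_A
```

Each extra mapping layer adds a lineage entry here; in real pipelines each layer that fails to record itself is exactly where, as Bannocks warns, clear traceability stops existing.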

Panellists noted that the fundamental argument for investment in data management hasn’t changed much over the last 10 years, except that now there is regulatory compulsion to drive forward that investment. Regulation can therefore pose both a threat and an opportunity to the data management function going forward.

There are real dangers as a result of the timelines involved in regulatory change, however, especially in the area of standards development. As noted recently by A-Team, there needs to be a proper discussion about the global impacts of current standards initiatives beyond just those in the data management sphere: the business needs to be involved too.

Bannocks is of the view that individual national regulators should not act alone in the standards space: “In terms of standardisation, from my perspective, we should push for global standards to be adopted rather than the regulated development of regional standards, which will prove to create more problems than it solves.”

Now is the time for communication on the standards issue, before the window closes. To this end, the industry has just under two months to respond to the proposals regarding how firms will report data to the US Office of Financial Research, including the standards that must be used for identifying counterparties.
