Firms need to get a good handle on the regulations coming down the pipe and how they will impact the data management function, and to prioritise them by timeframe, said Chris Bannocks, managing director and global head of reference data, Operations, Nomura International. Speaking at this week’s JWG-organised event on next year’s regulatory data-related challenges, Bannocks noted that lead time is a key consideration in deciding when to launch individual regulation-driven data projects.
He explained: “Part of the challenge is in dealing with the timeframes involved in regulatory change – when will the regulation hit and how long is the lead time to prepare for the changes? Also, how quickly will the changes impact your firm once the regulation has been passed? You need to understand the changes that will occur across the industry landscape, prioritise those that will impact you the soonest and develop a strategy to respond to those changes.”
Bannocks described getting a firm’s fundamental reference data right as “a huge challenge” and said that firms need to prioritise their projects in light of the regulations coming down the pipe. As previously noted by A-Team, this suggests that a tactical approach to dealing with the veritable cartload of incoming regulatory requirements is likely to persist within the market. After all, firms have little time, and therefore little choice in the matter, when faced with a wall of regulation.
Fellow panellist Colin Rickard, EMEA managing director of vendor DataFlux, added that he sees no prospect of strategic approaches being adopted wholesale by the industry in the near future. Tactical implementations of data management solutions will persist due to time and cost pressures, although certain areas, such as the data underpinning credit risk infrastructures, will see an increase in funding, he suggested.
However, Rickard warned that the industry needs to move beyond the current state of “spreadsheet tyranny” so that it can rely on data being consistent and accurate across an organisation. “There has been much more attention directed at the idea of benchmarking around data quality,” he said. There is a common seam across regulations such as Basel III and Solvency II around proving that the underlying data is “accurate, appropriate and complete,” which helps the data management cause.
Bannocks added: “We also need to recognise that the industry will not stand still during this time and we will need to adapt as we go along to react to changes such as M&A activity. This is why it is so important to embed policies and data governance strategies into these projects and your technology deployment. Getting your workflow right is one thing but if your governance practices don’t also exist in your IT processes (integration, and matching of similar counterparties for instance) then your data quality will suffer.”
For firms tackling the issue of data quality, Bannocks had some advice: “In order to ensure data quality, you need to get as close to the source as possible and provide a high degree of transparency around your data processes to ensure you have traceability back to source as a minimum. The more layers you introduce to your process (in particular transformation and mapping) the less likely clear traceability will exist.”
Panellists noted that the fundamental argument for investment in data management hasn’t changed much over the last 10 years, except that there is now regulatory compulsion driving that investment forward. Regulation can therefore pose both a threat and an opportunity to the data management function going forward.
There are real dangers as a result of the timelines involved in regulatory change, however, especially in the area of standards development. As noted recently by A-Team, there needs to be a proper discussion about the global impacts of current standards initiatives beyond just those in the data management sphere: the business needs to be involved too.
Bannocks is of the view that individual national regulators should not act alone in the standards space: “In terms of standardisation, from my perspective, we should push for global standards to be adopted rather than the regulated development of regional standards, which will prove to create more problems than it solves.”
Now is the time for communication on the standards issue, before the window closes. To this end, the industry has just under two months to respond to the proposals regarding how firms will report data to the US Office of Financial Research, including the standards that must be used for identifying counterparties.