Following on from its implementation of EDM vendor SmartCo’s DataHub platform earlier this year to support the reference data requirements of its Global Banking and Markets division, Royal Bank of Scotland (RBS) is continuing to build up its data management capabilities with a view to better supporting its risk management function. Speaking at last week’s JWG regulatory data event, Mark Davies, head of reference data for SSF Risk Services at RBS, elaborated on the progress and aims of the firm’s One Risk project, which he first discussed back in September this year.
When Reference Data Review spoke to Davies earlier this year, he explained that the firm was working to improve the overall quality of its reference data in order to be able to support functions such as risk management. The focus continues to be (much as it was then) on understanding RBS’ current internal capabilities and gradually building on them to enhance the management of the underlying reference data.
Davies indicated that the aim is to be better prepared to deal with the incoming barrage of regulatory requirements by handling them in a more structured and centralised manner, hence the birth of the One Risk project. Much like similar initiatives under way at its contemporaries (see details of Deutsche Bank’s silo reduction drive), RBS is developing a more joined-up approach across silos. “Regulation is coming our way at a pace and we need to be able to react to that; it is not optional,” warned Davies at last week’s event.
He also noted that the global nature of most financial institutions complicates matters from a regulatory standpoint, as different jurisdictional requirements come into play. “Where a firm is headquartered is less important than it once was because firms are required to be present in multiple geographies and to diversify into multiple asset classes. Developments across the globe impact most large financial institutions in some way,” he said.
RBS is seeking to establish confidence that the data used for risk management purposes is accurate and well maintained; a prerequisite for survival in the current environment of intense regulatory scrutiny. Data also needs to be consistent across functions such as risk management and finance to ensure that firms can report accurately to regulators, explained Davies. “A consistent view of data across the organisation is the goal of our current project: that the different silos are sharing the same data from a product and function point of view,” he said.
Echoing the keynote speech given by Deutsche Bank’s Sean Taylor at FIMA this year, Davies explained that any firm’s initial focus should be on getting the reference data basics right at the start. “Being able to report accurately to regulators often comes down to a basic set of data attributes. Getting these attributes correct can allow you to meet 80% of your reporting requirements. However, you also need to understand the interconnectedness between these data items: the interrelationships between the data building blocks,” he said.
However, in today’s market, with fears of a double-dip recession on the horizon, projects that are not driven by regulation remain hard to sell within firms. Tying a project to specific regulatory requirements makes it much easier to secure budget sign-off, but many important infrastructure-focused projects, such as decommissioning legacy systems, are not regulatory driven. Data management teams are therefore often stuck between a rock and a hard place. Davies’ advice for firms forced to launch targeted, regulatory-driven projects and develop local data solutions was to make sure these are housed centrally.
His indication that RBS is seeking to be on the “front foot” with regard to getting the basics in place is also heartening news for the future (even if RBS is facing its share of negative publicity in the mainstream media at the moment). “Regulatory change is raising the visibility of the data management challenge across firms, but there needs to be better communication with the business about the process,” he told attendees at last week’s event. And it is this issue of communication that comes up time and time again with regard to ensuring that the business understands the reference data story.
But it is not just about internal communication; Davies noted that there is a challenge in determining what to benchmark a firm’s data quality against, given the lack of communication across the industry about standards. After all, technology is only part of the problem.
To this end, Davies reckons firms should begin talking about the incoming entity data standards before it is too late. “The legal entity identification developments are an opportunity to be able to shape the way the industry interacts with the regulatory community,” he contended. The end of January deadline for feedback to the Office of Financial Research proposals is therefore a key date to bear in mind.