By Norbert Boon, global head of solutions at PaceMetrics
While the liquidity requirements of the impending Basel III financial reform have been pared back, the regulatory reporting demands on banks remain a vast challenge – especially in terms of data management. In this article, Norbert Boon, global head of solutions at PaceMetrics, looks at how banks can best cope in the risk averse, more transparent new world order, while maintaining a sharp competitive edge.
As regulators around the globe clamp down on liquidity and capital, financial institutions are under increasing strain. Not only do they have to hold more capital and liquidity and report on this daily, but they also need to demonstrate the processes and controls used to put together reliable information. Never before have risk and data management been so interwoven, nor the quality of data so important. This places the governance of data firmly under the spotlight.
Siloed business units, manual processes and a lack of transparency are sadly still common traits in investment banks today. These have created operational hurdles for many as they scramble to appease the regulators and cope with shifting regulatory goalposts. The gut reaction is to throw more resource at the issue, making compliance prohibitively costly at a time when budgets are strictly limited. Spiralling compliance costs are clearly not a sustainable answer. Instead, banks need to find a balance between compliance and competitiveness – meeting the regulatory requirements while maintaining business as usual.
Banks have always had to report capital at risk on a daily basis. However, the fines that come from falling foul of central bank reporting stipulations are relatively small when compared with the margins many banks still earn. Moreover, headline figures are no longer simply accepted at face value – what lies behind them matters just as much. The regulatory focus now incorporates the audit trail and business process used, which has disrupted everyday operations and brought independence and transparency to the fore.
All too often, sell side institutions find themselves in a situation where they can’t distinguish a good instrument price from a bad one, or even tell who issued the instrument. If they can’t do this, demonstrating independent pricing is near impossible. This inability can largely be traced back to poor data quality and an inflexible approach to data management.
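To make the idea of distinguishing a good price from a bad one concrete, the check below is a minimal, illustrative sketch: each instrument price is compared against the cross-vendor consensus and flagged if it deviates beyond a tolerance. The vendor names and the 2% threshold are hypothetical examples, not anything prescribed in this article.

```python
from statistics import median

def flag_suspect_prices(quotes: dict[str, float], tolerance: float = 0.02) -> list[str]:
    """Return the vendors whose quote deviates from the median
    consensus by more than the given tolerance (hypothetical rule)."""
    consensus = median(quotes.values())
    return [
        vendor
        for vendor, price in quotes.items()
        if abs(price - consensus) / consensus > tolerance
    ]

# Hypothetical vendor quotes for one instrument
quotes = {"VendorA": 101.2, "VendorB": 101.4, "VendorC": 98.1}
print(flag_suspect_prices(quotes))  # → ['VendorC']
```

A real control would of course be richer – staleness checks, issuer cross-referencing, tolerance bands per asset class – but even this simple shape makes the verification rule explicit and auditable rather than a manual judgement call.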
These challenges are compounded by the fact that very often, data management falls under the responsibility of IT. The problem remains though that IT doesn’t have the specific front and middle office insight needed to ensure that data is managed according to business requirements. In these times of stringent regulation, banks need to get their data in order if they are to conquer the art of regulatory reporting. This means taking a business process led approach to data governance, based on the principles of independence and transparency.
When it comes to price data, it could be argued that banks can never be truly independent because prices are sourced externally. However, the buy side’s ramped-up approach to valuation verification – on top of the requirements of MiFID II – has greatly increased the need to prove that appropriate checks and controls have been applied.
Banks can only prove this degree of independence by ensuring that business rules have been incorporated into data management. These rules or service level agreements will be specific to the bank – such as the guaranteed delivery times of prices from specific countries. Most banks haven’t yet joined this up to the data management process, which is what makes it so difficult to determine what went wrong or gain an early warning of potential issues. As a result, what could have been addressed as a small discrepancy often snowballs into a big problem.
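As an illustration of joining such rules up to the data management process, the sketch below encodes per-market delivery SLAs – such as a guaranteed delivery time for prices from a specific country – as data, and checks actual arrivals against them to give the early warning the article describes. The market names and cut-off times are hypothetical examples.

```python
from datetime import time

# Hypothetical per-market SLA cut-offs for price delivery
SLA_CUTOFFS = {"UK": time(18, 0), "Japan": time(9, 30)}

def sla_breaches(arrivals: dict[str, time]) -> dict[str, str]:
    """Return markets whose price delivery arrived after the
    agreed cut-off, with a human-readable reason."""
    breaches = {}
    for market, arrived in arrivals.items():
        cutoff = SLA_CUTOFFS.get(market)
        if cutoff is not None and arrived > cutoff:
            breaches[market] = f"arrived {arrived}, SLA cut-off {cutoff}"
    return breaches

# UK prices arrived 45 minutes late; Japan was on time
print(sla_breaches({"UK": time(18, 45), "Japan": time(9, 0)}))
```

The point is less the code than the design: once the SLA lives alongside the data flow rather than in a separate document, a late feed surfaces as a small, early discrepancy instead of snowballing into a big problem.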
Equally important is being able to draw up a full audit trail of the data. This should not only show exactly where the data has come from, but must also provide real-time information about all data events, such as errors, new product take-ons and model updates. In addition, such an audit trail will demonstrate that both the business and regulatory rules have been adhered to throughout the process.
Continuous monitoring of the complete – and preferably centralised – data governance process is also paramount. In taking this measure, banks will achieve better compliance and drive down operational risk by giving business users unparalleled transparency.
These best practices – particularly if conducted across price, reference, counterparty and corporate actions data – are key to mastering the compliance and competitive balancing act I referred to earlier. This level of data governance will help departments that face extensive reporting, such as risk and asset and liability management, meet their daily regulatory obligations more efficiently. It will also enable the institution to plan effectively and dedicate appropriate resource to the more extensive month- or quarter-end reporting. In turn, higher levels of data quality and the ability to demonstrate independence, good governance and controls will go a long way in helping the institution improve its standing with the central bank.