By Brian Sentance, CEO, Xenomorph
The siloed delivery of data and technology has evolved quite naturally in most financial institutions. Whether the split lies across business units, geographic boundaries, asset classes or job functions, the complexity of most large organisations has led different teams to pursue their own requirements and procurement decisions regarding IT and data.
But a string of recent regulatory initiatives and standards has begun driving convergence across previously siloed systems and data architectures. They include BCBS 239; the Prudent Valuation standards contained in the EU’s Capital Requirements Regulation (CRR); the Basel Committee’s Fundamental Review of the Trading Book (FRTB); and new accounting standards such as IFRS 9 – all of which are breaking down boundaries between front-office, risk and finance systems, and leading firms to adopt a consistent framework for enterprise data management (EDM).
Consistent Framework, Unique Requirements
However, the need for a consistent framework does not equate, by any means, to an inflexible framework. Siloed delivery of IT and data evolved for a reason – because different users have different requirements.
Take the offices of the chief risk officer and chief financial officer. Finance departments are typically more deterministic in their approach. They work to produce monthly and quarterly snapshots, requiring data that conforms to strict rules, such as the official exchange closing price from a particular market, or OTC prices from the front office that are verified against independent third-party sources.
Risk managers, by contrast, tend to be more probabilistic in their approach. They may need similar pricing data, but their data needs to be available more frequently – at least on a daily basis – and aligned not just at market close, but at consistent times across all markets. Equally, they will require granular time series data stretching back over long periods to help model different scenarios and predict the likelihood of future events.
Given these marked differences, any EDM system seeking to bridge the gap and drive consistency across departmental functions will need to accommodate their different requirements and ways of defining data quality, along with timeframes for data preparation.
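To make the contrast concrete, the sketch below (Python, with all field names, thresholds and rule logic purely hypothetical rather than drawn from any particular EDM product) shows how the same end-of-day price might be judged "good" in two different ways: a finance-style check that it is the official close, verified within a tolerance of an independent third-party price, and a risk-style check that it was snapped at a consistent global time and carries enough history for scenario modelling.

```python
from datetime import datetime, timezone

# Hypothetical record for one end-of-day price observation.
price_record = {
    "instrument": "XYZ Equity",
    "price": 101.25,
    "source": "primary_exchange_close",
    "independent_price": 101.30,          # third-party price used for verification
    "snap_time_utc": datetime(2017, 6, 30, 21, 30, tzinfo=timezone.utc),
    "history_days_available": 2600,       # length of stored time series
}

def finance_quality_check(rec, tolerance=0.005):
    """Finance-style rule: official close, verified against an independent source."""
    is_official_close = rec["source"] == "primary_exchange_close"
    deviation = abs(rec["price"] - rec["independent_price"]) / rec["independent_price"]
    return is_official_close and deviation <= tolerance

def risk_quality_check(rec, required_snap_hour_utc=21, min_history_days=2520):
    """Risk-style rule: snapped at a consistent global time, with enough history."""
    snapped_consistently = rec["snap_time_utc"].hour == required_snap_hour_utc
    enough_history = rec["history_days_available"] >= min_history_days
    return snapped_consistently and enough_history

print(finance_quality_check(price_record))  # True: within 0.5% of the independent price
print(risk_quality_check(price_record))     # True: consistent snap time, roughly 10 years of history
```

The point is not the specific thresholds, which are invented here, but that a single EDM framework has to hold both definitions of data quality side by side for the same underlying data item.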
IFRS 9: Data Management Challenges
When it comes to IFRS 9, which becomes mandatory early next year (for reporting periods beginning on or after 1 January 2018), the data management challenges principally fall into three core areas: asset and liability classification; asset impairment and expected credit losses; and hedge accounting.
Each of these areas will require some level of process reengineering, along with a consistent framework to manage data that has been sourced and/or managed by different departments. We recently published a whitepaper going into these challenges in more detail, but here is a quick summary:
1. Asset classification: Firms will need to classify assets and liabilities based on their cashflow characteristics and the business model under which they are held, which in turn determine their accounting treatment (a simplified decision sketch follows this list). While the classification model is, in principle, more straightforward than the IAS 39 model it replaces, nuanced circumstances will still pose data management and governance challenges when the principles are applied in practice. Keeping records that demonstrate the principles have been applied consistently will also be a challenge.
2. Asset impairment and expected credit losses: Under IFRS 9, financial statements will become significantly more sensitive to any deterioration in credit quality, as firms are forced to recognise larger portions of expected credit losses up front (an illustrative expected credit loss calculation is also sketched after this list). Given this heightened reliance on credit risk models, the Basel Committee has issued guidance on accounting for expected credit losses, stressing the importance of validating model inputs, design and outputs (BCBS 350, Principle 5). This points to a growing requirement for EDM systems with the corresponding data validation capabilities.
3. Hedge accounting: The new hedge accounting rules also carry significant data management implications. Firms will need to model the relationship between each hedged item and its corresponding hedging instruments, ensure all qualifying criteria are met, document the hedge properly from the outset, and monitor the hedging relationship to ensure it remains effective. All of these tasks require robust data management tools and processes.
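On classification, here is the deliberately simplified decision sketch referred to in point 1 above (Python, hypothetical function and argument names). It covers debt instruments only: the business model test combined with the "solely payments of principal and interest" (SPPI) cashflow test drives measurement at amortised cost, fair value through other comprehensive income (FVOCI) or fair value through profit or loss (FVTPL). It deliberately ignores the fair value option, equity instruments and financial liabilities.

```python
def classify_debt_instrument(business_model: str, passes_sppi_test: bool) -> str:
    """Simplified IFRS 9 classification for a debt instrument.

    business_model: 'hold_to_collect', 'hold_to_collect_and_sell' or 'other'
    passes_sppi_test: True if contractual cashflows are solely payments of
                      principal and interest (the SPPI test).
    Ignores the fair value option and other designation choices.
    """
    if passes_sppi_test and business_model == "hold_to_collect":
        return "amortised cost"
    if passes_sppi_test and business_model == "hold_to_collect_and_sell":
        return "fair value through other comprehensive income (FVOCI)"
    return "fair value through profit or loss (FVTPL)"

# Illustrative calls
print(classify_debt_instrument("hold_to_collect", True))           # amortised cost
print(classify_debt_instrument("hold_to_collect_and_sell", True))  # FVOCI
print(classify_debt_instrument("hold_to_collect", False))          # FVTPL
```

The data management challenge is less the decision itself than evidencing it: recording the business model assessment, the SPPI analysis and any subsequent reclassification triggers alongside the instrument data.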
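On impairment, one commonly used simplification expresses expected credit loss as the product of probability of default (PD), loss given default (LGD) and exposure at default (EAD), discounted back to the reporting date, with a 12-month horizon for Stage 1 assets and a lifetime horizon for Stage 2 and 3. The sketch below illustrates that structure only; the staging logic, input values and names are assumptions for illustration, not a prescribed IFRS 9 methodology.

```python
def expected_credit_loss(pd_by_year, lgd, ead_by_year, discount_rate, lifetime=True):
    """Illustrative ECL: discounted sum of marginal PD * LGD * EAD per future year.

    pd_by_year: marginal (not cumulative) default probabilities per year
    lgd: loss given default as a fraction of exposure
    ead_by_year: projected exposure at default per year
    lifetime: if False, use only the first year (12-month ECL, Stage 1)
    """
    horizon = len(pd_by_year) if lifetime else 1
    ecl = 0.0
    for t in range(horizon):
        discount_factor = 1.0 / (1.0 + discount_rate) ** (t + 1)
        ecl += pd_by_year[t] * lgd * ead_by_year[t] * discount_factor
    return ecl

# Hypothetical 3-year loan: marginal PDs, 45% LGD, amortising exposure, 3% discount rate
pd_by_year = [0.02, 0.03, 0.04]
ead_by_year = [1000000, 700000, 400000]

stage1_ecl = expected_credit_loss(pd_by_year, 0.45, ead_by_year, 0.03, lifetime=False)
lifetime_ecl = expected_credit_loss(pd_by_year, 0.45, ead_by_year, 0.03, lifetime=True)
print(round(stage1_ecl), round(lifetime_ecl))  # 12-month ECL vs lifetime ECL
```

Even in this toy version the data management burden behind point 2 is visible: PDs, LGDs, exposures and discount rates typically come from different models and systems, and each input needs to be validated and traceable in line with the BCBS guidance cited above.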
Although some aspects of IFRS 9 seem to be more pragmatic and principles-based (such as the asset classification model), others will require significantly more heavy lifting. Probably the biggest impact will be felt in the measurement of asset impairment (a recent EBA impact assessment predicted European banks would need to increase their credit loss provisions by 18%).
Firms required to comply will need to ensure they have systems and processes in place to validate data inputs and outputs of their credit risk models as financial statements become significantly more sensitive to any deterioration in credit quality. Even if the Tier 1 capital impact of IFRS 9 ends up being phased in (as proposed by the EC), the increased transparency of expected credit losses means data quality will be a concern from day one.
Given that International Financial Reporting Standards are widely adopted across the globe (IFRSs are required in the majority of G20 countries, although notable exceptions include the US and Japan), and that IFRS 9 impacts a broad range of firms (buy-side as well as sell-side), we would expect a significant amount of work over the course of this year to ensure systems, processes and underlying data architectures are in place to support compliance.