
LIBOR’s End Should be a New Beginning for Corporate Treasuries’ Data Management

By Neil Sandle, Head of Product Management, Alveo.

For many treasuries, LIBOR (the London Interbank Offered Rate) is one of their most critical benchmarks. Together with the exchange rates of major currencies, it is an essential piece of data, underpinning contracts worth trillions of dollars.

The long-standing centrality of LIBOR is why well-publicised global moves to end its use are giving corporate treasurers and their teams such major data dilemmas. Termination of LIBOR will have far-reaching consequences for everything from swaps and other derivative contracts to the functioning of supply chain and trade finance and the market in business and personal loans. Accounting, tax, risk management, funding, liquidity and legal liabilities are all affected by the new reference rates that are set to replace LIBOR.

The staged transition to new benchmarks from the end of 2021, such as SOFR for the US dollar, SONIA for sterling and €STR for the euro, is proving difficult, however. LIBOR has been part of valuation, pricing and risk management processes for decades, and its replacements must establish credibility. LIBOR is forward-looking whereas SOFR, for example, is backward-looking (and based on a larger number of underlying transactions). The timing and term structure of the new reference rates can also differ, which complicates financing and hedging calculations. Building credibility with corporate treasurers is tough, and it comes at a time when all companies are struggling to readjust after the pandemic.

Companies around the globe are preparing for this major change on all fronts, using new technology to streamline their analysis and optimise their responses. Many firms are, for example, using natural language processing to scan large numbers of documents and identify legal clauses and obligations relating to LIBOR. The software picks out key words and phrases that should alert companies to LIBOR references, flagging them for further examination, as in the sketch below.
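As a minimal sketch of that keyword-scanning step, the Python snippet below flags clauses that mention benchmark-related terms for legal review. The term list, sentence-splitting logic and sample clause are illustrative assumptions; production systems use far richer NLP models and document pipelines.

```python
import re

# Benchmark-related terms a scan might flag for further legal review.
# The term list is illustrative, not exhaustive.
LIBOR_TERMS = re.compile(
    r"\b(LIBOR|London\s+Interbank\s+Offered\s+Rate|fallback\s+rate|"
    r"screen\s+rate|reference\s+rate)\b",
    re.IGNORECASE,
)

def flag_clauses(document: str) -> list[str]:
    """Return the sentences that mention a benchmark term and need review."""
    sentences = re.split(r"(?<=[.;])\s+", document)
    return [s for s in sentences if LIBOR_TERMS.search(s)]

# Invented sample clause, for illustration only.
contract = (
    "Interest shall accrue at LIBOR plus 1.50% per annum. "
    "Payments are due on the first business day of each month. "
    "If the screen rate is unavailable, a fallback rate applies."
)
for clause in flag_clauses(contract):
    print("REVIEW:", clause)
```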

The ‘to do’ list extends much further, however. Companies must work through the full impact of new benchmarks on their operations and cashflow management, assessing which contracts require amendment and how they can reconfigure their financial exposure. Some jurisdictions, such as the State of New York, have recently introduced legal backstops to manage the transition. Risk management is critical but depends on analysis of historical data and contextual information such as stock market performance. This is far more problematic when a new benchmark renders historical data less relevant, making risk scenarios difficult to create with confidence. In addition, the transition from one regime of reference rates to another can cause spikes in the historical data.
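To illustrate the splice problem, the sketch below builds one continuous proxy history by appending spread-adjusted compounded SOFR to the old LIBOR fixings, using the ISDA fallback spread for three-month US dollar LIBOR (26.161 basis points). The fixings shown are invented for illustration, and real adjustments are more involved than adding a constant.

```python
from datetime import date

# Hypothetical daily fixings, in percent. Real histories would come from
# a market data platform; these values are invented for illustration.
libor_3m = {date(2021, 12, 30): 0.209, date(2021, 12, 31): 0.214}
sofr_3m_compound = {date(2022, 1, 3): 0.050, date(2022, 1, 4): 0.052}

# ISDA fallback spread adjustment for 3M USD LIBOR, in percent (26.161 bps).
FALLBACK_SPREAD = 0.26161

def spliced_history(libor: dict, sofr: dict) -> dict:
    """One continuous proxy series: LIBOR fixings as published, then
    compounded SOFR plus the fixed fallback spread after cessation."""
    series = dict(libor)
    for day, rate in sofr.items():
        series[day] = rate + FALLBACK_SPREAD
    return dict(sorted(series.items()))

for day, rate in spliced_history(libor_3m, sofr_3m_compound).items():
    print(day.isoformat(), f"{rate:.3f}%")
```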

In the financial sector, firms need to fully digest the new interest rate benchmarks in their market data management. They must reconfigure any derived data, such as interest rate curves and other constructs that hinge on those benchmarks, and service the downstream applications, business users, analytical models and reporting processes that use that information. Most of all, they need to pinpoint precisely what has to change internally as they prepare to move away from LIBOR.
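One pragmatic way to pinpoint what must change is to record which derived datasets hinge on which benchmarks and walk the dependencies. The sketch below assumes a hypothetical registry; the dataset names are invented.

```python
# Hypothetical dependency registry: each derived dataset lists the
# inputs it is built from. Names are illustrative only.
DEPENDS_ON = {
    "USD-swap-curve": ["USD-LIBOR-3M"],
    "USD-discount-curve": ["USD-swap-curve"],
    "counterparty-var-report": ["USD-discount-curve", "FX-majors"],
}

def impacted_by(benchmark: str) -> set[str]:
    """Return every dataset that transitively depends on `benchmark`."""
    impacted, frontier = set(), {benchmark}
    while frontier:
        # Datasets whose direct inputs touch the current frontier.
        nxt = {name for name, inputs in DEPENDS_ON.items()
               if frontier & set(inputs)} - impacted
        impacted |= nxt
        frontier = nxt
    return impacted

print(impacted_by("USD-LIBOR-3M"))
# -> {'USD-swap-curve', 'USD-discount-curve', 'counterparty-var-report'}
```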

The effectiveness of all these actions depends almost entirely on data quality. Without data quality intelligence, organisations are in danger of adjusting to life after LIBOR using inaccurate or low-quality information.

Firms need complete visibility of their data, its management and its lineage, so they can see immediately where it has come from and assess its quality instantaneously. They must be able to drill down into the data in each process and see where quality is insufficient and remedial action is required. Poor data undermines confidence in risk scenarios to the point where they are no longer trusted at all.

Confidence in a smooth transition is essential. When end-users of the data, auditors or regulatory bodies raise questions as part of their supervisory role, treasuries need to provide answers that are not just accurate but credible and convincing.

The problem for many treasuries is that LIBOR reform has come just as they are struggling with economic recovery from the pandemic, and with ever-growing numbers of data sets from a more digitised world. This places a high premium on data integration, which needs to be fully optimised so organisations can make the right business decisions based on hard evidence. Data is, after all, worthless if it is not available for use or has unclear provenance. Data quality and the tracking of contextual information are key, as are the capability, speed and reliability with which data is integrated and prepared for end-users, whether to feed valuation and risk models, reports or treasury applications.

As the corporate world moves away from LIBOR rates, data services will need to bring a more detailed and more transparent understanding of data lineage: where data comes from, what the ultimate sources are and what quality checks took place along the way. The more that companies understand data flows and data transformation events, the more they can augment and enrich the data sets that treasuries will rely on for everything from day-to-day operations to the analytics and risk scenarios that inform board-level strategy.
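A lineage record can be as simple as a structured note of source, transformations and checks attached to each data item. The sketch below is one minimal way to capture that; the field names, identifiers and checks are illustrative assumptions, not a specific vendor's schema.

```python
from dataclasses import dataclass, field

@dataclass
class LineageRecord:
    """Provenance for one data item: its origin, the transformations
    applied, and the quality checks it passed along the way."""
    item: str
    source: str
    transformations: list[str] = field(default_factory=list)
    checks_passed: list[str] = field(default_factory=list)

record = LineageRecord(
    item="GBP-SONIA-fixing-2021-12-31",  # invented identifier
    source="administrator feed",
)
record.transformations.append("converted to annualised percent")
record.checks_passed.append("stale-value check")
record.checks_passed.append("tolerance check vs secondary source")
print(record)
```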

The traditional information management functions of data sourcing, mastering and quality management have to be rethought, as AI and machine learning models deployed to supplement treasury insight can go wildly awry if fed with inaccurate data. This makes it all the more necessary to track metadata, quality statistics and permissions, capturing the context required to prepare data for advanced analytics.
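In practice this often takes the form of a quality gate in front of the analytics: compute basic statistics on an incoming series and refuse to run or train models when thresholds are breached. The thresholds and checks below are illustrative assumptions.

```python
# A minimal pre-flight gate: compute simple quality statistics for a
# rate series and block model training if thresholds are breached.
def quality_stats(series: list[float | None]) -> dict[str, float]:
    observed = [x for x in series if x is not None]
    return {
        "missing_ratio": 1 - len(observed) / len(series),
        # Count rates outside a plausible band (illustrative bounds).
        "out_of_range": sum(1 for x in observed if not -1.0 < x < 25.0),
    }

def fit_model_if_clean(series: list[float | None]) -> None:
    stats = quality_stats(series)
    if stats["missing_ratio"] > 0.05 or stats["out_of_range"] > 0:
        raise ValueError(f"data failed quality gate: {stats}")
    ...  # hand off to the analytics or ML pipeline

# Invented series: one gap in a hundred observations passes the gate.
fit_model_if_clean([0.05, 0.052, None, 0.051] + [0.05] * 96)
```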

Actionable insight depends on managing metadata with skill and creating data models aligned to the entire company. The smart deployment of analytics, visualisations, AI models and other forms of advanced automation is how an organisation uncovers such insights.

As the end of LIBOR draws closer, the treasuries that excel and win will be those that source, digest and process data rapidly and effectively, and use the right contextual information. Organisations that fail to readjust will be left behind. Staying ahead demands that treasuries reassess the financial data they use for resilience models in spreadsheets and automation projects. To fully understand the impact of LIBOR reform, and to focus on optimising their operations beyond it, they must take data management to the next level. Their ability to manage data so they can steer through this significant change without disruption will have a major bearing on how successful corporate treasuries are, well into the future.
