The leading knowledge platform for the financial technology industry

A-Team Insight Blogs

LIBOR’s End Should be a New Beginning for Corporate Treasuries’ Data Management

By Neil Sandle, Head of Product Management, Alveo.

For many treasuries, LIBOR (the London Interbank Offered Rate) is one of their most critical benchmarks. Together with the exchange rates of major currencies, it is an essential piece of data, underpinning contracts worth trillions of dollars.

The long-standing centrality of LIBOR is why well-publicised global moves to end its use are giving corporate treasurers and their teams such major data dilemmas. Termination of LIBOR will have far-reaching consequences for everything from swaps and other derivative contracts to the functioning of supply chain and trade finance and the market in business and personal loans. Accounting, tax, risk-management, funding, liquidity and legal liabilities are all affected by the new reference rates that are set to replace LIBOR.

The staged transition from the end of 2021 to new benchmarks, such as SOFR for the US dollar, SONIA for sterling and €STR for the euro, is proving difficult, however. LIBOR has been part of valuation, pricing and risk management processes for decades, and its replacements must establish credibility. LIBOR is forward-looking, whereas SOFR, for example, is backward-looking (and based on a larger number of underlying transactions). The timing and term structure of the new reference rates can also differ, which causes complications in finance. Building credibility with corporate treasurers is tough, and it comes at a time when all companies are struggling to readjust after the pandemic.

Companies around the globe are preparing for this major change on all fronts, using new technology to streamline their analysis and optimise their responses. Many firms are, for example, applying natural language processing (NLP) to scan large numbers of documents and identify legal clauses and obligations relating to LIBOR. NLP picks out key words and phrases that should alert companies to LIBOR references, indicating clauses that need further examination.
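As an illustration, the scan can start as simply as flagging clauses that mention benchmark-related terms for human review. The patterns and contract text below are hypothetical, and a production system would use a full NLP pipeline rather than this minimal keyword sketch:

```python
import re

# Hypothetical keyword patterns; a real system would use trained NLP models.
LIBOR_PATTERNS = [
    r"\bLIBOR\b",
    r"\bLondon\s+Interbank\s+Offered\s+Rate\b",
    r"\bfallback\s+rate\b",
]

def flag_libor_clauses(document: str) -> list:
    """Return (line_number, line) pairs that mention LIBOR-related terms."""
    pattern = re.compile("|".join(LIBOR_PATTERNS), re.IGNORECASE)
    hits = []
    for lineno, line in enumerate(document.splitlines(), start=1):
        if pattern.search(line):
            hits.append((lineno, line.strip()))
    return hits

# Illustrative contract text, not a real agreement.
contract = """Interest accrues at 3-month LIBOR plus 1.25%.
Payments are due quarterly.
If LIBOR is discontinued, the fallback rate applies."""
print(flag_libor_clauses(contract))  # flags lines 1 and 3 for review
```

Flagged clauses would then be routed to legal teams for the detailed assessment the article describes.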

The ‘to do’ list extends much further, however. Companies must work through the full impact of new benchmarks on their operations and cashflow management, assessing which contracts require reassessment and how they can reconfigure their financial exposure. Some jurisdictions, such as the State of New York, have recently introduced legal backstops to manage the transition. Risk management is critical but depends on analysis of historical data and contextual information such as stock market performance. This is far more problematic when a new benchmark renders historical data less relevant, making risk scenarios difficult to create with confidence. Moreover, the transition from one regime of reference rates to another can cause spikes in the historical data.
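On that last point, a benchmark switch can show up as an abrupt jump in a daily rate series, which even a simple day-on-day check can flag for review before the history feeds a risk model. The threshold and series below are illustrative assumptions, not real market data:

```python
# Hypothetical sketch: flag suspect jumps in a daily rate series around a
# benchmark switch, so analysts can review them before they reach risk models.
def flag_spikes(rates, threshold_bp=25.0):
    """Return indices where the day-on-day move exceeds threshold_bp basis points."""
    spikes = []
    for i in range(1, len(rates)):
        move_bp = abs(rates[i] - rates[i - 1]) * 100  # percent -> basis points
        if move_bp > threshold_bp:
            spikes.append(i)
    return spikes

# Illustrative series in percent; the jump at index 3 mimics a regime change.
series = [0.12, 0.13, 0.12, 0.55, 0.54, 0.55]
print(flag_spikes(series))  # -> [3]
```

Whether a flagged jump is a genuine regime change or a data error is exactly the kind of judgment that requires the contextual information the article mentions.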

In the financial sector, firms need to fully digest the new interest rate benchmarks in their market data management. They must reconfigure any kind of derived data, interest rate curves or other instruments that hinge on those benchmarks and service any downstream applications, business users, analytical models and reporting processes using that information. Most of all, they need to pinpoint precisely what they need to change internally as they prepare to move away from LIBOR.
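As a rough sketch of that reconfiguration step, derived-data definitions that reference LIBOR can be remapped to each currency's successor benchmark before curves are rebuilt. The curve structure and mapping below are simplified assumptions, not a real market-data schema:

```python
# Assumed successor benchmarks per currency, for illustration only.
FALLBACK_BENCHMARK = {"USD": "SOFR", "GBP": "SONIA", "EUR": "ESTR"}

def remap_curve_definition(curve):
    """Return a copy of a curve definition with LIBOR swapped for its successor."""
    if curve["index"] == "LIBOR":
        successor = FALLBACK_BENCHMARK[curve["currency"]]
        return {**curve, "index": successor}
    return curve  # non-LIBOR definitions pass through unchanged

usd_curve = {"name": "USD-3M", "currency": "USD", "index": "LIBOR"}
print(remap_curve_definition(usd_curve)["index"])  # -> SOFR
```

In practice the remapping also has to account for spread adjustments and term-structure differences between the old and new rates, which is where much of the real complexity lies.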

The effectiveness of all these actions depends almost entirely on data quality. Without data quality intelligence, organisations are in danger of adjusting to life after LIBOR using inaccurate or low-quality information.

Firms need complete visibility of their data, its management and its lineage, so they can see immediately where it has come from and are able to assess its quality instantaneously. They must be able to drill down into the data in each process and see where quality is insufficient and remedial action is required. Poor data undermines confidence in risk scenarios to the point they are no longer trusted at all.

Confidence in a smooth transition is essential. When end-users of the data, auditors or regulatory bodies ask questions or make significant inquiries as part of their supervisory role, treasuries need to provide answers that are not just accurate but are credible and convincing.

The problem for many treasuries is that LIBOR reform has come just as they are struggling with economic recovery from a pandemic, and with ever-growing numbers of data sets from a more digitised world. This places a high premium on data integration, which needs to be fully optimised so organisations can make the right business decisions based on hard evidence. Data is, after all, worthless if it is not available for use or has unclear provenance. Data quality and the tracking of contextual information are key, as are the capability, speed and reliability of the integration and of the overall preparation of data for end-users, whether to create valuation and risk models, reports or treasury applications.

As the corporate world moves away from LIBOR rates, data services will need to bring a more detailed and more transparent understanding of data lineage – where does data come from, what are the ultimate sources and what quality checks took place on the way. The more that companies understand data flows and data transformation events, the more they can augment and enrich the data sets that treasuries will rely on for everything from day-to-day operations to analytics and risk scenarios that inform board-level strategy.
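One way to picture such lineage is a record that travels with each data point, carrying its ultimate source and every quality check applied on the way. The structure below is a hypothetical sketch, not any particular vendor's data model:

```python
from dataclasses import dataclass, field

@dataclass
class LineageRecord:
    """Hypothetical lineage record: a value plus its provenance trail."""
    value: float
    source: str                                  # ultimate upstream source
    checks: list = field(default_factory=list)   # (check_name, passed) pairs

    def record_check(self, name: str, passed: bool):
        self.checks.append((name, passed))
        return self

    def passed_all(self) -> bool:
        return all(ok for _, ok in self.checks)

# Illustrative values; "vendor-feed-A" is a made-up source name.
rate = LineageRecord(value=0.0531, source="vendor-feed-A")
rate.record_check("staleness", True).record_check("tolerance", True)
print(rate.passed_all())  # -> True
```

A drill-down of the kind the article describes then amounts to inspecting `source` and `checks` for any value feeding a report or risk scenario.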

The traditional information management functions of data sourcing, mastering and quality management have to be rethought, as AI and machine learning models deployed to supplement treasury insight can go wildly awry if fed with inaccurate data. This makes it all the more necessary to track metadata, quality statistics and permissions in order to understand the context required to prepare data for advanced analytics.
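A sketch of what such quality statistics might mean in practice: simple completeness and range checks computed before a data set is allowed into an analytics or ML model. The 0-25% tolerance band is an assumed gate chosen for illustration:

```python
def quality_stats(values):
    """Share of missing points and share outside an assumed 0-25% rate band."""
    total = len(values)
    missing = sum(1 for v in values if v is None)
    present = [v for v in values if v is not None]
    out_of_range = sum(1 for v in present if not (0.0 <= v <= 25.0))
    return {
        "completeness": 1 - missing / total,
        "out_of_range_share": out_of_range / total,
    }

# Illustrative input: one gap and one implausible 99% rate.
print(quality_stats([0.05, None, 0.06, 99.0]))
# -> {'completeness': 0.75, 'out_of_range_share': 0.25}
```

Gating model inputs on statistics like these is one concrete way to keep inaccurate data from reaching the AI models the article warns about.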

Actionable insight depends on managing metadata with skill and on creating data models aligned with the entire company. The smart deployment of analytics, visualisations, AI models and different forms of advanced automation is how an organisation uncovers such insights.

As the end of LIBOR draws closer, the treasuries that excel will be those that source, digest and process data rapidly and effectively, and use the right contextual information. Organisations that fail to readjust will be left behind. Staying ahead demands that treasuries reassess the financial data they use for resilience models in spreadsheets and automation projects. To fully understand the impact of LIBOR reform, and to focus on optimising their operations beyond it, they must take data management to the next level. Their ability to manage data so they can steer through this significant change without disruption will have a major bearing on how successful corporate treasuries are, well into the future.
