About a-team Marketing Services
The knowledge platform for the financial technology industry

A-Team Insight Blogs

Why Markets Must Give Bonds a Data Licence to Thrill!

By Neill Vanlint, Global Head of Sales at GoldenSource. 

Nothing causes more heated debate than a new bond. As the spectre of COVID-19 continues to shape the global response of the world’s central banks, today marks the U.S. Treasury’s latest attempt to weather the economic storm by issuing $54 billion worth of debt in the form of 20-year bonds. Depending on market interest, this new bond could become a regular fixture in the U.S. Treasury’s issuance strategy. But for many longer-term investors, underpinning the excitement of today’s issuance is a very specific data problem that has been brewing for some time across the global Fixed Income markets.

Trading Fixed Income, like any asset class, is only as good as the data that underpins the instruments. And every asset type requires a very different set of attributes. It stands to reason, then, that whenever a bond is issued, the financial institution trading it needs an accurate issue date and maturity date. Or, if it is an interest-paying bond, the same institution needs the coupon schedules to be present. The trouble is that too many financial institutions have all this information scattered across different systems with no mandatory checks being carried out. As a result, the data being used to make trading decisions is very often incorrect. As a case in point, one system could define a perpetual bond as having a maturity date, despite the fact that a perpetual bond, by definition, never matures.
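By way of illustration, here is a minimal sketch of what such mandatory checks might look like, in Python. The record layout, field names, instrument types and rule set are assumptions for illustration only, not any particular vendor’s or system’s schema.

```python
from datetime import date
from typing import List, Optional

# Illustrative bond record; field names are assumptions, not a vendor schema.
class Bond:
    def __init__(self, isin: str, bond_type: str,
                 issue_date: Optional[date] = None,
                 maturity_date: Optional[date] = None,
                 coupon_schedule: Optional[List[date]] = None):
        self.isin = isin
        self.bond_type = bond_type          # e.g. "fixed", "perpetual", "zero"
        self.issue_date = issue_date
        self.maturity_date = maturity_date
        self.coupon_schedule = coupon_schedule or []

def validate(bond: Bond) -> List[str]:
    """Return the data-quality errors found on a single bond record."""
    errors = []
    if bond.issue_date is None:
        errors.append("missing issue date")
    if bond.bond_type == "perpetual":
        # A perpetual bond, by definition, never matures.
        if bond.maturity_date is not None:
            errors.append("perpetual bond must not carry a maturity date")
    elif bond.maturity_date is None:
        errors.append("missing maturity date")
    if bond.bond_type == "fixed" and not bond.coupon_schedule:
        errors.append("interest-paying bond missing its coupon schedule")
    return errors

# The exact error described above: a perpetual bond carrying a maturity date.
perp = Bond("XS0000000000", "perpetual",          # dummy ISIN, hypothetical
            issue_date=date(2020, 5, 20),
            maturity_date=date(2040, 5, 20))
print(validate(perp))  # ['perpetual bond must not carry a maturity date']
```

Run as a mandatory gate on every inbound record, even a rule set this simple would stop the perpetual-with-a-maturity-date error from ever reaching a trading system.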

While this may seem like an extreme error, it is typical of the kind of inconsistency that currently resides in the data systems of firms trading bonds. The complexity of the information sourced from the various market data vendors often means the data housed across different systems diverges significantly. After all, it is very rare that multiple systems within a bank all validate data in the same way. Not only does this lead to operational inconsistencies, but it can also lead to inaccurate risk calculations, particularly if a bank is making trading decisions based on a maturity date that does not actually exist.
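One way to surface these inconsistencies is to compare the same instrument’s attributes as they appear in each system and flag any field on which the systems disagree. A minimal sketch follows, assuming each system exposes its records as dictionaries keyed by a hypothetical ISIN; the system names and values are invented for illustration.

```python
from typing import Dict, List, Tuple

# Hypothetical snapshots of the same bond held in three internal systems.
systems: Dict[str, Dict[str, dict]] = {
    "risk_engine":  {"US912810XX00": {"maturity_date": "2040-05-20", "coupon": 1.125}},
    "trading_book": {"US912810XX00": {"maturity_date": "2040-05-20", "coupon": 1.125}},
    "back_office":  {"US912810XX00": {"maturity_date": None,         "coupon": 1.125}},
}

def find_discrepancies(systems: Dict[str, Dict[str, dict]]) -> List[Tuple]:
    """Flag every (isin, field) on which the systems disagree."""
    issues = []
    isins = set().union(*(records.keys() for records in systems.values()))
    for isin in sorted(isins):
        fields = set().union(*(records.get(isin, {}).keys()
                               for records in systems.values()))
        for field in sorted(fields):
            values = {name: records.get(isin, {}).get(field)
                      for name, records in systems.items()}
            if len(set(values.values())) > 1:  # at least two systems disagree
                issues.append((isin, field, values))
    return issues

# Prints the one disagreement: back_office holds no maturity date at all.
for isin, field, values in find_discrepancies(systems):
    print(f"{isin} {field}: {values}")
```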

Due to these risks, regulators are understandably pushing firms to carry out their own due diligence on their vendors through independent verification across data sources. This includes collating, ranking and defining bond instrument types in a central place where records can be audited easily, rather than across numerous systems – a consolidation step sketched below. Only by applying rules to detect issues within Fixed Income data before they occur can financial institutions harbour any hope of demonstrating responsibility in their trading decisions to regulators and investors.

The launch of any new bond always creates excitement about enhanced liquidity entering the market. Today’s 20-year U.S. Treasury bond is no exception but, unless market participants overcome the data errors across their existing systems, their wider Fixed Income trading strategies will be the financial equivalent of George Lazenby: unable to live up to expectations.
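To close with a concrete illustration, here is a minimal sketch of that collate-and-rank consolidation: per field, take the value from the highest-ranked source that supplies one, and keep a provenance map so the resulting golden record can be audited in one place. The vendor names, field names and ranking are hypothetical assumptions, not GoldenSource functionality.

```python
from typing import Dict, Tuple

# Firm-defined ranking of data sources, best first. Names are hypothetical.
SOURCE_RANKING = ["vendor_a", "vendor_b", "internal_feed"]

def consolidate(records: Dict[str, dict]) -> Tuple[dict, dict]:
    """Build a golden record: per field, take the highest-ranked non-null
    value, and keep a provenance map so the result is centrally auditable."""
    golden, provenance = {}, {}
    fields = set().union(*(r.keys() for r in records.values()))
    for field in sorted(fields):
        for source in SOURCE_RANKING:
            value = records.get(source, {}).get(field)
            if value is not None:
                golden[field] = value
                provenance[field] = source  # audit trail: who supplied it
                break
    return golden, provenance

records = {
    "vendor_a":      {"maturity_date": None,         "coupon": 1.125},
    "vendor_b":      {"maturity_date": "2040-05-20", "coupon": 1.125},
    "internal_feed": {"maturity_date": "2040-05-20", "coupon": None},
}
golden, provenance = consolidate(records)
print(golden)      # {'coupon': 1.125, 'maturity_date': '2040-05-20'}
print(provenance)  # {'coupon': 'vendor_a', 'maturity_date': 'vendor_b'}
```

Keeping the provenance map alongside the golden record is what makes the central, easily audited view described above possible.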
