
Why Markets Must Give Bonds a Data Licence to Thrill!

By Neill Vanlint, Global Head of Sales at GoldenSource. 

Nothing causes more heated debate than a new bond. As the world’s central banks continue their global response to COVID-19, today marks the U.S. Treasury’s latest attempt to weather the economic storm by issuing $54 billion worth of debt in the form of 20-year bonds. Depending on market interest, this new bond could become a regular fixture in the U.S. Treasury’s debt issuance strategy. But for many longer-term investors, beneath the excitement of today’s issuance lies a very specific data problem that has been brewing for some time across the global Fixed Income markets.

Trading Fixed Income, like any asset class, is only as good as the data that underpins the instruments, and every asset type requires a different set of attributes. It stands to reason, then, that whenever a bond is issued, the financial institution trading it needs an accurate issue date and maturity date, and, if it is an interest-paying bond, the coupon schedules to go with them. The trouble is that too many financial institutions have this information scattered across different systems with no mandatory checks being carried out. As a result, the data being used to make trading decisions is very often incorrect. As a case in point, one system could define a perpetual bond as having a maturity date, despite the fact that a perpetual bond, by definition, never matures and so should never carry one.
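
To make the point concrete, the sketch below shows what such mandatory checks might look like in practice: a handful of rules over a simplified bond record that flag a missing issue date, a missing coupon schedule on an interest-paying bond, or a maturity date on a perpetual bond. The record structure, field names and rules are illustrative assumptions, not a description of any particular vendor’s or institution’s system.

```python
from dataclasses import dataclass
from datetime import date
from typing import Optional

@dataclass
class BondRecord:
    """Minimal bond reference data record (illustrative fields only)."""
    isin: str
    issue_date: Optional[date]
    maturity_date: Optional[date]
    is_perpetual: bool
    pays_interest: bool
    coupon_schedule: Optional[list] = None

def validate_bond(bond: BondRecord) -> list:
    """Return a list of rule violations; an empty list means the record passes."""
    errors = []
    if bond.issue_date is None:
        errors.append("missing issue date")
    if bond.is_perpetual:
        # A perpetual bond, by definition, should never carry a maturity date.
        if bond.maturity_date is not None:
            errors.append("perpetual bond must not have a maturity date")
    else:
        if bond.maturity_date is None:
            errors.append("non-perpetual bond is missing a maturity date")
        elif bond.issue_date is not None and bond.maturity_date <= bond.issue_date:
            errors.append("maturity date must fall after the issue date")
    if bond.pays_interest and not bond.coupon_schedule:
        errors.append("interest-paying bond is missing a coupon schedule")
    return errors

# A perpetual bond wrongly loaded with a maturity date is flagged before it
# can feed a trading or risk system.
record = BondRecord(
    isin="XS0000000000",            # placeholder identifier
    issue_date=date(2020, 5, 20),
    maturity_date=date(2040, 5, 20),
    is_perpetual=True,
    pays_interest=True,
    coupon_schedule=[date(2020, 11, 20)],
)
print(validate_bond(record))  # ['perpetual bond must not have a maturity date']
```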

While this may seem like an extreme error, it is typical of the kind of inconsistency that currently resides in the data systems of firms trading bonds. The complexity of the information sourced from the various market data vendors often means the data housed across different systems diverges widely. After all, it is very rare that multiple systems within a bank all validate data in the same way. Not only does this lead to operational inconsistencies, it can also lead to inaccurate risk calculations, particularly if a bank is making trading decisions based on a maturity date that does not actually exist.

Because of these risks, regulators are understandably pushing the industry to carry out its own due diligence on data vendors through independent verification across data sources. This means collating, ranking and defining bond instrument types in a central place where records can be audited easily, rather than across numerous systems. Only by applying rules that detect issues in Fixed Income data before they cause problems can financial institutions hope to demonstrate responsible trading decisions to regulators and investors. The launch of any new bond always creates excitement about fresh liquidity entering the market. Today’s 20-year U.S. Treasury bond is no exception, but unless market participants overcome the data errors lurking in their existing systems, their wider Fixed Income trading strategies will end up as the financial equivalent of George Lazenby: unable to live up to expectations.
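
As a rough illustration of what independent verification across data sources could look like, the sketch below cross-checks one instrument’s attributes across several vendor feeds, flags any disagreement as an exception for audit, and uses a vendor ranking to choose a central “golden” value. The vendor names, fields and ranking are hypothetical; this is a toy example, not a description of any production data management platform.

```python
from datetime import date

# Hypothetical vendor feeds for the same instrument, keyed by field name.
# In practice these would come from a firm's market data vendor integrations.
vendor_records = {
    "VendorA": {"maturity_date": date(2040, 5, 20), "coupon": 1.25},
    "VendorB": {"maturity_date": date(2040, 5, 20), "coupon": 1.25},
    "VendorC": {"maturity_date": None,              "coupon": 1.25},
}

# Ranking used to pick a golden value when vendors disagree or omit a field.
vendor_ranking = ["VendorA", "VendorB", "VendorC"]

def build_golden_record(records, ranking):
    """Cross-check each field across vendors and keep an audit trail of conflicts."""
    fields = {f for rec in records.values() for f in rec}
    golden, exceptions = {}, []
    for field in sorted(fields):
        values = {vendor: rec.get(field) for vendor, rec in records.items()}
        if len(set(values.values())) > 1:
            # Vendors disagree: record an exception for review rather than
            # letting a silent inconsistency flow into trading or risk systems.
            exceptions.append({"field": field, "values": values})
        # Golden value taken from the highest-ranked vendor that supplies the field.
        for vendor in ranking:
            if values.get(vendor) is not None:
                golden[field] = values[vendor]
                break
    return golden, exceptions

golden, exceptions = build_golden_record(vendor_records, vendor_ranking)
print(golden)      # {'coupon': 1.25, 'maturity_date': datetime.date(2040, 5, 20)}
print(exceptions)  # one conflict on 'maturity_date' flagged for audit
```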
