The knowledge platform for the financial technology industry

A-Team Insight Blogs

Why Markets Must Give Bonds a Data Licence to Thrill!


By Neill Vanlint, Global Head of Sales at GoldenSource. 

Nothing causes more heated debate than a new bond. As the spectre of the world’s central banks’ global response to COVID-19 lingers, today marks the U.S. Treasury’s latest attempt to weather the economic storm: the issuance of $54 billion of debt in the form of 20-year bonds. Depending on market interest, this new bond could become a regular fixture in the U.S. Treasury’s issuance strategy. But for many longer-term investors, underpinning the excitement of today’s issuance is a very specific data problem that has been brewing for some time across the global Fixed Income markets.

Trading Fixed Income, like any asset class, is only as good as the data that underpins the instruments, and each asset type requires a very different set of attributes. It stands to reason that whenever a bond is issued, the financial institution trading it needs an accurate issue date and maturity date and, for an interest-paying bond, the coupon schedules as well. The trouble is that too many financial institutions have this information scattered across different systems with no mandatory checks in place. As a result, the data used to make trading decisions is very often incorrect. As a case in point, one system could define a perpetual bond as having a maturity date, despite the fact that a perpetual bond, by definition, never matures.
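A minimal sketch of what such a mandatory check might look like, using a hypothetical `Bond` record and illustrative field names (not any vendor's or GoldenSource's actual data model):

```python
from dataclasses import dataclass
from datetime import date
from typing import Optional

@dataclass
class Bond:
    isin: str
    bond_type: str                     # e.g. "fixed", "perpetual"
    issue_date: Optional[date]
    maturity_date: Optional[date]
    coupon_schedule: Optional[list]    # coupon dates, if interest-paying

def validate(bond: Bond) -> list:
    """Return a list of human-readable data errors (empty if the record is clean)."""
    errors = []
    if bond.issue_date is None:
        errors.append("missing issue date")
    if bond.bond_type == "perpetual":
        # A perpetual bond never matures, so a populated maturity date is a data error.
        if bond.maturity_date is not None:
            errors.append("perpetual bond must not carry a maturity date")
    elif bond.maturity_date is None:
        errors.append("missing maturity date")
    return errors

# A perpetual bond wrongly loaded with a maturity date is flagged:
bad = Bond("XS0000000001", "perpetual", date(2020, 5, 20), date(2040, 5, 20), None)
print(validate(bad))  # → ['perpetual bond must not carry a maturity date']
```

Run before records reach downstream systems, a rule like this turns the inconsistency described above into an exception to be resolved rather than a number fed into trading decisions.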

While this may seem like an extreme error, it is typical of the inconsistencies that currently reside in the data systems of firms trading bonds. The complexity of the information sourced from the various market data vendors often means the data housed across different systems diverges; after all, it is very rare that multiple systems within a bank all validate data in the same way. Not only does this lead to operational inconsistencies, it can also lead to inaccurate risk calculations, particularly if a bank is making trading decisions based on a maturity date that does not actually exist.

Due to these risks, regulators are understandably pushing firms to carry out their own due diligence on their vendors through independent verification across data sources. This means collating, ranking and defining bond instrument types in a central place where records can be audited easily, rather than across numerous systems. Only by applying rules that detect issues within Fixed Income data before they occur can financial institutions hope to demonstrate to regulators and investors that their trading decisions are responsible.

The launch of any new bond always creates excitement about enhanced liquidity entering the market. Today’s 20-year U.S. Treasury bond is no exception, but unless market participants overcome the data errors across their existing systems, their wider Fixed Income trading strategies will be the financial equivalent of George Lazenby: unable to live up to expectations.
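The collate-rank-verify approach described above can be sketched as a simple golden-copy consolidation. The vendor names, field names and preference order here are all hypothetical, purely to illustrate the mechanism:

```python
# Assumed preference order across vendor feeds (highest-ranked first).
SOURCE_RANK = ["vendor_a", "vendor_b"]

def consolidate(records: dict) -> tuple:
    """Build a golden copy from ranked vendor records and report disagreements.

    records maps source name -> dict of field values for one instrument.
    Returns (golden_record, exceptions), where each exception is a
    (field, {source: value}) pair for fields on which the sources disagree.
    """
    golden, exceptions = {}, []
    fields = {f for rec in records.values() for f in rec}
    for field in sorted(fields):
        values = {src: rec[field] for src, rec in records.items() if field in rec}
        if len(set(values.values())) > 1:
            exceptions.append((field, values))   # route to an audit queue
        for src in SOURCE_RANK:                  # take the highest-ranked value
            if src in values:
                golden[field] = values[src]
                break
    return golden, exceptions

# Two feeds disagree on the maturity date of the same instrument:
feeds = {
    "vendor_a": {"maturity_date": "2040-05-20", "coupon": 1.125},
    "vendor_b": {"maturity_date": "2040-05-21", "coupon": 1.125},
}
golden, exceptions = consolidate(feeds)
```

The golden record takes vendor_a's maturity date per the ranking, while the disagreement is captured as an auditable exception in one central place, rather than leaving each downstream system with its own unverified copy.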

