A-Team Insight Blogs

Why Markets Must Give Bonds a Data Licence to Thrill!

By Neill Vanlint, Global Head of Sales at GoldenSource. 

Nothing causes more heated debate than a new bond. As the spectre of the world’s central banks’ global response to COVID-19 continues, today marks the U.S. Treasury’s latest attempt to weather the economic storm by issuing $54 billion worth of debt in the form of 20-year bonds. Depending on market interest, this new bond could become a regular fixture in the U.S. Treasury’s debt issuance strategy. But for many longer-term investors, underpinning the excitement of today’s issuance is a very specific data problem that has been brewing for some time across the global Fixed Income markets.

Trading Fixed Income, like any asset class, is only as good as the data that underpins the instruments. And for every asset type, there is a very different set of attributes required. It would therefore stand to reason that whenever a bond is issued, the financial institution trading it would always need accurate issue and maturity dates. Or, if it were an interest-paying bond, the same institutions would require coupon schedules to be present. The trouble is that too many financial institutions have all this information scattered across different systems with no mandatory checks being carried out. As a result, the data being used to make trading decisions is very often incorrect. As a case in point, one system could define a perpetual bond as having a maturity date, despite the fact that a perpetual bond, by definition, never matures and so should carry no maturity date.
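To make the idea of mandatory checks concrete, here is a minimal sketch, in Python, of the kind of validation rule a firm might apply to a single bond record before it feeds a trading or risk system. The record layout, field names and identifier are purely illustrative assumptions, not any vendor’s actual schema.

```python
from datetime import date

# Hypothetical, simplified bond record as it might arrive from a data vendor.
# Field names and the identifier are illustrative, not a real vendor schema.
bond = {
    "isin": "US912810TD00",             # made-up identifier for illustration
    "instrument_type": "fixed_coupon",  # e.g. fixed_coupon, floating, perpetual
    "issue_date": date(2020, 5, 20),
    "maturity_date": date(2040, 5, 15),
    "coupon_schedule": ["2020-11-15", "2021-05-15"],  # truncated for brevity
}

def validate_bond(record):
    """Return a list of data-quality issues found in a single bond record."""
    issues = []

    # Every bond needs an issue date.
    if record.get("issue_date") is None:
        issues.append("missing issue_date")

    # A perpetual bond, by definition, must not carry a maturity date.
    if record.get("instrument_type") == "perpetual":
        if record.get("maturity_date") is not None:
            issues.append("perpetual bond has a maturity_date")
    else:
        # All other bonds need a maturity date that falls after issuance.
        if record.get("maturity_date") is None:
            issues.append("missing maturity_date")
        elif record.get("issue_date") and record["maturity_date"] <= record["issue_date"]:
            issues.append("maturity_date is not after issue_date")

    # Interest-paying bonds should come with a coupon schedule.
    if record.get("instrument_type") in ("fixed_coupon", "floating") and not record.get("coupon_schedule"):
        issues.append("interest-paying bond has no coupon_schedule")

    return issues

print(validate_bond(bond))  # [] if the record passes every check
```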

While this may seem like an extreme error, it is typical of the inconsistencies that currently reside in the data systems of firms trading bonds. The complexity of the information sourced from the various market data vendors often means that the data housed across different systems differs markedly. After all, it is very rare that multiple systems within a bank all validate data in the same way. Not only does this lead to operational inconsistencies, it could also lead to inaccurate risk calculations, particularly if a bank is making trading decisions based on a maturity date that does not actually exist.
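The same point can be illustrated for cross-system consistency. The sketch below, again using purely hypothetical field names and values, compares two versions of the same instrument and flags the attributes on which they disagree; in practice this kind of reconciliation would run across far more fields and systems.

```python
# Hypothetical records for the same bond as held in two internal systems
# (or sourced from two vendors); field names and values are illustrative only.
system_a = {"isin": "US912810TD00", "instrument_type": "perpetual", "maturity_date": None}
system_b = {"isin": "US912810TD00", "instrument_type": "perpetual", "maturity_date": "2040-05-15"}

def reconcile(record_a, record_b, fields):
    """Compare two versions of the same instrument and report mismatching attributes."""
    breaks = {}
    for field in fields:
        if record_a.get(field) != record_b.get(field):
            breaks[field] = (record_a.get(field), record_b.get(field))
    return breaks

mismatches = reconcile(system_a, system_b, ["instrument_type", "maturity_date"])
for field, (a_value, b_value) in mismatches.items():
    print(f"break on {field}: system A = {a_value!r}, system B = {b_value!r}")
# break on maturity_date: system A = None, system B = '2040-05-15'
```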

Due to these risks, regulators are understandably pushing the industry to carry out its own due diligence on vendors through independent verification across data sources. This includes collating, ranking and defining those bond instrument types in a central place where records can be audited easily, rather than across numerous systems. Only by applying rules to detect issues within Fixed Income data before they occur can financial institutions harbour any hope of demonstrating responsibility in their trading decisions to regulators and investors. The launch of any new bond always creates excitement about enhanced liquidity entering the market. Today’s 20-year U.S. Treasury bond is no exception, but, unless market participants overcome data errors across their existing systems, their wider Fixed Income trading strategies will be the financial equivalent of George Lazenby: unable to live up to expectations.
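As a closing illustration of what collating and ranking sources in a central place might look like, here is a hedged sketch of a simple “golden record” build: each attribute is taken from the highest-ranked source that supplies a value, and the lineage of every field is retained so the record can be audited. The vendor names, ranking and values are invented for the example, not a recommendation of any particular data hierarchy.

```python
# Illustrative source ranking: highest-priority source first.
SOURCE_RANKING = ["vendor_a", "vendor_b", "internal_desk"]

# Hypothetical versions of the same bond from three sources.
vendor_records = {
    "vendor_a": {"maturity_date": "2040-05-15", "coupon": None},
    "vendor_b": {"maturity_date": "2040-05-15", "coupon": 1.125},
    "internal_desk": {"maturity_date": None, "coupon": 1.125},
}

def build_golden_record(records, ranking, fields):
    """For each field, take the value from the highest-ranked source that supplies one,
    keeping an audit trail of which source each value came from."""
    golden, lineage = {}, {}
    for field in fields:
        for source in ranking:
            value = records.get(source, {}).get(field)
            if value is not None:
                golden[field], lineage[field] = value, source
                break
        else:
            golden[field], lineage[field] = None, None  # no source supplied a value
    return golden, lineage

golden, lineage = build_golden_record(vendor_records, SOURCE_RANKING, ["maturity_date", "coupon"])
print(golden)   # {'maturity_date': '2040-05-15', 'coupon': 1.125}
print(lineage)  # {'maturity_date': 'vendor_a', 'coupon': 'vendor_b'}
```

Retaining the per-field lineage is what makes the central record auditable: a reviewer can see not just the value used, but which source supplied it and where the ranking was overridden by a gap.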
