
Why Markets Must Give Bonds a Data Licence to Thrill!


By Neill Vanlint, Global Head of Sales at GoldenSource. 

Nothing causes more heated debate than a new bond. As the spectre of the world's central banks' global response to COVID-19 continues, today marks the U.S. Treasury's latest attempt to weather the economic storm by issuing $54 billion of debt in the form of 20-year bonds. Depending on market interest, the new bond could become a regular fixture of the U.S. Treasury's debt issuance strategy. But for many longer-term investors, underpinning the excitement of today's issuance is a very specific data problem that has been brewing for some time across the global Fixed Income markets.

Trading Fixed Income, like any other asset class, is only as good as the data that underpins the instruments, and every asset type requires a very different set of attributes. It stands to reason, then, that whenever a bond is issued, the financial institution trading it always needs an accurate issue date and maturity date, and, for an interest-paying bond, the payment schedules as well. The trouble is that too many financial institutions have this information scattered across different systems with no mandatory checks being carried out, so the data used to make trading decisions is very often incorrect. As a case in point, one system might define a perpetual bond as having a maturity date, even though a perpetual bond, by definition, never matures and therefore has no maturity date.
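To make those mandatory checks concrete, here is a minimal sketch in Python of the kind of validation rules described above. The record fields, rule wording and the dummy ISIN are illustrative assumptions, not any particular vendor's schema:

```python
from dataclasses import dataclass, field
from datetime import date

@dataclass
class BondRecord:
    """Illustrative bond reference data record; field names are assumptions."""
    isin: str
    issue_date: date | None
    maturity_date: date | None
    is_perpetual: bool
    pays_interest: bool
    coupon_schedule: list[date] = field(default_factory=list)

def validate(bond: BondRecord) -> list[str]:
    """Return the data-quality errors found on a single bond record."""
    errors = []
    if bond.issue_date is None:
        errors.append("missing issue date")
    # A perpetual bond, by definition, must not carry a maturity date.
    if bond.is_perpetual and bond.maturity_date is not None:
        errors.append("perpetual bond has a maturity date")
    # A dated bond, conversely, must have one.
    if not bond.is_perpetual and bond.maturity_date is None:
        errors.append("dated bond is missing a maturity date")
    # An interest-paying bond needs its payment schedule to be present.
    if bond.pays_interest and not bond.coupon_schedule:
        errors.append("interest-paying bond has no coupon schedule")
    return errors

# Example: a mislabelled perpetual, exactly the error described above.
bad = BondRecord("XS0000000000", date(2020, 5, 20), date(2040, 5, 20),
                 is_perpetual=True, pays_interest=True)
print(validate(bad))
# ['perpetual bond has a maturity date', 'interest-paying bond has no coupon schedule']
```

Running such rules at the point of data entry, rather than after a trade, is what makes the checks "mandatory" in the sense used above.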

While this may seem like an extreme error, it is typical of the inconsistencies that currently reside in the data systems of firms trading bonds. The complexity of the information sourced from various market data vendors often means the data housed across systems differs considerably; after all, it is very rare that multiple systems within a bank all validate data in the same way. Not only does this lead to operational inconsistencies, it can also produce inaccurate risk calculations, particularly if a bank is making trading decisions based on a maturity date that does not actually exist.
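That kind of cross-system drift can be caught mechanically. The sketch below, again with assumed field and system names, flags any instrument whose key attributes disagree between the systems that hold it:

```python
from collections import defaultdict

# Key attributes to reconcile; chosen for illustration only.
FIELDS = ("issue_date", "maturity_date", "coupon_rate")

def find_mismatches(records_by_system: dict[str, dict[str, dict]]) -> dict:
    """records_by_system maps system name -> ISIN -> record (dict of fields).
    Returns {isin: {field: {system: value}}} wherever systems disagree."""
    mismatches = defaultdict(dict)
    all_isins = {isin for recs in records_by_system.values() for isin in recs}
    for isin in all_isins:
        for f in FIELDS:
            values = {sys_name: recs[isin].get(f)
                      for sys_name, recs in records_by_system.items()
                      if isin in recs}
            if len(set(values.values())) > 1:  # at least two systems disagree
                mismatches[isin][f] = values
    return dict(mismatches)

# Example: the risk system thinks a perpetual matures in 2040; trading does not.
systems = {
    "trading": {"XS0000000000": {"maturity_date": None}},
    "risk":    {"XS0000000000": {"maturity_date": "2040-05-20"}},
}
print(find_mismatches(systems))
# {'XS0000000000': {'maturity_date': {'trading': None, 'risk': '2040-05-20'}}}
```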

Due to these risks, regulators are understandably pushing firms to carry out their own due diligence on their vendors through independent verification across data sources. That means collating, ranking and defining bond instrument types in one central place where records can be audited easily, rather than across numerous systems (one way of building such a record is sketched after this paragraph). Only by applying rules that detect Fixed Income data issues before they occur can financial institutions harbour any hope of demonstrating responsible trading decisions to regulators and investors. The launch of any new bond always creates excitement about enhanced liquidity entering the market. Today's 20-year U.S. Treasury bond is no exception, but unless market participants overcome the data errors across their existing systems, their wider Fixed Income trading strategies will be the financial equivalent of George Lazenby: unable to live up to expectations.
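As promised above, one common way to implement that central, auditable record is a "golden copy" merge, in which vendor sources are ranked and each attribute is taken from the highest-ranked source that populates it. The source names and fields here are assumptions for illustration:

```python
# Illustrative source ranking: earlier entries win.
SOURCE_RANK = ["vendor_a", "vendor_b", "internal"]

def golden_record(candidates: dict[str, dict]) -> dict:
    """Merge one instrument's per-source records into a single golden copy,
    taking each field from the highest-ranked source that supplies it."""
    all_fields = {f for record in candidates.values() for f in record}
    golden = {}
    for f in sorted(all_fields):
        for source in SOURCE_RANK:
            value = candidates.get(source, {}).get(f)
            if value is not None:
                golden[f] = value
                break  # the highest-ranked populated source wins
    return golden

# vendor_a is missing the coupon schedule, so vendor_b fills the gap.
print(golden_record({
    "vendor_a": {"maturity_date": "2040-05-20", "coupon_schedule": None},
    "vendor_b": {"maturity_date": "2040-05-20", "coupon_schedule": ["2020-11-20"]},
}))
# {'coupon_schedule': ['2020-11-20'], 'maturity_date': '2040-05-20'}
```

Because every attribute in the merged record can be traced back to a ranked source, this is the sort of centrally auditable structure the verification push is asking for.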


Related content

WEBINAR

Recorded Webinar: The future of market data – Harnessing cloud and AI for market data distribution and consumption

Market data is the lifeblood of trading, but as data volumes grow and real-time demands increase, traditional approaches to distribution and consumption are being pushed to their limits. Cloud technology and AI-driven solutions are rapidly transforming how financial institutions manage, process, and extract value from market data, offering greater scalability, efficiency, and intelligence. This webinar,...

BLOG

DiffusionData Targets Agentic AI in Finance with New MCP Server

Data technology firm DiffusionData has released an open-source server designed to connect Large Language Models (LLMs) with real-time data streams, aiming to facilitate the development of Agentic AI in financial services. The new Diffusion MCP Server uses the Model Context Protocol (MCP), an open standard for AI models to interact with external tools and data...

EVENT

Data Management Summit New York City

Now in its 15th year, the Data Management Summit NYC brings together the North American data management community to explore how data strategy is evolving to drive business outcomes and speed to market in changing times.

GUIDE

Valuations – Toward On-Demand Evaluated Pricing

Risk and regulatory imperatives are demanding access to the latest portfolio information, placing new pressures on the pricing and valuation function. And the front office increasingly wants up-to-date valuations of hard-to-price securities. These developments are driving a push toward on-demand evaluated pricing capabilities, with pricing teams seeking to provide access to valuations at higher frequency...