About A-Team Marketing Services
The knowledge platform for the financial technology industry

A-Team Insight Blogs

The New Cross-Asset Trading Challenge


The traditional separation of asset classes into distinct business organizations with incompatible trading systems is an idea whose time has ended. Today, the growing drive to automate trading in every variety of asset – and, in doing so, to make possible trading across asset classes – has created a compelling business case for cross-asset systems. Still, business case notwithstanding, it remains all too easy to fail at cross-asset trading. Here are some tips on how to succeed.

Avoiding the New Data Silo Pitfall

The old asset silos were defined by their data record structures: equities, fixed income, FX and other instruments each had their own separate rules. Today we can conceive of it all as instrument data: a metadata structure within which all instruments can be defined. That’s progress. So let’s be careful not to trade the old data silos for a new misconception: that streaming, in-memory and historical data must be managed separately and are fundamentally incompatible.
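To make the idea concrete, here is a minimal sketch of one metadata structure spanning asset classes. All names and fields are illustrative assumptions, not a real industry schema: the point is that equities, bonds and FX share a single record shape, with class-specific rules carried in one extensible attribute map rather than in incompatible record structures.

```python
# Illustrative sketch: one instrument record shape for all asset classes.
from dataclasses import dataclass, field
from enum import Enum

class AssetClass(Enum):
    EQUITY = "equity"
    FIXED_INCOME = "fixed_income"
    FX = "fx"

@dataclass(frozen=True)
class Instrument:
    symbol: str
    asset_class: AssetClass
    currency: str
    # Class-specific rules (coupon, maturity, exchange...) live in one
    # extensible attribute map instead of separate record structures.
    attributes: dict = field(default_factory=dict)

# Instruments from three classes fit the same metadata structure:
ibm = Instrument("IBM", AssetClass.EQUITY, "USD", {"exchange": "NYSE"})
bund = Instrument("DE0001102580", AssetClass.FIXED_INCOME, "EUR",
                  {"coupon": 0.025, "maturity": "2046-08-15"})
eurusd = Instrument("EUR/USD", AssetClass.FX, "USD", {"base": "EUR"})
```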

To head in that direction is to go backwards, and it works against the vision of a streamlined, fast and integrated cross-asset trading system. Instead of data silos, what’s needed is a uniform data infrastructure capable of handling the different “states” in the automated life of today’s trade data. It’s all data. It’s just changing state.

A State-Capable Uniform Infrastructure

The states of data in a uniform infrastructure are these:

* Raw market feed data turns into the real-time data stream on which complex events are calculated to determine what action to take.

* Trades are executed machine-to-machine and captured in-memory throughout the day.

* At end of day (or perhaps several times a day in a global trading system), data is saved to history on disk.
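The three states above can be sketched in one small, hypothetical store: the same trade records flow from stream event to in-memory capture to history, changing state without ever changing format. Class and method names are illustrative, and a plain list stands in for on-disk history.

```python
# Sketch of a uniform infrastructure: one record format, three states.
from collections import deque

class UniformTradeStore:
    def __init__(self):
        self.intraday = deque()  # in-memory state, captured through the day
        self.history = []        # historical state (a list stands in for disk)

    def on_tick(self, tick):
        """Raw feed data arrives as a real-time stream event."""
        if tick["signal"] == "buy":      # stand-in for complex event logic
            self.execute(tick)

    def execute(self, tick):
        """Trades execute machine-to-machine, captured in-memory."""
        self.intraday.append({"symbol": tick["symbol"], "qty": tick["qty"]})

    def end_of_day(self):
        """The same records change state to history; no format conversion."""
        self.history.extend(self.intraday)
        self.intraday.clear()

store = UniformTradeStore()
store.on_tick({"symbol": "IBM", "qty": 100, "signal": "buy"})
store.on_tick({"symbol": "IBM", "qty": 50, "signal": "hold"})
store.end_of_day()
```

Because every state shares one record structure, moving data between states is a pointer-level operation rather than a translation step, which is where the latency savings come from.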

The confusion firms sometimes face in creating a cross-asset infrastructure is in thinking that separate technologies are needed to handle the various states of data: market data feeds, data streams, in-memory calculations and historical databases. That would be the silo mistake all over again. The data incompatibilities would lead to endless complications and unacceptable latency.

Managing the Risk

A major benefit of trading assets across classes is that the firm has more ways to manage risk. If you can offset a fixed income trade against an equities trade for internal reporting purposes, your exposure is potentially less. So is the amount of capital you are required to hold in reserve.

In order to get the full benefit of this kind of risk management, you want to be able to perform it pre-trade. With a uniform trading infrastructure you can calculate the potential effect on your positions and your P&L from each trade and ensure a correctly balanced portfolio in real time.

When your uniform trading infrastructure is high speed and low latency, you can manage risk in real time. There’s no more waiting for overnight batch reports to complete so you can find out in the morning what happened 12-24 hours ago.
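A hedged sketch of what a pre-trade check might look like: before a proposed trade executes, compute its effect on net cross-asset exposure and reject it if a limit would be breached. The positions, limit and function names here are hypothetical illustrations, not a real risk model.

```python
# Illustrative pre-trade risk check across asset classes.
positions = {"equities": 1_000_000, "fixed_income": -400_000}  # net exposure
EXPOSURE_LIMIT = 1_500_000

def pre_trade_check(asset_class, notional):
    """Return True if the proposed trade keeps net exposure within the limit."""
    proposed = dict(positions)
    proposed[asset_class] = proposed.get(asset_class, 0) + notional
    # Offsetting positions across classes reduce the net figure.
    net = abs(sum(proposed.values()))
    return net <= EXPOSURE_LIMIT

# An offsetting fixed-income sell passes; an oversized equity buy does not.
ok = pre_trade_check("fixed_income", -200_000)
blocked = pre_trade_check("equities", 1_200_000)
```

Running this check per order is only viable when position data is already in memory; pulling it from a separate historical store would push the check post-trade.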

Risking the Regulation

The full regulatory picture for cross-asset trading is as yet unwritten. But there’s one best practice to follow now: capture and store all raw data, then audit what happens to it. The requirement to keep raw data is relatively recent, even for equities. But with the onset of MiFID and Reg NMS, firms now face storing the raw data that some used to toss at the end of the day.

Establishing raw data storage as a best practice for cross-asset trading will go a long way toward ensuring that cross-asset systems can comply with future regulations. Maintaining raw as well as audited data throughout the trading day and into history again requires a data management infrastructure that is seamless, fast and accommodating of the different states of streaming, in-memory and historical data.
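The capture-then-audit practice can be sketched as two small steps: store each raw record untouched under a content key, then log every downstream action against that key. The store, log and function names are illustrative assumptions.

```python
# Sketch: keep raw data immutable, audit everything done with it.
import hashlib
import json

raw_store = []   # raw records as received, never modified
audit_log = []   # what happened to each raw record

def capture_raw(record):
    """Store the raw record as received, keyed by a content hash."""
    key = hashlib.sha256(
        json.dumps(record, sort_keys=True).encode()
    ).hexdigest()
    raw_store.append((key, record))
    return key

def audit(key, action):
    """Record each downstream action taken on a raw record."""
    audit_log.append({"raw_key": key, "action": action})

k = capture_raw({"feed": "equities", "symbol": "IBM", "px": 101.5})
audit(k, "normalized")
audit(k, "used_in_execution")
```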

Creating a cross-asset trading system gives firms new sources of profitable trades while managing risk exposure. The key is to keep latency low and speed high. During the past several years, numerous firms have tried and failed to implement cross-asset trading, because they underestimated the challenges of integration. Replacing asset silos with data state silos is not the answer. The firms that are succeeding understand the ideal pairing of maximized integration and minimized latency.

