
From Broker Bias to Independent Insight: The Case for Cloud-Native TCA


For years, the path of least resistance for buy-side transaction cost analysis (TCA) was simple: let the broker do it. Historically, asset managers have relied on their execution counterparties to provide post-trade reporting. It was a workflow of convenience. Brokers executed the trades and subsequently provided the analysis on how well they performed.

However, this model has intrinsic flaws. Relying on counterparties to “grade their own homework” introduces inherent bias. Furthermore, for multi-asset desks routing to dozens of providers, the operational reality is a nightmare of fragmentation: disparate reports arrive from multiple brokers in different formats with varying methodologies, leaving execution teams to cobble together a cohesive view of performance.

To escape this fragmentation and gain an independent view, firms have traditionally faced a binary choice: build a solution from scratch or buy a rigid third-party tool. However, both of these responses to the “Broker TCA” problem are now showing signs of strain.


The Flawed Alternatives

Once a firm decides to move beyond broker-supplied reports, it typically encounters two distinct roadblocks.

  1. The “Build” Trap: The first instinct for large, data-savvy firms is to bring the capability in-house. This offers the allure of total control and the ability to normalise data across all brokers into a single “golden source”.

    However, the “hidden tax” of this approach is infrastructure. Developing the pipelines to capture, cleanse, and normalise high-quality market data – especially when dealing with granular Level 3 data – is incredibly resource-intensive. Technical teams often find themselves spending the vast majority of their time maintaining the “plumbing” of historical data rather than deriving insights from it.

  2. The “Black Box” Vendor: The alternative has been outsourcing to independent third-party analytics providers. While this provides an unbiased view distinct from the brokers and reduces internal engineering overhead, it often introduces a transparency issue.

Legacy third-party solutions often act as “black boxes,” relying on rigid, predefined methodologies. When a firm cannot see how a benchmark was calculated, the trading desk is left without the control or visibility it needs. This lack of customisation means firms are often forced to fit their trading strategies into the vendor’s box, rather than the other way around.


The Third Path: Cloud-Native Data Infrastructure

Forward-thinking asset managers are now moving toward a “third path”: a hybrid model that leverages scalable, cloud-native infrastructure.

This modern architecture decouples the data layer from the analytics layer. Instead of buying a closed application, firms are increasingly working with vendors that deliver pre-normalised datasets directly into the firm’s own cloud ecosystem, as Tom Jardine, Head of Data Science (Americas) at BMLL, explains to TradingTech Insight.

“When you combine a fully processed, clean dataset with onboarding into Snowflake or Databricks, that’s a major shift,” he says. “Now the asset manager can take the work in-house, achieve the flexibility they want, creating new metrics, but without doing the data engineering. Their transactions sit in Snowflake, and directly underneath is clean, normalised market data. From an engineering standpoint, it becomes relatively simple: write a SQL query pulling transactions from one table and market data from another, run the analytics, and you’re done. Then you can build very sophisticated TCA metrics.”

By utilising high-quality, T+1 normalised datasets spanning Level 1, 2, and 3 market data, firms remove the need for manual cleaning. This allows quant teams to build custom analytics on top of robust foundations. The heavy lifting of data engineering is outsourced, but the logic and the insights remain fully under the firm’s control.


Case Study: Scaling Analytics for a $1.5 Trillion AUM Firm

The operational impact of this architectural shift is measurable. A leading asset management firm with an equity portfolio of over $1.5 trillion in AUM recently partnered with BMLL to overhaul its global TCA framework using this cloud-native approach.

The firm’s objective was to leverage new datasets to enhance trading performance and minimise market impact. However, its legacy infrastructure struggled to process the necessary depth of data.

Working with BMLL, the firm integrated its proprietary trade data with BMLL’s global market data model directly within its own Snowflake environment. This setup ensured that daily TCA and broker analytics reports were generated automatically, while the firm retained full control, data governance, and security over its proprietary information.

“This particular asset manager was one of the first we worked with, and the turnaround was remarkable,” notes Jardine. “In three to four months, they completely revamped their TCA workflow. They reduced the work to about a tenth of an FTE (full-time equivalent), where previously it took three or four FTEs. This process is extremely important for them because they have targets around trading and execution savings – for marketing, investor discussions, and attracting new inflows. Reducing slippage is critical. Having flexibility, custom metrics, no data-engineering burden, and expert support was truly game-changing for them.”

Within a matter of months, the firm successfully replaced its legacy post-trade analytics system. The efficiency gains reported were significant:

95% Reduction in Data Management: The time execution teams spent managing and cleaning historical data dropped by 95 percent, freeing them to focus on high-value analysis.

Report Velocity: Report generation, a task that previously spanned multiple days, was reduced to three to five minutes.

Democratised Access: Execution insights became available globally across all teams, rather than being siloed in a specific quantitative unit or dependent on legacy vendor tools.


From Data Management to Alpha Generation

By moving away from fragmented broker reports, rigid third-party tools, and unsustainable in-house builds, asset managers are finding they can unlock cost savings and empower their teams to deliver more value from their data.

The benefits are clear, according to Jardine: “It’s about person-hours, the ability to redirect staff toward analytics instead of data munging. It’s accuracy, because all the heavy lifting to create accurate datasets has been done. And it’s cost, especially around storage, because storing petabytes of market data is enormously expensive.”

This shift to cloud-native data consumption represents a fundamental change in how resources are allocated within the buy-side. By adopting modern infrastructure, firms can reduce their total cost of ownership and streamline operations. More importantly, it enables technical teams to stop managing data pipelines and start delivering the execution insights that drive competitive advantage.

Ultimately, the “third path” allows asset managers to focus their efforts on generating better trading outcomes rather than maintaining the plumbing of historical data.
