

Bridging the Data Monetisation Gap


The strategic argument for treating market data as a product rather than a cost has arguably been won. What remains stubbornly unresolved is what comes next: measuring the return on data investments, breaking the hoarding cultures that prevent data from flowing across the enterprise, and building infrastructure robust enough to support AI at scale.

Those were the central tensions explored by a panel at A-Team Group’s TradingTech Summit London 2026, entitled “Unlocking the Value of Data as a Differentiator – The Data Product Imperative”. The session, moderated by Michelle Ansell, Head of Market Data at Glencore, brought together Barbara Dunbar, EMEA Regional Lead – Market Data, Associate Director (Senior Vice President) at Macquarie; Gunjan Chauhan, Global Head of ETF Capital Markets and Markets Intelligence at State Street Investment Management; Dr. Elliot Banks, Chief Product Officer at BMLL; and John Showell, Head of Strategy and Business Development at Behavox.

The product mindset: horizontal layers and senior buy-in

Panellists described the data product mindset not as a theoretical framework but as an operational reorganisation. One speaker outlined a model in which individual teams have long used data for their own objectives – monitoring execution quality, tracking flows, profiling client behaviour – but the real value emerges when those siloed use cases are connected as a horizontal layer oriented around client outcomes. The example given was an ETF capital markets desk that began by monitoring trading quality for its own products and progressively integrated industry flow data, market share metrics, and the decision-making criteria clients use when selecting between competing exposures. The result was a significantly more targeted approach to resource allocation and client engagement.

Another panellist argued that none of this works without senior management commitment. The firms that succeed, he said, are those where leadership defines precisely the metrics by which the business will be run and holds people accountable to them. The conversation about bringing data systems together then resonates naturally with CFOs, who can see front-office and back-office benefits flowing from the same investment.

Measuring value: the industry’s biggest unsolved problem

An audience poll confirmed that measuring data value and return on investment is the dominant practitioner concern. Panellists acknowledged the difficulty but offered contrasting approaches to the problem. One argued that the cost of managing poor-quality data is itself poorly understood: firms can spend enormous amounts of time cleaning, reconciling and processing data before it reaches a state where it can have any business impact at all. Starting with high-quality data, he suggested, makes everything downstream – including ROI measurement – significantly easier.

Another panellist reframed the measurement challenge by inverting it: rather than asking how to measure data value in the abstract, start with clarity on the desired outcome and work backwards to determine whether the right ingredients – data, talent, tooling – are in place to achieve it.

The most concrete evidence came from a vendor panellist who described a controlled experiment with a customer. The client split its sales team in two: half used the analytics product, half did not. Over two months, the group using the product made 20% more outbound calls, those calls were 22% longer in duration, and the desk saw an 18% uptick in traded volume. The speaker emphasised that building this kind of value case requires deep collaboration and transparency between vendor and client – and that front-office users are typically more motivated by value creation than cost reduction.

Managing costs: from blanket licensing to usage profiling

The panel offered complementary perspectives on the cost challenge. One speaker described a shift away from enterprise-wide blanket licensing toward fit-for-purpose data procurement, where each business unit acquires data specific to its use case and the central market data team acts as a bridge, connecting dots across the organisation and enabling cost-sharing when multiple teams need the same feed. The move from a golden-source architecture to a data mesh – with internal APIs enabling cross-business access – was cited as both a technical and cultural transformation.
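
The cost-sharing mechanic described above can be made concrete with a short sketch. The example below is purely illustrative – the feed names, teams and even-split allocation rule are assumptions rather than anything the panel specified – but it shows how a central market data team might apportion a shared feed’s licence cost across the business units that have registered a use case for it.

```python
from collections import defaultdict

# Hypothetical annual licence costs per feed (illustrative figures only).
FEED_COSTS = {
    "level2_equities": 240_000,
    "fx_spot_delayed": 30_000,
}

# Which business units have registered a use case for each feed.
SUBSCRIPTIONS = {
    "etf_capital_markets": ["level2_equities"],
    "execution_analytics": ["level2_equities", "fx_spot_delayed"],
    "treasury": ["fx_spot_delayed"],
}

def allocate_costs(feed_costs, subscriptions):
    """Split each feed's cost evenly across the teams that use it."""
    users_per_feed = defaultdict(list)
    for team, feeds in subscriptions.items():
        for feed in feeds:
            users_per_feed[feed].append(team)

    charges = defaultdict(float)
    for feed, cost in feed_costs.items():
        users = users_per_feed.get(feed, [])
        for team in users:
            charges[team] += cost / len(users)
    return dict(charges)

for team, charge in allocate_costs(FEED_COSTS, SUBSCRIPTIONS).items():
    print(f"{team}: {charge:,.0f} per year")
```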

From the supplier side, a panellist cautioned against optimising purely on headline data cost. Total cost of ownership, he argued, is the more important metric: a cheaper dataset that requires three engineers and three months of effort to clean and integrate may be the more expensive option overall. High-quality centralised data that can be sliced for different use cases eliminates much of the human capital firms currently pour into data wrangling.

One panellist illustrated the point with an experiment in which users who believed they needed real-time data were switched to a feed with a one- to two-day lag at considerably lower cost. The impact on output was negligible – and in some cases users did not notice the change. The broader lesson, she argued, is that firms need to challenge assumptions about what level of data timeliness each function genuinely requires.

Breaking silos: shared foundations and data lineage

The cultural dimension of data transformation drew sharp commentary. One speaker described the legacy golden-source model as inherently bottlenecked: a single repository feeds all downstream applications, making data lineage difficult to trace and licensing opaque. The replacement – a mesh architecture where business units own their data domains but share through internal APIs – requires an explicit shift from hoarding to sharing. Market data teams, she argued, are uniquely positioned to broker that transition because they have visibility across the entire data estate.

Another panellist extended the argument to the surveillance and front-office space. By meshing unstructured data – phone calls, messaging platforms, emails – with structured trade inquiry data, firms can build a complete picture of customer interaction. In an environment where sales teams are under pressure to service more clients with fewer people, that intelligence allows more junior staff to engage meaningfully because the system surfaces the customer’s history, preferences and needs.

Data provenance and lineage tracking emerged as a persistent gap. While panellists agreed that firms need to know where data enters the organisation, who owns it, and where it flows, the consensus was that no single tool solves this comprehensively. The practical prescription was to embed documentation and ownership tracking as a discipline across market data, technology and business teams rather than waiting for a silver-bullet solution.
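
As a thought experiment, the discipline the panel described – recording where data enters, who owns it, and what is derived from it – could be captured in something as simple as the catalogue sketch below. The dataset names, fields and lineage chain are hypothetical, chosen only to illustrate how ownership and provenance might be documented alongside the data itself.

```python
from dataclasses import dataclass, field

@dataclass
class DatasetRecord:
    """Minimal catalogue entry: where data enters, who owns it, where it flows."""
    name: str
    source: str                                        # external vendor or internal producer
    owner: str                                         # accountable team
    licence_scope: str                                 # e.g. "display only", "derived data permitted"
    derived_from: list = field(default_factory=list)   # upstream dataset names

CATALOGUE = {
    "vendor_tick_feed": DatasetRecord(
        "vendor_tick_feed", "Vendor X", "market_data_team", "internal redistribution permitted"),
    "cleaned_ticks": DatasetRecord(
        "cleaned_ticks", "internal ETL", "data_engineering", "internal redistribution permitted",
        derived_from=["vendor_tick_feed"]),
    "execution_quality_metrics": DatasetRecord(
        "execution_quality_metrics", "internal analytics", "etf_capital_markets", "internal use",
        derived_from=["cleaned_ticks"]),
}

def lineage(name: str, catalogue: dict) -> list:
    """Trace a dataset back through its upstream sources to its point of entry."""
    chain = [name]
    for upstream in catalogue[name].derived_from:
        chain.extend(lineage(upstream, catalogue))
    return chain

print(" <- ".join(lineage("execution_quality_metrics", CATALOGUE)))
# execution_quality_metrics <- cleaned_ticks <- vendor_tick_feed
```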

AI readiness: data quality as the binding constraint

The discussion on AI reinforced a theme running through the session: that AI tools amplify the quality of their inputs rather than compensating for their deficiencies. One panellist described large language models as highly capable but entirely dependent on the data and documentation they are given. If input data is unreliable, the models will produce poor decisions. If internal documentation is incomplete, the models will expose the gaps – which, he noted, can itself be a useful forcing function for organisational improvement.

Market data was described as particularly challenging in this context. Vendor documentation is often poor, quirks in data feeds are poorly catalogued, and understanding whether a given anomaly is a bug or a feature requires deep institutional knowledge. Building that knowledge into a structured form that AI agents can consume was described as a significant undertaking – but one that would create a durable competitive asset for firms that get it right.
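
One way to picture what structured, AI-consumable institutional knowledge might look like is sketched below. The record format, field names and example quirk are assumptions made for illustration, not a description of any panellist’s implementation; the point is simply that once feed quirks are captured in a machine-readable form, they can be retrieved and supplied to a language model as grounding context alongside a user’s question.

```python
import json
from dataclasses import dataclass, asdict

@dataclass
class FeedQuirk:
    """One piece of institutional knowledge about a market data feed."""
    feed: str
    field_name: str
    behaviour: str        # what actually happens in the data
    classification: str   # "bug", "documented feature" or "undocumented feature"
    handling: str         # how downstream consumers should treat it
    verified_by: str      # who confirmed this, and when

quirks = [
    FeedQuirk(
        feed="vendor_x_equities",
        field_name="trade_condition",
        behaviour="Code 'Z' appears on auction prints but is absent from the vendor spec.",
        classification="undocumented feature",
        handling="Treat as an auction trade; exclude from continuous-session analytics.",
        verified_by="market_data_team, 2025-11",
    ),
]

# Serialised like this, the knowledge base can be indexed, retrieved and passed
# to an AI agent as context when it is asked about the feed.
print(json.dumps([asdict(q) for q in quirks], indent=2))
```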

The path forward

The panel closed with pragmatic advice. One speaker urged practitioners to find a senior sponsor and chase value early rather than waiting for a comprehensive data strategy to be complete: firms that can systematically extract value from data will leave competitors behind. Another emphasised knowing your data – from inventory to lineage – as the prerequisite for any monetisation or optimisation effort. A third kept it simple: start with high-quality data and everything else follows. The final word was a call to develop data champions across all functions, connecting individual silos to unlock the value of the whole.

The overarching message was that the data product imperative is no longer a conceptual ambition – it is an operational challenge. The firms that will differentiate are not those with the best strategy decks but those that can break internal silos, measure outcomes rigorously, and build data foundations robust enough to support the next generation of AI-driven analytics.
