The knowledge platform for the financial technology industry

A-Team Insight Blogs

Why Data Governance Is Becoming Critical to the Modern Market Data Stack


The rapid expansion of alternative and specialised market datasets is transforming how financial institutions source, analyse and operationalise information. Alongside traditional market data feeds from exchanges and major vendors, firms now consume a growing array of niche datasets supporting quantitative research, AI workflows and investment decision-making.

This surge in data consumption is creating new operational challenges. Managing the cost, usage and licensing obligations associated with an expanding data ecosystem is becoming increasingly complex, particularly as research teams experiment with new datasets outside traditional procurement channels.

Recent research from TRG Screen, the market data management solutions provider, highlights the scale of the issue. Firms adopting purpose-built governance tools have been able to significantly reduce operational overhead and data expenditure while improving visibility into how data is used across the organisation.

Yet the underlying challenge extends beyond cost management. As institutions integrate larger numbers of datasets into automated research and trading workflows, governance of data usage itself is emerging as a critical capability.

Market Data as Intellectual Property

At the core of the issue is the unique commercial structure of market data. Unlike many enterprise technology services, financial datasets are typically licensed as intellectual property with detailed restrictions governing how they may be accessed and used.

“Market data is fundamentally intellectual property,” says Nadine Scott, Chief Customer Strategy Officer at TRG Screen, in conversation with Market & Alt Data Insight. “What firms are buying is the right to consume and use IP, and that creates extremely complex commercial structures. Contracts often define exactly who can use the data, where it can be used and for what purpose. Managing that complexity has become a major operational challenge for financial institutions.”

That complexity becomes harder to manage as data usage spreads across more teams, models and workflows. In many cases, attention to a dataset's analytical value crowds out consideration of its commercial implications.

“Firms are primarily focused on the value of the data itself – its quality, frequency and how it can be integrated into key workflows such as trading or research,” Scott says. “The commercial aspects are often considered later. But as the number of datasets grows, organisations increasingly need a structured way to manage spend, usage and IP compliance together.”

The Governance Gap

The challenge becomes sharper as data acquisition grows more decentralised. Quant researchers and data science teams increasingly acquire datasets directly for experimentation, often through short-term subscriptions that sit outside traditional market data governance processes.

“You cannot manage what you cannot see,” Scott says. “In many organisations datasets are procured outside the central process, particularly when research teams need data quickly. Creating a single view of market data expenditure helps firms identify duplication, rationalise usage and eliminate waste.”

Scott argues that effective governance depends on combining visibility into spending with visibility into usage and licensing position.

“The key to managing market data effectively is triangulating three things: how much you spend, utilisation and compliance with licensing terms,” explains Scott. “When you can see those three dimensions together, you can begin to understand whether you are paying an amount that reflects the value of that use case.”
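Scott's triangulation can be pictured as a simple per-dataset check. The sketch below is purely illustrative: the record fields, figures and metric names are invented for this example and are not drawn from TRG Screen's tooling.

```python
from dataclasses import dataclass

# Hypothetical record type -- fields are illustrative, not taken from
# any real inventory or vendor system.
@dataclass
class DatasetRecord:
    name: str
    annual_spend: float   # contracted cost per year
    active_users: int     # seats actually consuming the data
    licensed_users: int   # seats the licence permits
    licensed_uses: set    # purposes the contract allows, e.g. {"research"}

def triangulate(ds: DatasetRecord, intended_use: str) -> dict:
    """Combine the three dimensions Scott describes -- spend,
    utilisation and licence compliance -- for a single dataset."""
    utilisation = ds.active_users / ds.licensed_users if ds.licensed_users else 0.0
    return {
        "cost_per_active_user": ds.annual_spend / max(ds.active_users, 1),
        "utilisation": utilisation,  # low values flag waste or duplication
        "use_compliant": intended_use in ds.licensed_uses,
        "over_entitled": ds.active_users > ds.licensed_users,
    }

ds = DatasetRecord("alt-signal-feed", 120_000.0, 4, 10, {"research"})
print(triangulate(ds, "trading"))
```

Seen together, the three numbers answer Scott's question directly: a feed that is 40% utilised, costs £30,000 per active user and is not licensed for the intended purpose is clearly not priced to its use case.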

For many organisations, however, that remains difficult. Data consumption is often spread across multiple business units, applications and research environments, making it harder to track how datasets are actually used.

The Derived Data Challenge

The problem becomes more complex once datasets are combined, transformed and embedded in models. In modern quantitative workflows, raw datasets are rarely used directly; they are processed, merged with other sources and turned into signals or derived metrics that feed trading strategies and risk models.

“Within market data contracts you often have very specific restrictions on how derived data can be used or redistributed,” Scott says. “The challenge is that those licensing clauses are frequently buried in lengthy agreements, while the actual data usage happens elsewhere in the organisation. Bridging that gap is critical.”

Interpreting whether a derived signal complies with licensing restrictions can therefore be slow and error-prone, especially when the relevant terms sit in lengthy, unstructured agreements.

“Instead of teams manually reviewing agreements every time they want to use a dataset in a new way, we are developing AI-enabled solutions to extract the key licensing terms and make them queryable,” says Raushon Uddin, Product Director at TRG Screen. “The goal is that a user can simply ask, ‘Can I use this data for this purpose?’ and get an immediate answer.”
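A toy version of the "queryable terms" idea might look like the sketch below. The term schema, dataset name and answers are all invented for illustration; in the approach Uddin describes, the terms would be extracted from the contracts themselves rather than hand-coded.

```python
# Hypothetical store of licence terms extracted from agreements.
# Keys and values are invented for this example.
LICENCE_TERMS = {
    "vendor-x-tick-data": {
        "permitted_purposes": {"internal research", "backtesting"},
        "derived_data_redistribution": False,
        "permitted_locations": {"London", "New York"},
    },
}

def can_i_use(dataset: str, purpose: str, location: str) -> str:
    """Answer 'Can I use this data for this purpose?' from extracted terms."""
    terms = LICENCE_TERMS.get(dataset)
    if terms is None:
        return "unknown: no extracted terms on file, review the agreement"
    if purpose not in terms["permitted_purposes"]:
        return f"no: '{purpose}' is not a permitted purpose"
    if location not in terms["permitted_locations"]:
        return f"no: use from '{location}' is not licensed"
    return "yes"

print(can_i_use("vendor-x-tick-data", "backtesting", "London"))
print(can_i_use("vendor-x-tick-data", "live trading", "London"))
```

The point of the design is that the hard work, turning unstructured contract language into the structured terms, happens once, so that each subsequent usage question becomes a cheap lookup rather than a legal review.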

Making licensing terms more accessible at the point of use could reduce compliance risk while allowing research teams to move faster when testing new datasets.

Enabling Data Innovation

Firms are unlikely to slow their use of new datasets. In systematic and quantitative markets, access to differentiated data remains a competitive lever. The issue is not whether firms experiment, but how they do so without creating uncontrolled cost and compliance exposure.

“Market data teams should not be the people saying ‘you cannot do that’,” Scott says. “Their role should be to help the business navigate the vendor landscape with their expertise and add value to the businesses they support. But they can only do that if they are not buried in administrative work and can focus on higher-value activities.”

That means reducing the administrative burden around contracts, usage monitoring and compliance reporting so market data teams can focus on guidance, negotiation and strategy. Scott’s broader point is that governance should support data-driven innovation, not slow it down.

Governance in an AI-Driven Future

Looking ahead, the governance challenge is likely to intensify as AI-driven workflows become more widespread. Financial institutions are already experimenting with generative AI tools and automated research pipelines designed to accelerate data discovery and feature engineering. Over time, such systems could identify, evaluate and integrate new datasets with minimal human intervention.

“As AI-driven research workflows become faster and more automated, governance will also need to become more automated and policy-driven,” Uddin says. “Relying on manual approvals will become increasingly difficult when datasets are being tested and integrated at scale.”
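One way to read "policy-driven" governance is as a machine-readable gate that an automated pipeline consults before integrating a newly discovered dataset, escalating to humans only at the edges. The rule names and thresholds below are entirely hypothetical, sketched to show the shape of the idea rather than any vendor's implementation.

```python
# Hypothetical machine-readable policy; every rule and threshold here
# is invented for illustration.
POLICY = {
    "max_trial_cost": 5_000,            # auto-approve trials below this spend
    "blocked_purposes": {"redistribution"},
    "require_licence_record": True,
}

def gate(dataset: dict) -> tuple:
    """Return (approved, reason) for automated dataset integration."""
    if POLICY["require_licence_record"] and not dataset.get("licence_id"):
        return False, "no licence record: route to market data team"
    if dataset["purpose"] in POLICY["blocked_purposes"]:
        return False, f"purpose '{dataset['purpose']}' is blocked by policy"
    if dataset["trial_cost"] > POLICY["max_trial_cost"]:
        return False, "cost above auto-approval threshold: needs human sign-off"
    return True, "auto-approved under policy"

candidate = {"licence_id": "L-123", "purpose": "research", "trial_cost": 2_000}
print(gate(candidate))
```

The design choice matches Uddin's point: when datasets are tested at machine speed, the default path has to be an automated policy decision, with manual approval reserved for the cases the policy cannot resolve.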

In that environment, firms will need governance models that can keep pace with faster, more distributed data consumption. Managing the operational lifecycle of data, from procurement and licensing to usage and compliance, is becoming a strategic capability in its own right. Competitive advantage will depend not only on finding new sources of insight, but on governing their use with the same rigour as any other critical part of the data stack.
