
A-Team Insight Blogs

Why Data Governance Is Becoming Critical to the Modern Market Data Stack


The rapid expansion of alternative and specialised market datasets is transforming how financial institutions source, analyse and operationalise information. Alongside traditional market data feeds from exchanges and major vendors, firms now consume a growing array of niche datasets supporting quantitative research, AI workflows and investment decision-making.

This surge in data consumption is creating new operational challenges. Managing the cost, usage and licensing obligations associated with an expanding data ecosystem is becoming increasingly complex, particularly as research teams experiment with new datasets outside traditional procurement channels.

Recent research from TRG Screen, the market data management solutions provider, highlights the scale of the issue. Firms adopting purpose-built governance tools have been able to significantly reduce operational overhead and data expenditure while improving visibility into how data is used across the organisation.

Yet the underlying challenge extends beyond cost management. As institutions integrate larger numbers of datasets into automated research and trading workflows, governance of data usage itself is emerging as a critical capability.

Market Data as Intellectual Property

At the core of the issue is the unique commercial structure of market data. Unlike many enterprise technology services, financial datasets are typically licensed as intellectual property with detailed restrictions governing how they may be accessed and used.

“Market data is fundamentally intellectual property,” says Nadine Scott, Chief Customer Strategy Officer at TRG Screen, in conversation with Market & Alt Data Insight. “What firms are buying is the right to consume and use IP, and that creates extremely complex commercial structures. Contracts often define exactly who can use the data, where it can be used and for what purpose. Managing that complexity has become a major operational challenge for financial institutions.”

That complexity becomes harder to manage as data usage spreads across more teams, models and workflows. In many cases, attention to a dataset's analytical value crowds out consideration of its commercial implications.

“Firms are primarily focused on the value of the data itself – its quality, frequency and how it can be integrated into key workflows such as trading or research,” Scott says. “The commercial aspects are often considered later. But as the number of datasets grows, organisations increasingly need a structured way to manage spend, usage and IP compliance together.”

The Governance Gap

The challenge becomes sharper as data acquisition grows more decentralised. Quant researchers and data science teams increasingly acquire datasets directly for experimentation, often through short-term subscriptions that sit outside traditional market data governance processes.

“You cannot manage what you cannot see,” Scott says. “In many organisations datasets are procured outside the central process, particularly when research teams need data quickly. Creating a single view of market data expenditure helps firms identify duplication, rationalise usage and eliminate waste.”
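The single view Scott describes can be illustrated with a minimal sketch: aggregate subscriptions across business units and flag datasets bought more than once. The dataset names, teams and costs below are hypothetical, invented purely for illustration.

```python
from collections import defaultdict

# Hypothetical inventory of dataset subscriptions across business units.
# Names and costs are illustrative, not real contracts.
subscriptions = [
    {"dataset": "eq-level2-feed", "team": "Equities Trading", "annual_cost": 120_000},
    {"dataset": "eq-level2-feed", "team": "Quant Research",   "annual_cost": 95_000},
    {"dataset": "fx-spot-ticks",  "team": "FX Desk",          "annual_cost": 60_000},
]

def find_duplicates(subs):
    """Group subscriptions by dataset and flag any bought by multiple teams."""
    by_dataset = defaultdict(list)
    for s in subs:
        by_dataset[s["dataset"]].append(s)
    return {name: entries for name, entries in by_dataset.items() if len(entries) > 1}

for name, entries in find_duplicates(subscriptions).items():
    teams = ", ".join(e["team"] for e in entries)
    total = sum(e["annual_cost"] for e in entries)
    print(f"{name}: bought separately by {teams} (combined spend {total:,})")
```

Even this toy roll-up surfaces the pattern Scott points to: two teams independently paying for the same feed, invisible until spend is centralised.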

Scott argues that effective governance depends on combining visibility into spending with visibility into usage and licensing position.

“The key to managing market data effectively is triangulating three things: how much you spend, utilisation and compliance with licensing terms,” explains Scott. “When you can see those three dimensions together, you can begin to understand whether you are paying an amount that reflects the value of that use case.”

For many organisations, however, that remains difficult. Data consumption is often spread across multiple business units, applications and research environments, making it harder to track how datasets are actually used.
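The triangulation of spend, utilisation and compliance can be sketched as a simple per-dataset check. The fields and thresholds below are assumptions chosen for illustration, not a prescribed methodology.

```python
# Illustrative triangulation of spend, utilisation and licensing status
# for a single dataset; fields and thresholds are assumptions.
def assess(dataset):
    """Combine spend and usage into a cost-per-active-user figure, and
    flag under-utilisation or seat-count compliance risk."""
    cost_per_active_user = dataset["annual_cost"] / max(dataset["active_users"], 1)
    flags = []
    if dataset["active_users"] < dataset["licensed_users"] * 0.5:
        flags.append("under-utilised: fewer than half of licensed seats active")
    if dataset["active_users"] > dataset["licensed_users"]:
        flags.append("compliance risk: usage exceeds licensed seats")
    return {"cost_per_active_user": cost_per_active_user, "flags": flags}

report = assess({
    "name": "alt-sentiment-feed",
    "annual_cost": 90_000,
    "licensed_users": 20,
    "active_users": 6,
})
print(report["cost_per_active_user"])  # 15000.0
print(report["flags"])
```

Seen together, the three dimensions answer Scott's question directly: 90,000 a year for six active users puts a concrete number on whether the price reflects the use case.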

The Derived Data Challenge

The problem becomes more complex once datasets are combined, transformed and embedded in models. In modern quantitative workflows, raw datasets are rarely used directly; they are processed, merged with other sources and turned into signals or derived metrics that feed trading strategies and risk models.

“Within market data contracts you often have very specific restrictions on how derived data can be used or redistributed,” Scott says. “The challenge is that those licensing clauses are frequently buried in lengthy agreements, while the actual data usage happens elsewhere in the organisation. Bridging that gap is critical.”

Interpreting whether a derived signal complies with licensing restrictions can therefore be slow and error-prone, especially when the relevant terms are buried in unstructured agreements.

“Instead of teams manually reviewing agreements every time they want to use a dataset in a new way, we are developing AI-enabled solutions to extract the key licensing terms and make them queryable,” says Raushon Uddin, Product Director at TRG Screen. “The goal is that a user can simply ask, ‘Can I use this data for this purpose?’ and get an immediate answer.”

Making licensing terms more accessible at the point of use could reduce compliance risk while allowing research teams to move faster when testing new datasets.
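Once licensing terms have been extracted into a structured form, the "can I use this data for this purpose?" question becomes a lookup rather than a contract review. The sketch below assumes a hypothetical schema of extracted terms; it illustrates the idea, not TRG Screen's actual product.

```python
# Hypothetical store of licensing terms extracted from agreements.
# The schema and answer logic are assumptions for illustration.
LICENSE_TERMS = {
    "alt-sentiment-feed": {
        "permitted_purposes": {"internal research", "trading signals"},
        "derived_data_redistribution": False,
    },
}

def can_use(dataset: str, purpose: str) -> str:
    """Answer a point-of-use licensing question from extracted terms."""
    terms = LICENSE_TERMS.get(dataset)
    if terms is None:
        return "unknown: no extracted terms on file, review the agreement"
    if purpose in terms["permitted_purposes"]:
        return "allowed"
    return "not permitted under extracted terms, escalate to the market data team"

print(can_use("alt-sentiment-feed", "trading signals"))   # allowed
print(can_use("alt-sentiment-feed", "client reporting"))
```

The important design choice is the third outcome: when no terms have been extracted, the system says "unknown" and routes to a human rather than guessing, which is what keeps compliance risk down while letting the common cases move fast.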

Enabling Data Innovation

Firms are unlikely to slow their use of new datasets. In systematic and quantitative markets, access to differentiated data remains a competitive lever. The issue is not whether firms experiment, but how they do so without creating uncontrolled cost and compliance exposure.

“Market data teams should not be the people saying ‘you cannot do that’,” Scott says. “Their role should be to help the business navigate the vendor landscape with their expertise and add value to the businesses they support. But they can only do that if they are not buried in administrative work and can focus on higher-value activities.”

That means reducing the administrative burden around contracts, usage monitoring and compliance reporting so market data teams can focus on guidance, negotiation and strategy. Scott’s broader point is that governance should support data-driven innovation, not slow it down.

Governance in an AI-Driven Future

Looking ahead, the governance challenge is likely to intensify as AI-driven workflows become more widespread. Financial institutions are already experimenting with generative AI tools and automated research pipelines designed to accelerate data discovery and feature engineering. Over time, such systems could identify, evaluate and integrate new datasets with minimal human intervention.

“As AI-driven research workflows become faster and more automated, governance will also need to become more automated and policy-driven,” Uddin says. “Relying on manual approvals will become increasingly difficult when datasets are being tested and integrated at scale.”
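The policy-driven governance Uddin describes might look, in its simplest form, like an automated gate that a research pipeline calls before onboarding a dataset. The policy fields and thresholds below are invented assumptions, not a real product's configuration.

```python
# Sketch of a policy gate for automated dataset onboarding; all policy
# values are hypothetical assumptions for illustration.
POLICY = {
    "max_trial_cost": 10_000,              # auto-approve trials under this spend
    "blocked_purposes": {"redistribution"},  # always needs manual review
}

def gate(request):
    """Return (approved, reason) without a manual approval step."""
    if request["purpose"] in POLICY["blocked_purposes"]:
        return (False, "purpose blocked by policy, manual review required")
    if request["trial_cost"] > POLICY["max_trial_cost"]:
        return (False, "trial cost above auto-approval threshold")
    return (True, "auto-approved under trial policy")

ok, reason = gate({"dataset": "new-esg-scores", "purpose": "backtesting", "trial_cost": 5_000})
print(ok, reason)
```

Encoding policy this way lets routine trials proceed at machine speed while reserving human judgment for the cases, such as redistribution, where the licensing stakes are highest.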

In that environment, firms will need governance models that can keep pace with faster, more distributed data consumption. Managing the operational lifecycle of data, from procurement and licensing to usage and compliance, is becoming a strategic capability in its own right. Competitive advantage will depend not only on finding new sources of insight, but on governing their use with the same rigour as any other critical part of the data stack.
