
The Devil is in The Data: The Role of FMIs in Providing Data Services and the Importance of a Best Practice Approach


By Tim Lind, Managing Director, Data Services at DTCC

In recent years, it has been claimed that data has eclipsed oil as the world’s most valuable resource. Financial market infrastructures (FMIs) are on a constant search for ‘new oil’, and data is certainly among the new services they are developing. Post-crisis reforms and the requirement for greater transparency in financial markets through new transaction and trade reporting rules have led to a much larger volume of data being captured within the financial system.

However, while there is no shortage of data, what the industry really needs is insight. The challenge for institutions collectively is therefore to harness the millions of transactions that flow through their infrastructures and create actionable information that will enhance decision-making at all levels. FMIs play an important role in the provisioning of data for the industry, and a best practice approach must be adopted to ensure standards of data quality, confidentiality and security are maintained.

FMIs sit naturally in the middle of transaction flow and can play a critical role in capturing, aggregating and discovering value in the data assets that flow through their services, and in returning that value to the participants who use the infrastructure. This involves developing innovative and unique data products that provide insight into areas such as market risk, liquidity assessment, trade decision support, capital management models in preparation for constantly moving frameworks such as the upcoming Fundamental Review of the Trading Book (FRTB) requirements, and benchmark valuation and trade data to support alternative benchmark rates that will replace LIBOR.

Historical, aggregated and anonymised transaction data provide the baseline for quantitative and analytical models that consider patterns of liquidity and trade activity. Such models can facilitate more effective decision making, leading to improved trading, asset allocation, price discovery, client service, collateral management and risk management. Given the wide-ranging applications of historical transaction data, the opportunity to create value for the community of participants in the infrastructure is significant. While transaction data are a historical record of what happened in capital markets, marrying them to other economic data can help develop predictive analytics, the holy grail of value-added data services.

That said, while FMIs host data, this does not mean they are data providers by default. First, a significant amount of data management technology and infrastructure is required to govern the process of data provisioning, including procedures to extract, aggregate, normalise, curate, store, encrypt, entitle, publish and support data services. Second, data services require an invoicing process and a legal/contractual infrastructure, as well as a function that will continue to develop and innovate them.
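To make those provisioning steps concrete, the following is a minimal sketch of such a pipeline in Python. It is purely illustrative: every type, field and function name here is invented for the example and does not describe DTCC’s or any FMI’s actual systems.

```python
# Illustrative sketch of a provisioning pipeline: extract, normalise,
# aggregate (dropping participant identity), then publish. All names and
# record layouts are hypothetical.
from dataclasses import dataclass

@dataclass
class Trade:
    isin: str        # instrument identifier
    notional: float  # trade size
    party: str       # submitting participant; must never reach the output

def extract(raw_records: list[dict]) -> list[Trade]:
    """Pull trades out of raw submissions."""
    return [Trade(r["isin"], float(r["notional"]), r["party"]) for r in raw_records]

def normalise(trades: list[Trade]) -> list[Trade]:
    """Apply consistent conventions, e.g. upper-case identifiers."""
    return [Trade(t.isin.upper(), t.notional, t.party) for t in trades]

def aggregate(trades: list[Trade]) -> dict[str, float]:
    """Roll trades up to per-instrument totals, discarding party identity."""
    totals: dict[str, float] = {}
    for t in trades:
        totals[t.isin] = totals.get(t.isin, 0.0) + t.notional
    return totals

def publish(totals: dict[str, float]) -> None:
    """Stand-in for entitlement-checked delivery to subscribers."""
    for isin, notional in sorted(totals.items()):
        print(f"{isin}: {notional:,.0f}")

publish(aggregate(normalise(extract([
    {"isin": "us0000000001", "notional": 5_000_000, "party": "A"},
    {"isin": "us0000000001", "notional": 2_000_000, "party": "B"},
]))))
```

In a real deployment each stage (curation, encryption, entitlement) would be a substantial system in its own right; the point of the sketch is simply that hosting data and provisioning it are very different undertakings.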

The availability of transformative technologies, including artificial intelligence (AI), machine learning and cloud, is creating opportunities while lowering the cost and technical challenges of bringing new content services to market. Cloud makes it possible to store massive amounts of data in environments where AI can quickly execute regressions that discover patterns, outliers and relationships in large data sets. AI is, in turn, largely dependent on access to normalised and consistent historical data to support predictive models.
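As a toy illustration of the kind of pattern and outlier discovery described above (the series below is fabricated, and real models would be far richer), a simple regression over an aggregated daily-volume series can flag anomalous days:

```python
# Toy example: fit a linear trend to a fabricated daily-volume series and
# flag days whose residual exceeds two standard deviations.
import numpy as np

days = np.arange(10)
volumes = np.array([100, 103, 101, 105, 104, 230, 108, 107, 110, 111], dtype=float)

# np.polyfit returns coefficients from highest degree down: slope, intercept.
slope, intercept = np.polyfit(days, volumes, deg=1)
residuals = volumes - (slope * days + intercept)

threshold = 2 * residuals.std()
outliers = days[np.abs(residuals) > threshold]
print("outlier days:", outliers)  # flags day 5, the 230 spike
```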

Clearing and settlement infrastructures are natural aggregation points for the highest quality data available, and the transactions that flow through infrastructure services are essentially the historical record and barometer of capital markets. Combined with AI tools and the skills and imagination of data scientists, historical data provide the foundation of probability, prediction and, indeed, the next generation of alpha creation.

Where the data provider follows strict guidelines on governance and ensures that the data and its supporting infrastructure are of the requisite quality, the data it provides can help market participants make more informed decisions.

Ultimately, data services are built on a foundation of trust, and for data providers to be successful it is critical to establish and maintain that trust with all market participants. To preserve that trust, appropriate aggregation and anonymisation rules should be applied.

It is crucial that the proprietary investment or trading strategy of any entity is neither divulged nor inferred through reverse engineering of the published data. Capital markets is an industry built on data and predictions, and the value created for investors by financial services institutions is its core. An appropriate balance must therefore be struck between transparency and the protection of each firm’s proprietary work.
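One common safeguard against such reverse engineering is a minimum-count rule: an aggregate is published only if enough distinct participants contributed to it, so no single firm’s activity can be backed out. The sketch below shows the idea in hypothetical form; the threshold and data are invented and do not reflect any FMI’s actual policy.

```python
# Hypothetical minimum-count aggregation rule (a k-anonymity-style safeguard).
# A bucket is suppressed unless at least MIN_PARTIES distinct participants
# contributed, so no single firm's position can be inferred from the output.
from collections import defaultdict

MIN_PARTIES = 3  # illustrative threshold, not a regulatory figure

trades = [
    ("ISIN-1", "firm_a", 10.0),
    ("ISIN-1", "firm_b", 12.0),
    ("ISIN-1", "firm_c", 9.0),
    ("ISIN-2", "firm_a", 50.0),  # single contributor: must be suppressed
]

parties = defaultdict(set)
totals = defaultdict(float)
for isin, party, qty in trades:
    parties[isin].add(party)
    totals[isin] += qty

for isin in totals:
    if len(parties[isin]) >= MIN_PARTIES:
        print(isin, "total:", totals[isin])
    else:
        print(isin, "suppressed (too few contributors)")
```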

Post-crisis regulation and the focus on increased transparency through transaction and trade reporting have led to a surge in the volume of data available to market participants. As a result, FMIs have an increasingly important role in deriving value from the data assets that flow through their services and delivering that value to the market participants who use their infrastructures. Critical to their continued success is the ability to follow a best practice approach to data governance as well as to privacy and proprietary concerns.
