
A-Team Insight Blogs

The Devil is in The Data: The Role of FMIs in Providing Data Services and the Importance of a Best Practice Approach

By Tim Lind, Managing Director, Data Services at DTCC

In recent years, it has been claimed that data has eclipsed oil as the world’s most valuable resource. Financial market infrastructures (FMIs) are on a constant search for ‘new oil’, and the value of data is certainly on the list of new services they are developing. Post-crisis reforms and the requirement for greater transparency in financial markets through new transaction and trade reporting rules have led to a much larger volume of data being captured within the financial system.

However, while there’s no shortage of data, what the industry really needs is insights. Therefore, the challenge for institutions collectively is to harness the millions of transactions that flow through their infrastructures and create actionable information that will enhance decision-making at all levels. FMIs play an important role in the provisioning of data for the industry and a best practice approach must be adopted to ensure standards of data quality, confidentiality and security are maintained.

FMIs are naturally intermediated in transaction flow and can play a critical role in capturing, aggregating and discovering value in the data assets that flow through their services, and in returning that value to the participants who use the infrastructure. This involves developing innovative and unique data products that provide insight into areas such as market risk, liquidity assessment, trade decision support, capital management models in preparation for constantly moving frameworks such as the upcoming Fundamental Review of the Trading Book (FRTB) requirements, and benchmark valuation and trade data to support alternatives to LIBOR as a benchmark rate.

Historical, aggregated and anonymised transaction data provide the baseline for quantitative and analytical models that consider patterns of liquidity and trade activity and can facilitate more effective decision making, which can lead to improved trading, asset allocation, price discovery, client service, collateral management and risk management. Due to the wide-ranging applications of historical transaction data, the opportunity to create value for the community of participants in the infrastructure is very significant. While transaction data are a historical record of what happened in capital markets, marrying them to other economic data can help develop predictive analytics, which is the holy grail of value-added data services.
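To make the idea concrete, the sketch below derives a simple daily liquidity indicator (trade count and turnover per instrument) from anonymised transaction records. It is an illustration only; the field layout and the choice of measure are assumptions, not a description of any FMI’s actual product.

    # Hypothetical sketch: a daily liquidity indicator built from anonymised
    # transaction records. No counterparty identifiers are retained.
    from collections import defaultdict
    from datetime import date

    transactions = [
        # (trade_date, instrument, quantity, price)
        (date(2023, 9, 1), "BOND-A", 1_000_000, 99.2),
        (date(2023, 9, 1), "BOND-A", 500_000, 99.3),
        (date(2023, 9, 1), "BOND-B", 250_000, 101.1),
    ]

    liquidity = defaultdict(lambda: {"trades": 0, "turnover": 0.0})
    for trade_date, instrument, qty, px in transactions:
        key = (trade_date, instrument)
        liquidity[key]["trades"] += 1
        liquidity[key]["turnover"] += qty * px / 100  # price quoted per 100 nominal

    for (d, inst), stats in sorted(liquidity.items()):
        print(f"{d} {inst}: {stats['trades']} trades, turnover {stats['turnover']:,.0f}")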

That said, while FMIs host data, this does not mean they are by default data providers. First, a significant amount of data management technology and infrastructure is required to govern the process of data provisioning, including procedures to extract, aggregate, normalise, curate, store, encrypt, entitle, publish and support data services. Second, data services require an invoicing process, a legal and contractual infrastructure, and a function that will continue to develop and innovate them.
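A minimal sketch of a few of these provisioning stages is shown below, purely to illustrate the kind of plumbing involved. The record structure, stage functions and entitlement placeholder are hypothetical, not a description of DTCC’s actual pipeline.

    # Illustrative only: normalise raw source records onto a common schema,
    # aggregate to instrument level (dropping counterparty identity), then
    # hand off to a publication step that would apply entitlements/encryption.
    from dataclasses import dataclass

    @dataclass
    class TradeRecord:
        instrument: str
        quantity: float
        price: float
        counterparty: str

    def normalise(raw: dict) -> TradeRecord:
        return TradeRecord(
            instrument=raw["isin"].strip().upper(),
            quantity=float(raw["qty"]),
            price=float(raw["px"]),
            counterparty=raw["cpty_id"],
        )

    def aggregate(records: list) -> dict:
        totals = {}
        for r in records:
            bucket = totals.setdefault(r.instrument, {"volume": 0.0, "notional": 0.0})
            bucket["volume"] += r.quantity
            bucket["notional"] += r.quantity * r.price
        return totals

    def publish(aggregates: dict, entitled_user: str) -> None:
        # Placeholder for entitlement checks, encryption at rest and distribution.
        print(f"Publishing {len(aggregates)} instrument-level records to {entitled_user}")

    raw_feed = [
        {"isin": "us0378331005", "qty": "100", "px": "180.5", "cpty_id": "A"},
        {"isin": "US0378331005", "qty": "250", "px": "180.7", "cpty_id": "B"},
    ]
    publish(aggregate([normalise(r) for r in raw_feed]), entitled_user="participant-123")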

The availability of transformative technologies, including artificial intelligence (AI), cloud and machine learning, is creating opportunities while lowering the cost and technical challenges of bringing new content services to market. Cloud makes it possible to store massive amounts of data in environments where AI can quickly execute regressions that discover patterns, outliers and relationships in large data sets. AI is largely dependent on access to normalised and consistent historical data to support predictive models.
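As a toy example of the kind of regression described here, the sketch below fits a linear relationship between two invented historical series and flags days whose residuals are unusually large. The series, threshold and variable names are assumptions made for the illustration.

    # Toy illustration: fit a linear relationship between two historical series
    # and flag residual outliers. Both series are synthetic.
    import numpy as np

    rng = np.random.default_rng(seed=0)
    settlement_volume = rng.normal(100.0, 10.0, size=250)                 # hypothetical daily volumes
    funding_spread = 0.02 * settlement_volume + rng.normal(0, 0.1, 250)   # hypothetical related series
    funding_spread[42] += 1.5                                             # inject an anomaly

    slope, intercept = np.polyfit(settlement_volume, funding_spread, deg=1)
    residuals = funding_spread - (slope * settlement_volume + intercept)
    outliers = np.where(np.abs(residuals) > 3 * residuals.std())[0]

    print(f"fitted relationship: spread ~ {slope:.3f} * volume + {intercept:.3f}")
    print(f"days flagged as outliers: {outliers.tolist()}")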

Clearing and settlement infrastructures are natural aggregation points for the highest quality data available, and the transactions that flow through infrastructure services are essentially the historical record and barometer of capital markets. Combined with AI tools and the skills and imagination of data scientists, historical data provides the foundation of probability, prediction and, indeed, the next generation of alpha creation.

Where a data provider follows strict guidelines on governance and ensures that the data and its supporting infrastructure are of the requisite quality, the data it provides can help market participants make more informed decisions.

Ultimately, data services are built on a foundation of trust, and for data providers to be successful it is critical to establish and maintain that trust with all market participants. To preserve that trust, appropriate aggregation and anonymity rules should be applied.

It is crucial that the proprietary investment or trading strategy of any entity is neither divulged nor capable of being inferred through reverse engineering of the published data. Capital markets is an industry based on data and predictions, and the value created for investors by financial services institutions is its core; an appropriate balance must therefore be struck between transparency and the protection of the proprietary work of any individual firm.
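One common way to express such aggregation and anonymity rules, offered here only as an illustrative convention rather than any FMI’s actual policy, is a minimum-participant threshold: an aggregate is published only if it pools activity from at least k distinct firms, so no single firm’s activity can be backed out.

    # Illustrative minimum-participant rule: suppress any aggregate built from
    # fewer than K_MIN distinct contributing firms. The threshold and record
    # layout are assumptions for the sketch.
    K_MIN = 5

    def publishable_aggregates(trades):
        """trades: iterable of (instrument, firm_id, notional). Returns
        instrument-level totals only where enough distinct firms contributed."""
        totals, contributors = {}, {}
        for instrument, firm_id, notional in trades:
            totals[instrument] = totals.get(instrument, 0.0) + notional
            contributors.setdefault(instrument, set()).add(firm_id)
        return {
            instrument: total
            for instrument, total in totals.items()
            if len(contributors[instrument]) >= K_MIN
        }

    sample = [("SWAP-X", f"firm-{i}", 1e6) for i in range(6)] + [("SWAP-Y", "firm-1", 5e6)]
    print(publishable_aggregates(sample))  # SWAP-Y is suppressed: only one contributor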

Post-crisis regulation and the focus on increased transparency through transaction and trade reporting have led to a surge in the volume of data available to market participants. As a result, FMIs have an increasingly important role in deriving value from the data assets that flow through their services and delivering that value to the market participants who use their infrastructures. Critical to their continued success is the ability to follow a best practice approach to data governance, as well as to privacy and proprietary issues.
