
The Devil is in The Data: The Role of FMIs in Providing Data Services and the Importance of a Best Practice Approach


By Tim Lind, Managing Director, Data Services at DTCC

In recent years, it has been claimed that data has eclipsed oil as the world’s most valuable resource. Financial market infrastructures (FMIs) are on a constant search for ‘new oil’, and the value of data is certainly on the list of new services they are developing. Post-crisis reforms and the requirement for greater transparency in financial markets, driven by new transaction and trade reporting rules, have led to a much larger volume of data being captured within the financial system.

However, while there’s no shortage of data, what the industry really needs is insight. The challenge for institutions collectively is therefore to harness the millions of transactions that flow through their infrastructures and create actionable information that enhances decision-making at all levels. FMIs play an important role in the provisioning of data for the industry, and a best practice approach must be adopted to ensure standards of data quality, confidentiality and security are maintained.

FMIs are natural intermediaries in transaction flow and can play a critical role in capturing, aggregating and discovering value in the data assets that flow through their services, and in returning that value to the participants who use the infrastructure. This involves developing innovative and unique data products that provide insight into areas such as market risk, liquidity assessment, trade decision support, capital management models in preparation for constantly moving frameworks – such as the upcoming Fundamental Review of the Trading Book (FRTB) requirements – and benchmark valuation and trade data to support alternatives to LIBOR as a benchmark rate.

Historical, aggregated and anonymised transaction data provide the baseline for quantitative and analytical models that consider patterns of liquidity and trade activity. These models can facilitate more effective decision making, leading to improved trading, asset allocation, price discovery, client service, collateral management and risk management. Given the wide-ranging applications of historical transaction data, the opportunity to create value for the community of participants in the infrastructure is very significant. While transaction data are a historical record of what happened in capital markets, marrying them to other economic data can help develop predictive analytics – the holy grail of value-added data services.
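By way of illustration only, the sketch below shows what such a baseline might look like in practice: anonymised transaction records rolled up into a per-instrument daily liquidity profile that a quantitative model could consume. The field names and figures are invented for the example and do not describe any particular FMI’s data model.

```python
import pandas as pd

# Illustrative anonymised transaction records; field names are assumptions.
trades = pd.DataFrame({
    "trade_date": ["2019-11-01", "2019-11-01", "2019-11-01", "2019-11-04"],
    "instrument": ["XS001", "XS001", "XS002", "XS001"],
    "notional":   [5_000_000, 2_000_000, 10_000_000, 1_000_000],
})

# Aggregate to a per-instrument daily liquidity profile: total notional and
# trade count are typical inputs to liquidity and pricing models.
baseline = (
    trades.groupby(["trade_date", "instrument"])
          .agg(total_notional=("notional", "sum"),
               trade_count=("notional", "count"))
          .reset_index()
)
print(baseline)
```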

That said, while FMIs host data, this does not mean they are data providers by default. First, a significant amount of data management technology and infrastructure is required to govern the process of data provisioning, including procedures to extract, aggregate, normalise, curate, store, encrypt, entitle, publish and support data services. Second, data services require an invoicing process and a legal/contractual infrastructure, as well as a function that will continue to develop and innovate them.
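To make the scale of that machinery concrete, the sketch below pictures provisioning as a sequence of governed stages. The stage functions are hypothetical placeholders – each would be backed by substantial storage, encryption and entitlement infrastructure in a real service – and the sketch simply shows how discrete, auditable steps compose into a pipeline.

```python
from typing import Callable, Iterable

# Hypothetical placeholder stages; a real provisioning platform would back
# each one with substantial systems of its own.
def extract(records):   return records   # pull raw records from source systems
def aggregate(records): return records   # roll transactions up into buckets
def normalise(records): return records   # map to a common data model
def encrypt(records):   return records   # protect data at rest and in transit
def entitle(records):   return records   # apply subscriber permissions
def publish(records):
    print(f"published {len(records)} records")
    return records

PIPELINE: Iterable[Callable] = (extract, aggregate, normalise, encrypt, entitle, publish)

def provision(raw_records):
    data = raw_records
    for stage in PIPELINE:
        data = stage(data)  # each stage is a discrete, auditable step
    return data

provision([{"instrument": "XS001", "notional": 5_000_000}])
```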

The availability of transformative technologies, including artificial intelligence (AI), cloud and machine learning, is creating opportunities while lowering the cost and technical challenges of bringing new content services to market. Cloud makes it possible to store massive amounts of data in environments where AI can quickly run regressions that discover patterns, outliers and relationships in large data sets. AI is, in turn, largely dependent on access to normalised and consistent historical data to support predictive models.
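As a toy illustration of the kind of regression such an environment supports, the snippet below fits a linear trend to a small series of daily volumes and flags outliers by residual size. The figures are invented for the example; a real run would operate over normalised historical data at far greater scale.

```python
import numpy as np

# Invented daily volumes with a trend and one anomalous spike on day 6.
days = np.arange(10)
volumes = 100 + 5 * days + np.array([0, 1, -2, 0, 3, -1, 40, 0, -2, 1])

# Ordinary least squares trend line.
slope, intercept = np.polyfit(days, volumes, 1)
residuals = volumes - (slope * days + intercept)

# Flag points whose residual exceeds two standard deviations as outliers.
outliers = days[np.abs(residuals) > 2 * residuals.std()]
print("trend slope:", round(slope, 2), "outlier days:", outliers)
```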

Clearing and settlement infrastructures are natural aggregation points for the highest quality data available, and the transactions that flow through infrastructure services are essentially the historical record and barometer of capital markets. Combined with AI tools and the skills and imagination of data scientists, historical data provide the foundation of probability, prediction and, indeed, the next generation of alpha creation.

Where the data provider follows strict guidelines on governance and ensures that the data and its supporting infrastructure are of the requisite quality, the data it provides can help market participants make more informed decisions.

Ultimately, data services are built on a foundation of trust, and for data providers to be successful it is critical to establish and maintain that trust with all market participants. To preserve that trust, appropriate aggregation and anonymity rules should be applied.
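One common pattern for such rules – sketched here purely as an illustration of practice in general, not as any particular provider’s methodology – is to suppress any aggregate bucket backed by too few distinct participants, so that no single firm’s activity can be inferred from the published figures.

```python
# Minimal sketch of an aggregation/anonymity rule: publish a bucket only when
# it is backed by at least MIN_PARTICIPANTS distinct firms.
MIN_PARTICIPANTS = 3  # illustrative threshold, not a regulatory figure

buckets = {
    # (date, instrument) -> set of anonymised participant ids
    ("2019-11-01", "XS001"): {"p1", "p2", "p3", "p4"},
    ("2019-11-01", "XS002"): {"p1", "p2"},  # too concentrated to publish
}

for key, participants in buckets.items():
    status = "publish" if len(participants) >= MIN_PARTICIPANTS else "suppress"
    print(key, "->", status)
```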

It is crucial that the proprietary investment or trading strategy of any entity is not divulged, or inferred through reverse engineering of the published data. Capital markets is an industry based on data and predictions, and the value created for investors by financial services institutions is its core; an appropriate balance must therefore be struck between transparency and the protection of the proprietary work of any individual firm.

Post-crisis regulation and the focus on increased transparency through transaction and trade reporting have led to a surge in the volume of data available to market participants. As a result, FMIs have an increasingly important role in deriving value from the data assets that flow through their services and delivering that value to the market participants who use the infrastructures. Critical to their continued success is the ability to follow a best practice approach to data governance, as well as to privacy and proprietary issues.

