DTCC Data Services Aggregates Back-Office Data to Provide Front-Office Market Insight

DTCC Data Services is tapping the vast quantities of post-trade data that flow through DTCC every day to deliver data products that turn back-office records into front-office market insight in areas such as liquidity, risk, momentum, sentiment and correlation. Early users of the historical data products include hedge funds and quant traders looking for alternative sources of information, although other market participants are expected to follow.

Tim Lind, managing director of DTCC Data Services, says: “The back office has been about automation and digitalisation of the trade lifecycle and its business use case has been based on cost and better client service. Until more recently, firms didn’t really value back-office data beyond operational insights. Now, the by-product of digitalising more post-trade processes is that we have a historical record of every trade in the back office. These records can be aggregated and anonymised to provide insights for the front and middle office, and the back office becomes an important partner to the front office.”

The data products resulting from digitalisation fall under DTCC’s Kinetics brand of trade activity-based data solutions. These include Equity Kinetics, which enables users to view, track and analyse aggregated US equities trade volumes over time and in the context of broader market activity, and CDS Kinetics, which uses position data records from DTCC’s Trade Information Warehouse to deliver enriched data sets to subscribers.
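As a rough illustration of the kind of aggregation a product like Equity Kinetics exposes, the sketch below rolls anonymised trade records up to daily per-security volumes and sets them against total market activity. The record layout and values here are invented for the example, not DTCC’s actual schema.

```python
# Hypothetical sketch: aggregate anonymised trade records into daily volume
# per security and express each as a share of total market volume.
from collections import defaultdict

trades = [
    # (trade_date, security_id, shares) -- invented records
    ("2023-09-01", "US0378331005", 1_200),
    ("2023-09-01", "US0378331005", 800),
    ("2023-09-01", "US5949181045", 3_000),
]

daily_volume = defaultdict(int)   # (date, security) -> aggregated shares
market_volume = defaultdict(int)  # date -> total shares across all securities

for date, security, shares in trades:
    daily_volume[(date, security)] += shares
    market_volume[date] += shares

for (date, security), volume in sorted(daily_volume.items()):
    share = volume / market_volume[date]
    print(f"{date} {security}: {volume:,} shares ({share:.1%} of market)")
```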

Lind notes that ‘each asset class is on its own journey and has its own mechanisms’, and suggests further front-office use cases for the historical data, such as the valuation of fixed income products and the provision of empirical data in money markets following the migration away from LIBOR. He comments: “The future of US interest rate indexation will be primarily based on empirical trade data that flows through DTCC.”
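To make the idea of empirical rate indexation concrete: benchmark rates such as SOFR, which draws in part on cleared Treasury repo transactions that pass through DTCC’s FICC subsidiary, are published as a volume-weighted median of transacted rates. The sketch below shows that calculation on invented figures; it illustrates the technique, not DTCC’s or the New York Fed’s production methodology.

```python
# Volume-weighted median of transacted rates, the statistic behind
# empirical benchmarks such as SOFR. Figures below are invented.

def volume_weighted_median(transactions):
    """transactions: list of (rate, volume) pairs."""
    ordered = sorted(transactions)               # sort by rate
    midpoint = sum(v for _, v in ordered) / 2    # half of total volume
    cumulative = 0
    for rate, volume in ordered:
        cumulative += volume
        if cumulative >= midpoint:
            return rate  # rate at which half the volume has transacted

repo_trades = [(5.30, 400e9), (5.31, 350e9), (5.32, 250e9)]  # (rate %, $ volume)
print(volume_weighted_median(repo_trades))  # 5.31
```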

The historical data includes security identifiers, volume information and volume-weighted average price (VWAP) analytics, although Lind draws the line at deeper analytics, saying: “We present raw data and users employ their proprietary magic to create real insights. The inclusion of VWAP is an illustration of what can be done with the data.”
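VWAP itself is a simple derived analytic: total traded value divided by total traded volume over a window. A minimal sketch, using invented trades:

```python
# VWAP = sum(price * volume) / sum(volume) over a set of trades.

def vwap(trades):
    """trades: list of (price, volume) pairs for one security over one period."""
    total_value = sum(price * volume for price, volume in trades)
    total_volume = sum(volume for _, volume in trades)
    return total_value / total_volume

trades = [(100.00, 500), (100.10, 1_500), (99.95, 1_000)]  # invented
print(f"VWAP: {vwap(trades):.4f}")  # VWAP: 100.0333
```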

The data is normalised, anonymised and aggregated, and provides context to answer questions such as ‘what is the activity of a particular security?’. It is of high quality, being sourced from bona fide transactions that have been matched and moved to clearing and settlement, and is delivered as an initial historical lump sum that is then added to daily. It is offered as FTP files or through APIs, and users can also select particular attributes from the daily files. Pricing follows DTCC Data Services’ fee-based business model.
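The consumption pattern described here, an initial historical lump sum topped up by daily files from which users pull selected attributes, might look something like the following. File names, paths and column names are assumptions for illustration, not DTCC’s actual delivery format.

```python
# Hypothetical sketch: load a directory of daily delivery files and keep
# only the attributes the user cares about.
import csv
from pathlib import Path

WANTED = ("trade_date", "security_id", "total_volume", "vwap")  # assumed columns

def load_daily_files(directory):
    rows = []
    for path in sorted(Path(directory).glob("kinetics_*.csv")):  # assumed naming
        with path.open(newline="") as f:
            for record in csv.DictReader(f):
                rows.append({k: record[k] for k in WANTED})
    return rows

history = load_daily_files("data/")  # lump-sum history plus daily additions
```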

Lind concludes: “This is all about our trade observations. We want to explore and expand in different asset classes. For example, exchange traded funds – ETFs – and secondary trading on ETFs are big for us, but the growth of ETFs has caused challenges around pricing. We want to provide more transparency here.” The company is also planning to release a data product providing transparency around repurchase agreements (repos) collateralised by US Treasury securities before the end of the year.
