The knowledge platform for the financial technology industry

A-Team Insight Blogs

DTCC Data Services Aggregates Back-Office Data to Provide Front-Office Market Insight


DTCC Data Services is tapping the vast quantities of post-trade data that flow through DTCC on a daily basis to turn back-office data into front-office data products offering market insight into areas such as liquidity, risk, momentum, sentiment and correlation. Early users of the historical data products include hedge funds and quant traders looking for alternative sources of information, although other market participants are expected to follow.

Tim Lind, managing director of DTCC Data Services, says: “The back office has been about automation and digitalisation of the trade lifecycle and its business use case has been based on cost and better client service. Until more recently, firms didn’t really value back-office data beyond operational insights. Now, the by-product of digitalising more post-trade processes is that we have a historical record of every trade in the back office. These records can be aggregated and anonymised to provide insights for the front and middle office, and the back office becomes an important partner to the front office.”

The data products resulting from digitalisation fall under DTCC’s Kinetics brand of trade activity-based data solutions that include Equity Kinetics, which enables users to view, track and analyse aggregated US equities trade volumes over time and in the context of broader market activity, and CDS Kinetics, which uses position-data records from DTCC’s Trade Information Warehouse to deliver enriched data sets to subscribers.

Lind notes that ‘each asset class is on its own journey and has its own mechanisms’, and suggests further use cases of historical data in the front office such as the valuation of fixed income products, and the provision of empirical data in money markets following migration from LIBOR rates. He comments: “The future of US interest rate indexation will be primarily based on empirical trade data that flows through DTCC.”

The historical data includes security identifiers, volume information and volume weighted average price (VWAP) analytics, although Lind draws the line at analytics, saying: “We present raw data and users employ their proprietary magic to create real insights. The inclusion of VWAP is an illustration of what can be done with the data.”
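VWAP itself is a straightforward calculation: total traded value divided by total traded volume. As a minimal sketch, assuming trades are available as simple (price, volume) pairs (an illustrative format, not DTCC's actual schema):

```python
def vwap(trades):
    """Return the volume-weighted average price for (price, volume) pairs."""
    total_volume = sum(volume for _, volume in trades)
    if total_volume == 0:
        raise ValueError("no volume traded")
    # Sum of price * volume gives total traded value.
    return sum(price * volume for price, volume in trades) / total_volume

# Hypothetical example: three trades in one security.
trades = [(100.0, 200), (101.0, 300), (99.5, 500)]
print(round(vwap(trades), 2))  # 100.05
```

As the article notes, such analytics are offered only as an illustration of what the raw trade records support; subscribers would typically compute their own measures downstream.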

The data is normalised, anonymised and aggregated, and provides context to answer questions such as ‘what is the activity of a particular security?’. It is of high quality as it is sourced from bona fide transactions that have been matched and moved to clearing and settlement, and is delivered as an initial historical bulk file that is added to on a daily basis. It is offered in FTP files or through APIs, and users can also access particular attributes from the daily files. Pricing is based on DTCC Data Services’ fee-based business model.
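The aggregation step described above can be sketched in a few lines: summing trade volumes per security and date drops the counterparty dimension, which is also what anonymises the records. Field names and identifiers here are illustrative assumptions, not DTCC's actual schema:

```python
from collections import defaultdict

# Hypothetical raw post-trade records, each tied to a counterparty.
trades = [
    {"isin": "US0378331005", "date": "2024-05-01", "volume": 1200, "party": "A"},
    {"isin": "US0378331005", "date": "2024-05-01", "volume": 800,  "party": "B"},
    {"isin": "US5949181045", "date": "2024-05-01", "volume": 500,  "party": "A"},
]

# Aggregate volume per (security, date); counterparty identity is dropped,
# so the output is anonymised activity data.
daily_activity = defaultdict(int)
for trade in trades:
    daily_activity[(trade["isin"], trade["date"])] += trade["volume"]

print(daily_activity[("US0378331005", "2024-05-01")])  # 2000
```

A subscriber receiving files of this aggregated form can answer "what is the activity of a particular security?" without any single firm's trades being identifiable.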

Lind concludes: “This is all about our trade observations. We want to explore and expand in different asset classes. For example, exchange traded funds – ETFs – and secondary trading on ETFs are big for us, but the growth of ETFs has caused challenges around pricing. We want to provide more transparency here.” The company is also planning to release a data product providing transparency around repurchase agreements (repos) collateralised with US Treasuries before the end of the year.

