
Data Processing Strides Pave The Way To More TCA Capability

Squarepoint Capital’s expanded use of Kx Systems technology for investment research capitalises on the provider’s strengths in normalising data, tying together data sources and speeding up data processing, in a manner that sets the stage for more effective transaction cost analysis (TCA).

Squarepoint uses Kx Systems’ Kdb+ time-series database and recently increased its consumption of large memory resources through Kdb+, according to Fintan Quill, head of software engineering, North America, at Kx Systems.

“Squarepoint wanted to access tons of historical market data, tying it in with their own internal order execution data for research purposes, to do TCA and back-testing — and discover various alpha patterns in the data,” says Quill.
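
The workflow Quill describes, joining a firm’s own executions to the prevailing market data and measuring the cost of each fill, is the core of TCA and back-testing. A minimal Python sketch of the idea, with invented column names and toy data, using pandas’ merge_asof as a stand-in for the as-of joins Kdb+ performs natively:

```python
# Hypothetical sketch of the analysis described above: joining internal
# executions to historical market data, then measuring slippage per fill.
# Column names and values are invented, not Squarepoint's or Kx's schema.
import pandas as pd

# Internal order execution records (one row per fill)
fills = pd.DataFrame({
    "time":  pd.to_datetime(["09:30:00.120", "09:30:01.480", "09:30:03.250"]),
    "sym":   ["AAPL", "AAPL", "MSFT"],
    "side":  ["buy", "sell", "buy"],
    "price": [180.21, 180.14, 402.55],
    "qty":   [100, 200, 50],
})

# Historical quotes from a market data source
quotes = pd.DataFrame({
    "time": pd.to_datetime(["09:29:59.900", "09:30:01.000", "09:30:03.000"]),
    "sym":  ["AAPL", "AAPL", "MSFT"],
    "bid":  [180.10, 180.12, 402.40],
    "ask":  [180.20, 180.22, 402.60],
})

# As-of join: attach the prevailing quote to each fill
tca = pd.merge_asof(fills.sort_values("time"), quotes.sort_values("time"),
                    on="time", by="sym")

# Slippage versus the quote midpoint at execution time
tca["mid"] = (tca["bid"] + tca["ask"]) / 2
sign = tca["side"].map({"buy": 1, "sell": -1})
tca["slippage"] = sign * (tca["price"] - tca["mid"])
print(tca[["time", "sym", "side", "price", "mid", "slippage"]])
```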

MiFID II, the Fundamental Review of the Trading Book (FRTB) and other new rules for trade surveillance are making TCA capability a necessity. Kdb+ also serves as the underlying database and analytics engine for Thomson Reuters’ Enterprise Platform for Velocity Analytics, drawing on Thomson Reuters’ market data as well as other market data sources, says Quill. “With this solution we’re actually tying two data sources together in one place,” he says. “This makes TCA very fast and very applicable. … It can now do real-time surveillance and sniff out the algorithms’ [activity in trading].”
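
On the surveillance point, one crude signature of algorithmic activity is a burst of messages in a short window. A hedged sketch of that kind of check, with an assumed threshold and invented fields rather than anything from Kx’s product:

```python
# Illustrative surveillance check, not Kx's product logic: flag symbols
# whose per-second message rate exceeds a threshold. Data and threshold
# are assumptions for the sake of the example.
import pandas as pd

orders = pd.DataFrame({
    "time": pd.to_datetime(["09:30:00.05", "09:30:00.10", "09:30:00.15",
                            "09:30:00.20", "09:30:02.00"]),
    "sym":  ["AAPL", "AAPL", "AAPL", "AAPL", "MSFT"],
})

# Messages per symbol per second
rate = (orders.set_index("time")
              .groupby("sym")
              .resample("1s")
              .size())

THRESHOLD = 3  # alert when a symbol sends more than 3 messages in a second
alerts = rate[rate > THRESHOLD]
print(alerts)  # here: AAPL, with 4 messages in the 09:30:00 bucket
```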

With different trading desks at a single firm often using different systems, normalising the resulting data and putting it into a simplified data model enables meaningful risk analysis and TCA surveillance, explains Quill.
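
A minimal sketch of that normalisation step, assuming two invented desk schemas, maps each system’s fields onto one simplified model before any analysis runs:

```python
# Hedged illustration of normalising order records from two desks that
# use different systems into one simplified model; all field names and
# conventions here are invented for the example.
import pandas as pd

# Desk A's system export
desk_a = pd.DataFrame({
    "ts":        pd.to_datetime(["09:30:00"]),
    "ticker":    ["AAPL"],
    "px":        [180.21],
    "shares":    [100],
    "direction": ["B"],
})

# Desk B's system uses different names and conventions
desk_b = pd.DataFrame({
    "timestamp": pd.to_datetime(["09:31:00"]),
    "symbol":    ["MSFT"],
    "price":     [402.55],
    "qty":       [50],
    "side":      ["buy"],
})

COMMON = ["time", "sym", "price", "qty", "side"]

norm_a = desk_a.rename(columns={"ts": "time", "ticker": "sym", "px": "price",
                                "shares": "qty", "direction": "side"})
norm_a["side"] = norm_a["side"].map({"B": "buy", "S": "sell"})
norm_b = desk_b.rename(columns={"timestamp": "time", "symbol": "sym"})

# One simplified data model over which risk analysis and TCA can run
orders = pd.concat([norm_a[COMMON], norm_b[COMMON]], ignore_index=True)
print(orders)
```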

Squarepoint’s increased use of Kdb+ fulfils its desire for faster access to data, according to Quill. “Now that memory has become a lot cheaper and a lot larger, people can share these environments,” he says. “You can have various quants all working on their own strategies and all hitting the same data set, but they don’t get in each other’s way.”
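
The sharing Quill describes depends on read-only access to a single copy of the data. The sketch below, an illustration of the general mechanism rather than Kx’s implementation, memory-maps one file so several research processes can read it without copying it or locking each other out:

```python
# Illustration of shared read-only data (the general mechanism, not Kx's
# implementation): many processes memory-map one file, the OS shares the
# underlying pages, and readers never block each other. The file name and
# contents are hypothetical.
import numpy as np

PATH = "prices.npy"
np.save(PATH, np.random.default_rng(0).normal(100, 1, 1_000_000))

# Each quant process opens its own read-only memory map of the same file;
# physical memory is paid for once, not once per user.
prices = np.load(PATH, mmap_mode="r")
print(prices[:5].mean())  # reads touch only the pages they need
```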
