The knowledge platform for the financial technology industry

A-Team Insight Blogs

Data Processing Strides Pave The Way To More TCA Capability


Squarepoint Capital’s expansion of its use of Kx Systems technology for investment research capitalises on the provider’s strength at normalising data, tying together data sources and increasing the speed of its data processing — in a manner that sets the stage for more effective transaction cost analysis (TCA).

Squarepoint uses Kx Systems’ Kdb+ time-series database and recently increased its use of large in-memory resources through Kdb+, according to Fintan Quill, head of software engineering, North America, Kx Systems.

“Squarepoint wanted to access tons of historical market data, tying it in with their own internal order execution data for research purposes, to do TCA and back-testing — and discover various alpha patterns in the data,” says Quill.
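The core of the workflow Quill describes is joining internal execution records to historical market data and measuring execution quality against the prevailing market. A minimal sketch of that idea in Python/pandas (not Squarepoint’s or Kx’s actual pipeline; all column names and figures are illustrative) uses an as-of join to stamp each fill with the most recent quote, then computes slippage against the mid price:

```python
import pandas as pd

# Illustrative quote history (a stand-in for vendor market data).
quotes = pd.DataFrame({
    "time": pd.to_datetime(["09:30:00", "09:30:01", "09:30:02"]),
    "bid": [99.98, 100.00, 100.02],
    "ask": [100.02, 100.04, 100.06],
})
quotes["mid"] = (quotes["bid"] + quotes["ask"]) / 2

# Illustrative internal order execution records.
fills = pd.DataFrame({
    "time": pd.to_datetime(["09:30:00.5", "09:30:01.7"]),
    "side": ["buy", "buy"],
    "price": [100.03, 100.05],
})

# As-of join: each fill picks up the latest quote at or before its timestamp.
tca = pd.merge_asof(fills.sort_values("time"),
                    quotes.sort_values("time"), on="time")

# Slippage in basis points versus mid (positive = a buy paid above mid).
sign = tca["side"].map({"buy": 1, "sell": -1})
tca["slippage_bps"] = sign * (tca["price"] - tca["mid"]) / tca["mid"] * 1e4

print(tca[["time", "price", "mid", "slippage_bps"]])
```

In Kdb+/q the equivalent operation would typically be an `aj` (as-of join) over the tick database; the pandas version above is only meant to show the shape of the calculation.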

MiFID II, the Fundamental Review of the Trading Book (FRTB) and other new rules for trading surveillance are making TCA capability a necessity. Kdb+ also serves as an underlying database and analytics engine for Thomson Reuters’ Enterprise Platform For Velocity Analytics, drawing on Thomson Reuters’ market data, as well as other market data sources, says Quill. “With this solution we’re actually tying two data sources together in one place,” he says. “This makes TCA very fast and very applicable. … It can now do real-time surveillance and sniff out the algorithms’ [activity in trading].”

With different trading desks at a single firm often using different systems, normalising the resulting data and putting it into a simplified data model enables meaningful risk analysis and TCA surveillance, explains Quill.
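The normalisation step Quill mentions can be sketched as mapping each desk’s feed into one common schema. The example below is a hypothetical illustration in Python/pandas (the desk feeds, field names, and side codes are all assumptions, not any firm’s actual data model):

```python
import pandas as pd

# Two desks reporting the same kind of event with different field names.
desk_equities = pd.DataFrame({
    "ts": ["2018-03-01T10:00:00"], "sym": ["VOD.L"],
    "qty": [1000], "px": [210.5], "dir": ["B"],
})
desk_futures = pd.DataFrame({
    "trade_time": ["2018-03-01T10:00:01"], "contract": ["FGBL"],
    "lots": [5], "price": [158.43], "buy_sell": ["BUY"],
})

# The simplified common model every desk is mapped into.
COMMON = ["time", "instrument", "quantity", "price", "side"]

def normalise(df, column_map, side_map):
    """Rename desk-specific fields to the common schema and unify codes."""
    out = df.rename(columns=column_map)[COMMON].copy()
    out["time"] = pd.to_datetime(out["time"])
    out["side"] = out["side"].map(side_map)
    return out

trades = pd.concat([
    normalise(desk_equities,
              {"ts": "time", "sym": "instrument", "qty": "quantity",
               "px": "price", "dir": "side"},
              {"B": "buy", "S": "sell"}),
    normalise(desk_futures,
              {"trade_time": "time", "contract": "instrument",
               "lots": "quantity", "price": "price", "buy_sell": "side"},
              {"BUY": "buy", "SELL": "sell"}),
], ignore_index=True)

print(trades)
```

Once every desk’s activity sits in one table with one vocabulary, firm-wide TCA and surveillance queries become single queries rather than per-system reconciliations.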

Squarepoint’s increased use of Kdb+ fulfills its desire for faster access to data, according to Quill. “Now that memory has become a lot cheaper and a lot larger, people can share these environments,” he says. “You can have various quants all working on their own strategies and all hitting the same data set, but they don’t get in each other’s way.”
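The shared-environment point can be illustrated outside Kdb+ with memory-mapped files: data written once can be mapped read-only by many independent sessions, so concurrent readers see the same bytes without per-user copies or locking. A minimal sketch using NumPy (an analogy only; Kdb+ has its own shared-memory and on-disk mechanisms):

```python
import os
import tempfile
import numpy as np

# Write a large array to disk once (stand-in for a shared historical data set).
path = os.path.join(tempfile.mkdtemp(), "prices.npy")
np.save(path, np.arange(1_000_000, dtype=np.float64))

# Two independent read-only memory-mapped views, standing in for two
# researchers' sessions hitting the same data set concurrently.
view_a = np.load(path, mmap_mode="r")
view_b = np.load(path, mmap_mode="r")

# Both views read the same underlying bytes; neither can modify them.
print(view_a[123], view_b[123])
```

Because the views are read-only, one researcher’s queries cannot corrupt or block another’s, which is the property Quill describes.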

