About a-team Marketing Services
The knowledge platform for the financial technology industry

A-Team Insight Blogs

Data Processing Strides Pave The Way To More TCA Capability

Squarepoint Capital’s expansion of its use of Kx Systems technology for investment research capitalises on the provider’s strength at normalising data, tying together data sources and increasing the speed of its data processing — in a manner that sets the stage for more effective transaction cost analysis (TCA).

Squarepoint uses Kx Systems’ Kdb+ time-series database and recently increased its consumption of large in-memory resources through Kdb+, according to Fintan Quill, head of software engineering, North America, Kx Systems.

“Squarepoint wanted to access tons of historical market data, tying it in with their own internal order execution data for research purposes, to do TCA and back-testing — and discover various alpha patterns in the data,” says Quill.

MiFID II, the Fundamental Review of the Trading Book (FRTB) and other new rules for trading surveillance are making TCA capability a necessity. Kdb+ also serves as an underlying database and analytics engine for Thomson Reuters’ Enterprise Platform For Velocity Analytics, drawing on Thomson Reuters’ market data, as well as other market data sources, says Quill. “With this solution we’re actually tying two data sources together in one place,” he says. “This makes TCA very fast and very applicable. … It can now do real-time surveillance and sniff out the algorithms’ [activity in trading].”

With different trading desks within a single firm often using different systems, normalising the resulting data and putting it into a simplified data model enables meaningful risk analysis and TCA surveillance, explains Quill.
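The normalisation step Quill mentions amounts to mapping each desk's feed onto one shared schema. A minimal sketch, with entirely hypothetical desk feeds, column names, and unit conventions:

```python
import pandas as pd

# Hypothetical raw fills from two desks, with different column
# names and units (price in dollars vs. price in cents).
equities = pd.DataFrame({"Symbol": ["AAPL"], "Qty": [100], "Price": [100.04]})
futures = pd.DataFrame({"contract": ["ESH6"], "lots": [5], "px_cents": [501250]})

def normalise(df, mapping, price_scale=1.0):
    """Rename columns to the shared model and bring prices to dollars."""
    out = df.rename(columns=mapping)[["sym", "qty", "price"]].copy()
    out["price"] = out["price"] * price_scale
    return out

# One simplified data model across desks, ready for risk and TCA queries.
common = pd.concat([
    normalise(equities, {"Symbol": "sym", "Qty": "qty", "Price": "price"}),
    normalise(futures, {"contract": "sym", "lots": "qty", "px_cents": "price"},
              price_scale=0.01),
], ignore_index=True)
```

Once every desk's records land in the same columns and units, a single query can span all of them, which is what makes firm-wide surveillance and TCA tractable.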

Squarepoint’s increased use of Kdb+ fulfills its desire for faster access to data, according to Quill. “Now that memory has become a lot cheaper and a lot larger, people can share these environments,” he says. “You can have various quants all working on their own strategies and all hitting the same data set, but they don’t get in each other’s way.”

Subscribe to our newsletter

Related content

WEBINAR

Upcoming Webinar: From Data to Alpha: AI Strategies for Taming Unstructured Data

Date: 16 April 2026 | Time: 9:00am ET / 2:00pm London / 3:00pm CET | Duration: 50 minutes. Unstructured data and text now account for the majority of information flowing through financial markets organisations, spanning research content, corporate disclosures, communications, alternative data, and internal documents. While AI has created new opportunities to extract signals, many firms are...

BLOG

Inside the Uneven Geography of AML Enforcement Outcomes in 2025 – Fenergo Analysis

Fenergo’s latest global enforcement analysis shows total AML, KYC, sanctions and customer due diligence penalties declining to $3.8 billion in 2025, down from $4.6 billion in 2024 and $6.6 billion in 2023, marking a second consecutive year of decline. Beneath that headline, regional outcomes moved in sharply different directions. North American fines fell by 58%,...

EVENT

Eagle Alpha Alternative Data Conference, Spring, New York, hosted by A-Team Group

Now in its 9th year, the Eagle Alpha Alternative Data Conference, managed by A-Team Group, is the premier content forum and networking event for investment firms and hedge funds.

GUIDE

Preparing For Primetime – How to Benefit from the Global LEI

They say time flies when you’re enjoying yourself, and so it seems the industry has been having a blast with its preparations for the introduction of the global legal entity identifier (LEI) next month. But now it’s time to get serious. To date, much of the industry debate has centred on the identifier itself: its...