About a-team Marketing Services
The knowledge platform for the financial technology industry

A-Team Insight Blogs

Data Processing Strides Pave The Way To More TCA Capability


Squarepoint Capital’s expansion of its use of Kx Systems technology for investment research capitalises on the provider’s strength at normalising data, tying together data sources and increasing the speed of its data processing — in a manner that sets the stage for more effective transaction cost analysis (TCA).

Squarepoint uses Kx Systems’ Kdb+ time-series database and recently increased the large-memory resources it consumes through Kdb+, according to Fintan Quill, head of software engineering, North America, at Kx Systems.

“Squarepoint wanted to access tons of historical market data, tying it in with their own internal order execution data for research purposes, to do TCA and back-testing — and discover various alpha patterns in the data,” says Quill.

MiFID II, the Fundamental Review of the Trading Book (FRTB) and other new rules for trading surveillance are making TCA capability a necessity. Kdb+ also serves as an underlying database and analytics engine for Thomson Reuters’ Enterprise Platform For Velocity Analytics, drawing on Thomson Reuters’ market data, as well as other market data sources, says Quill. “With this solution we’re actually tying two data sources together in one place,” he says. “This makes TCA very fast and very applicable. … It can now do real-time surveillance and sniff out the algorithms’ [activity in trading].”

With different trading desks at a single firm often using different systems, normalising the resulting data and putting it into a simplified data model is what enables meaningful risk analysis and TCA surveillance, explains Quill.
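That normalisation step can be sketched as a thin mapping layer that folds each desk's schema into one common model before any risk or TCA query runs. The desk feeds, field names and canonical schema below are assumptions made up for illustration:

```python
import pandas as pd

# Hypothetical feeds from two desks, each with its own field names.
desk_a = pd.DataFrame({"timestamp": ["2024-01-02T09:30:00"],
                       "ticker": ["AAPL"], "px": [189.55], "shares": [100]})
desk_b = pd.DataFrame({"exec_time": ["2024-01-02T09:31:00"],
                       "symbol": ["MSFT"], "price": [370.10], "quantity": [200]})

# One simplified data model for all desks (illustrative choice of fields).
CANONICAL = ["time", "sym", "price", "qty"]

def normalise(df, mapping):
    """Rename a desk's columns onto the canonical schema and type the timestamp."""
    out = df.rename(columns=mapping)[CANONICAL].copy()
    out["time"] = pd.to_datetime(out["time"])
    return out

# Each desk contributes its own column mapping; downstream analytics
# then query a single, uniform table.
orders = pd.concat([
    normalise(desk_a, {"timestamp": "time", "ticker": "sym",
                       "px": "price", "shares": "qty"}),
    normalise(desk_b, {"exec_time": "time", "symbol": "sym",
                       "quantity": "qty"}),
], ignore_index=True)
```

Once every desk's output lands in the same table with the same types, a single TCA or surveillance query can run across all of them without per-system special cases.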

Squarepoint’s increased use of Kdb+ fulfills its desire for faster access to data, according to Quill. “Now that memory has become a lot cheaper and a lot larger, people can share these environments,” he says. “You can have various quants all working on their own strategies and all hitting the same data set, but they don’t get in each other’s way.”

