Data Processing Strides Pave The Way To More TCA Capability

Squarepoint Capital’s expansion of its use of Kx Systems technology for investment research capitalises on the provider’s strength at normalising data, tying together data sources and increasing the speed of its data processing — in a manner that sets the stage for more effective transaction cost analysis (TCA).

Squarepoint uses Kx Systems’ Kdb+ time-series database and has recently increased its use of large in-memory resources through Kdb+, according to Fintan Quill, head of software engineering, North America, at Kx Systems.

“Squarepoint wanted to access tons of historical market data, tying it in with their own internal order execution data for research purposes, to do TCA and back-testing — and discover various alpha patterns in the data,” says Quill.
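
As a minimal sketch of the kind of query that combination enables, the q snippet below uses Kdb+’s as-of join (aj) to attach the prevailing quote to each internal execution and derive a simple slippage measure. The table and column names are illustrative assumptions, not Squarepoint’s actual schema.

```q
/ hypothetical tables: internal executions and historical quotes (illustrative schema only)
exec:([] sym:`AAPL`AAPL`MSFT; time:09:30:01.000 09:31:15.500 09:30:45.250;
  side:`B`S`B; px:150.02 150.10 310.55; qty:100 200 50)
quote:([] sym:`AAPL`AAPL`AAPL`MSFT; time:09:30:00.000 09:31:00.000 09:31:15.000 09:30:40.000;
  bid:150.00 150.05 150.08 310.50; ask:150.03 150.08 150.11 310.56)

/ as-of join: for each execution, pick up the latest quote at or before its timestamp
tca:aj[`sym`time; exec; quote]

/ slippage against the quote midpoint: buys pay above mid, sells receive below it
tca:update mid:0.5*bid+ask from tca
tca:update slippage:?[side=`B; px-mid; mid-px] from tca
```

Back-testing and alpha research can then run as further queries over the same joined table.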

MiFID II, the Fundamental Review of the Trading Book (FRTB) and other new rules for trading surveillance are making TCA capability a necessity. Kdb+ also serves as an underlying database and analytics engine for Thomson Reuters’ Enterprise Platform for Velocity Analytics, drawing on Thomson Reuters’ market data, as well as other market data sources, says Quill. “With this solution we’re actually tying two data sources together in one place,” he says. “This makes TCA very fast and very applicable. … It can now do real-time surveillance and sniff out the algorithms’ [activity in trading].”
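
To give a flavour of how such a real-time check can be expressed, the sketch below follows the conventional Kdb+ subscriber pattern of an upd callback. The exec and quote tables carry over from the previous sketch, and the rule shown (flagging prints outside the quoted spread) is a deliberately simple assumption, not a confirmed detail of the Velocity Analytics solution.

```q
/ hypothetical real-time check: flag executions that print outside the prevailing spread
alerts:([] sym:`symbol$(); time:`time$(); px:`float$())

/ upd is the conventional callback a kdb+ subscriber exposes to the tickerplant
upd:{[t;x]
  if[t=`exec;
    x:aj[`sym`time; x; quote];   / attach the latest stored quote to each incoming execution
    `alerts insert select sym, time, px from x where (px>ask)|px<bid]}
```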

With different trading desks within a single firm often using different systems, normalising the resulting data and putting it into a simplified data model enables meaningful risk analysis and TCA surveillance, explains Quill.
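
A minimal sketch of that normalisation step follows, assuming two made-up desk feeds with differently named columns; a production data model would also have to reconcile identifiers, types and time zones.

```q
/ hypothetical feeds from two desks, each with its own schema (names are made up)
deskA:([] instrument:`AAPL`MSFT; ts:09:30:01.000 09:30:02.000; price:150.02 310.55; size:100 50)
deskB:([] ric:`VOD.L`BARC.L; tradeTime:09:30:03.000 09:30:04.000; px:72.5 151.2; qty:1000 400)

/ rename each feed's columns onto one simplified model, then stack the results
norm:{[t] `sym`time`px`qty xcol t}
orders:raze norm each (deskA; deskB)
```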

Squarepoint’s increased use of Kdb+ fulfils its desire for faster access to data, according to Quill. “Now that memory has become a lot cheaper and a lot larger, people can share these environments,” he says. “You can have various quants all working on their own strategies and all hitting the same data set, but they don’t get in each other’s way.”
