
TBricks Taps QuantLINK and QuantFEED for Hosted Trading Service

Trading systems vendor TBricks has launched a managed, hosted variant of its automated trading software, partnering with S&P Capital IQ for market connectivity and data. The service, TBricks OnDemand, has been rolled out in Stockholm, London, Frankfurt and Chicago.

TBricks provides tools for manual and low-latency automated trading in global markets, covering instruments such as options and ETFs, and supports a range of strategies including high frequency trading (HFT). To date, its offering has been deployed as enterprise software in a customer or co-location data centre, running on Intel-based servers with Oracle's Solaris operating system and Berkeley DB database.

With TBricks OnDemand, customers do not need to deploy any software or build the hardware platform and market connectivity on which to run it. Instead, the TBricks functionality is delivered over a network connection, possibly from within the same co-location data centre.
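To make the distinction between the two deployment models concrete, here is a minimal, hypothetical sketch. The endpoint names, port and Python structure are illustrative assumptions, not TBricks' actual API or deployment details:

```python
from dataclasses import dataclass

@dataclass
class TradingSystemEndpoint:
    """Where a client connects to reach the trading system.

    Purely illustrative: the article does not describe TBricks' real
    connection details, so host names and ports here are assumptions.
    """
    host: str
    port: int
    managed_by_vendor: bool  # True for a hosted, OnDemand-style service

# Enterprise model: the firm deploys and operates the software itself,
# typically on its own servers in its data centre or co-location facility.
on_premise = TradingSystemEndpoint("tbricks.internal.example", 9000, managed_by_vendor=False)

# Hosted model: the vendor runs the stack; the firm only needs a network
# connection to the service, possibly within the same co-location facility.
hosted_on_demand = TradingSystemEndpoint("ondemand.hosting.example", 9000, managed_by_vendor=True)

for ep in (on_premise, hosted_on_demand):
    print(f"{ep.host}:{ep.port} (vendor-managed: {ep.managed_by_vendor})")
```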

TBricks is providing market connectivity and data through its relationship with S&P Capital IQ. Specifically, it is using QuantLINK for trading access to global markets, and QuantFEED for access to normalised market data. Both products have been offered by S&P Capital IQ since its acquisition of QuantHouse in early 2012.
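The point of a normalised feed is that the trading system consumes a single message format regardless of venue, with the feed handler absorbing each market's native schema. The sketch below is purely illustrative of that idea; the venue field names and the normalise() helper are assumptions for this example, not QuantFEED's actual schema or API:

```python
import json
from dataclasses import dataclass

@dataclass
class NormalisedQuote:
    """A venue-independent top-of-book update (illustrative common schema)."""
    venue: str
    symbol: str
    bid: float
    ask: float
    timestamp_us: int  # microseconds since epoch

def normalise(raw: dict) -> NormalisedQuote:
    """Map a venue-specific raw message onto the common schema.

    The per-venue field names are invented for illustration; a commercial
    feed handler performs this mapping upstream so that downstream
    strategies only ever see one consistent format.
    """
    if raw["src"] == "XETRA":
        return NormalisedQuote("XETRA", raw["isin"], raw["bidPx"], raw["askPx"], raw["ts"])
    if raw["src"] == "CME":
        return NormalisedQuote("CME", raw["sym"], raw["bp"], raw["ap"], raw["t_us"])
    raise ValueError(f"unknown venue: {raw['src']}")

# Two differently shaped raw messages from two venues...
raw_messages = [
    '{"src": "XETRA", "isin": "DE0007100000", "bidPx": 61.20, "askPx": 61.22, "ts": 1369999999000000}',
    '{"src": "CME", "sym": "ESM3", "bp": 1650.25, "ap": 1650.50, "t_us": 1369999999000500}',
]

# ...both reach the strategy in the same normalised form.
for msg in raw_messages:
    print(normalise(json.loads(msg)))
```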

As trading firms increasingly look to make use of hosted, managed services, the providers of those services will in turn seek to tie up with established connectivity and market data providers, both to bring their own services to market more rapidly and to lower ongoing operating costs.
