The knowledge platform for the financial technology industry

A-Team Insight Blogs

QuantHouse Offers Historical Data on-Demand to Algo Traders


QuantHouse has released Historical Data on-Demand, a service designed to speed up the research, development and back-testing phase of any trading strategy, and allow clients to implement new trading ideas within days rather than weeks or months.

The company is offering up to 10 years of historical data on-demand for the US, European and Asia-Pacific markets. Access is provided via a web portal, so clients can search for the data they need and purchase it online using the web browser of their choice. Purchased historical datasets are delivered as flat files, ready for immediate integration into any system without the need to integrate an API. Historical data can be replayed over prior time periods, with the results refined and adjusted to optimise trading performance.
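The flat-file replay workflow described above can be sketched in a few lines. The schema below is purely illustrative — the article does not document QuantHouse's actual file layout — so we assume a simple CSV of daily bars and replay it chronologically through a basic moving-average signal, the kind of strategy logic a research team might back-test against such a file.

```python
import csv
import io

# Hypothetical flat-file layout: the real QuantHouse schema is not
# described in the article, so we assume simple daily bars in CSV form.
FLAT_FILE = """date,symbol,open,high,low,close
2023-01-02,ACME,100.0,101.5,99.5,101.0
2023-01-03,ACME,101.0,102.0,100.5,101.8
2023-01-04,ACME,101.8,103.0,101.0,102.5
2023-01-05,ACME,102.5,102.8,100.9,101.2
2023-01-06,ACME,101.2,101.9,100.0,100.4
"""

def replay_bars(flat_file_text):
    """Replay a flat-file dataset in chronological order, bar by bar."""
    reader = csv.DictReader(io.StringIO(flat_file_text))
    for row in reader:
        yield row["date"], float(row["close"])

def moving_average_signals(bars, window=3):
    """Emit (date, signal) pairs: 'long' when close is above the
    rolling mean of the last `window` closes, else 'flat'."""
    closes, signals = [], []
    for date, close in bars:
        closes.append(close)
        if len(closes) >= window:
            mean = sum(closes[-window:]) / window
            signals.append((date, "long" if close > mean else "flat"))
    return signals

signals = moving_average_signals(replay_bars(FLAT_FILE))
```

Because the files are delivered without an API dependency, the same replay loop works whether the source is a local download or a shared research store.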

While the time taken to complete the research, development and back-testing cycle of a trade can push execution beyond optimal timings, QuantHouse says Historical Data on-Demand will enable research and development teams to rapidly test new and existing trading strategies, and to detect potential losses or degradation of those strategies within days rather than weeks.
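The degradation check mentioned above amounts to comparing a strategy's back-tested performance across time windows. A minimal sketch, with entirely made-up returns and an illustrative threshold (neither comes from QuantHouse), might flag a strategy whose recent average return has fallen well below its earlier level:

```python
# Hypothetical degradation check: compare a strategy's per-period
# returns over an early and a recent back-test window. The threshold
# and the return series below are illustrative assumptions only.

def mean_return(returns):
    return sum(returns) / len(returns)

def is_degrading(early_returns, recent_returns, tolerance=0.5):
    """Flag the strategy when the recent average return has fallen to
    less than `tolerance` times the early average return."""
    early = mean_return(early_returns)
    recent = mean_return(recent_returns)
    if early <= 0:
        return recent < early  # already unprofitable; flag further decline
    return recent < tolerance * early

# Daily strategy returns from two replayed historical windows (made up).
early = [0.004, 0.006, 0.005, 0.007, 0.003]
recent = [0.001, 0.002, -0.001, 0.002, 0.000]

print(is_degrading(early, recent))  # → True
```

Running such a comparison each time a fresh historical window is replayed is one way a team could spot degradation "within days, not weeks", as the article puts it.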

Stephane Leroy, chief revenue officer and co-founder of QuantHouse, explains: “The trading landscape has changed significantly in the past few years, it is no longer about how fast your trades are sent, but how quickly your trading strategy can be ready. To move away from speed trading to smart trading, you need access to trusted, reliable and consistent data on-demand, so that you can spot changes and emerging patterns in the market quickly and evaluate and adjust your trading strategy accordingly. Our Historical Data on-Demand service gives clients an advantage by moving them into a much more real-time environment.”


Related content

WEBINAR

Upcoming Webinar: Navigating the Build vs Buy Dilemma: Cloud Strategies for Accelerating Quantitative Research

Date: 20 May 2026 Time: 10:00am ET / 3:00pm London / 4:00pm CET Duration: 50 minutes For many quantitative trading firms and asset managers, building a self-provisioned historical market data environment remains one of the most time-consuming and resource-intensive steps in establishing a new research capability. Sourcing data, normalising symbologies, handling corporate actions and maintaining...

BLOG

LSEG and Bank of America Target AI-ready, Governed Data Integration in Multi-Year Partnership

London Stock Exchange Group (LSEG) and Bank of America have agreed a multi-year strategic partnership centred on embedding governed, AI-ready data and analytics directly into the bank’s core workflows. Rather than a distribution agreement focused on access, the collaboration reflects a broader architectural shift: integrating unified, rights-cleared content, analytics and risk intelligence across advisory, trading,...

EVENT

Data Management Summit New York City

Now in its 15th year, the Data Management Summit NYC brings together the North American data management community to explore how data strategy is evolving to drive business outcomes and speed to market in changing times.

GUIDE

The Reference Data Utility Handbook

The potential of a reference data utility model has been discussed for many years, and while early implementations failed to gain traction, the model has now come of age as financial institutions look for new data management models that can solve the challenges of operational cost reduction, improved data quality and regulatory compliance. The multi-tenanted...