
QuantHouse Boosts Backbone Capacity to 100G Using Arista

Iress’s QuantHouse market data and infrastructure subsidiary has upgraded its data centre backbone connections globally to 100 Gigabit capacity to handle the rigours of ongoing market volatility. The upgrade is part of a wider infrastructure enhancement aimed at three business objectives: improving performance, strengthening resilience, and adding customer self-service capabilities.

As part of the upgrades, QuantHouse has deployed Arista Networks’ low-latency platform solution, which supports a significant increase in capacity without compromising real-time network services, such as time-stamping and time-source synchronisation. Arista acquired financial network performance specialist Metamako three years ago to boost its offerings in this segment.

The QuantHouse initiative – aimed at delivering wider bandwidth and enhanced performance for clients – builds on an earlier infrastructure process automation programme, launched in April to address growing message volumes. According to Emmanuel Carjat, Chief Operating Officer at QuantHouse, improving the performance of the QuantHouse fabric will allow the company to add new capabilities that address security and risk, as well as changes in market volatility.

At the core of the enhancements is the 100 Gigabit data centre backbone upgrade, which reduces the risk of packet loss and ensures that the QuantHouse infrastructure continues to meet the significant and ongoing increase in bandwidth requirements from exchanges. Says Carjat: “In order to ensure that this new deployment is fully automated, we have leveraged Arista AVD. Using a mix of in-house DevOps technologies and Arista’s Ansible playbooks, we are able to provision new services with minimal human involvement, thereby accelerating the time to market of our customers and reducing human errors.”
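
Neither QuantHouse nor Arista publishes the playbooks behind that workflow, but as a rough, hypothetical sketch of how AVD-driven provisioning might be wired into in-house tooling, the Python snippet below uses the open-source ansible-runner package to invoke an assumed AVD-based playbook. The playbook name, inventory path and service variables are placeholders for illustration, not QuantHouse’s actual setup.

```python
# Minimal sketch, not QuantHouse's actual tooling: hand per-service
# variables to an AVD-based Ansible playbook via ansible-runner
# (pip install ansible-runner). The playbook and inventory referenced
# here are assumed to exist and to use the arista.avd collection.
import pathlib
import ansible_runner


def provision_market_data_service(customer: str, vlan_id: int, bandwidth_gbps: int) -> bool:
    """Run the (hypothetical) provisioning playbook for one customer service."""
    run_dir = pathlib.Path("/opt/automation/runs") / customer
    run_dir.mkdir(parents=True, exist_ok=True)

    # ansible_runner.run() blocks until the playbook finishes and returns a
    # Runner object; .rc carries the Ansible exit code (0 means success).
    result = ansible_runner.run(
        private_data_dir=str(run_dir),
        playbook="provision_service.yml",
        inventory="/opt/automation/inventory/fabric.yml",
        extravars={
            "service_customer": customer,
            "service_vlan": vlan_id,
            "service_bandwidth_gbps": bandwidth_gbps,
        },
    )
    return result.rc == 0


if __name__ == "__main__":
    ok = provision_market_data_service("example-client", vlan_id=2101, bandwidth_gbps=10)
    print("provisioned" if ok else "playbook failed")
```

Keeping the service definition in structured variables like this is what allows the same pipeline to be triggered from a customer-facing portal without manual switch configuration.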

Deploying the Arista Networks platform improves latency for customers, and by implementing a highly scalable EVPN/VXLAN design, QuantHouse is able to deliver new services more quickly. Mark Foss, Senior Vice President of Global Operations and Marketing at Arista Networks, says: “The Arista EOS and CloudVision set of APIs and automation framework provide the attributes to deliver the ‘Customer Self-Service’ solution provided by QuantHouse. The network automation framework enables QuantHouse engineers to out-task standard maintenance and client implementation processes.”

Foss adds that the key attributes of the Arista platform provide the foundation for supporting QuantHouse’s infrastructure process automation initiative, allowing the company to deliver superior performance and resilience while growing its client base.

“Introducing increased levels of automation enables QuantHouse to rapidly add resources to a number of their in-house processes with minimal human intervention,” he says, “using widely adopted and secure cloud native technologies that deliver automated deployment and monitoring tasks. This infrastructure process automation initiative also ensures that QuantHouse is well positioned to expand easily into new markets that trade 24×7.”
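
The article does not detail how those maintenance and implementation tasks are out-tasked, but as a loose illustration of driving CloudVision from an in-house script, the sketch below uses the open-source cvprac client for the CloudVision Portal REST API. The CVP host, credentials and the idea of a routine inventory sweep are assumptions made for the example rather than details from QuantHouse.

```python
# Minimal sketch, assuming the open-source `cvprac` client for Arista
# CloudVision Portal's REST API (pip install cvprac). The CVP host and
# credentials below are placeholders, not QuantHouse's environment.
from cvprac.cvp_client import CvpClient


def nightly_inventory_sweep() -> None:
    """Connect to CloudVision and report the managed fabric inventory,
    the kind of routine check an engineer might out-task to automation."""
    client = CvpClient()
    # connect() takes a list of CVP nodes plus credentials.
    client.connect(["cvp.example.internal"], "automation-svc", "placeholder-password")

    # get_inventory() returns a list of dicts describing managed switches;
    # field names may vary by CVP release, hence the .get() fallbacks.
    for device in client.api.get_inventory():
        hostname = device.get("hostname", "<unknown>")
        model = device.get("modelName", "<unknown>")
        print(f"{hostname}: {model}")


if __name__ == "__main__":
    nightly_inventory_sweep()
```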
