About a-team Marketing Services
The knowledge platform for the financial technology industry

A-Team Insight Blogs

LiquidityBook Moves Infrastructure into the AWS Cloud

LiquidityBook has joined the Amazon Web Services (AWS) community, having completed the migration of its Software-as-a-Service (SaaS) buy- and sell-side trading solutions to the cloud provider’s global data centres. As a result, the company has points of presence in AWS regions in the US and Europe, and the ability to scale up globally across Europe, the US, Asia-Pacific and LatAm as client needs arise.

LiquidityBook started using AWS for some infrastructure components when it moved to a fully SaaS-based model with the 2013 release of its next-generation LBX suite. Earlier this year, it began a project to move to a full Infrastructure-as-a-Service (IaaS) model, migrating its entire infrastructure to the cloud. The company’s solutions include order management, portfolio management, execution management, FIX network connectivity, compliance, and pre- and post-trade processing.

The decision to move to AWS was driven in part by growth in client wins and by the need to spin up additional data centres in response to regional client demand for LiquidityBook services.

The move also delivers technical benefits. LiquidityBook chief architect Andy Carroll, who was brought on earlier this year to lead the AWS migration effort, says: “We were an early adopter of the web for both the front- and back-end of our platform for multiple reasons – simplicity, extensibility, flexibility and scalability to name a few. Amazon has been a fantastic partner for us since we developed our next-gen platform, and we’re happy to have moved our infrastructure to it to create a resilient data centre mesh globally.”
