About A-Team Marketing Services
The knowledge platform for the financial technology industry

A-Team Insight Blogs

Chronicle Integrates Latency-Optimised Messaging Framework with KX’s kdb+


London-based low-latency technology specialist Chronicle Software has developed an off-the-shelf integration between its Chronicle Queue messaging framework and KX’s kdb+ high-performance database in partnership with consultancy firm AquaQ.

According to Chronicle CEO Peter Lawrey, over 80% of the top 100 banks globally use the company’s Enterprise or Open Source products. Many of these banks are also kdb+ users. The new offering will provide users with a ready-made integration between the two products, something that firms have previously had to build and maintain themselves.

A common use case is where firms need to speed up kdb+ writes, says Lawrey. “They want to be able to firehose data to kdb+ and then have kdb+ pick up that data as quickly as it can. kdb+ processes data a lot quicker if it’s batched, rather than just taking a message at a time. Essentially, clients are using Queue as a big, high-performance buffer, so they can process their data quicker within kdb+”.
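The buffering pattern Lawrey describes can be sketched in a few lines. This is an illustrative sketch only, not Chronicle’s or KX’s API: all class and method names here are hypothetical, and the point is simply that producers append at full speed while a consumer drains fixed-size batches for bulk ingestion downstream.

```python
import time
from collections import deque

class BatchingBuffer:
    """Hypothetical sketch of the pattern described above: messages are
    appended at full speed, then drained in batches so the downstream
    store (kdb+ in the article) can ingest them efficiently."""

    def __init__(self, max_batch=1000):
        self.max_batch = max_batch
        self._queue = deque()

    def append(self, message):
        # Producers "firehose" messages here with no per-message DB cost.
        self._queue.append(message)

    def drain_batch(self):
        # The consumer pulls up to max_batch messages in one go; a single
        # bulk insert amortises per-call overhead on the database side.
        batch = []
        while self._queue and len(batch) < self.max_batch:
            batch.append(self._queue.popleft())
        return batch

buf = BatchingBuffer(max_batch=3)
for i in range(7):
    buf.append({"seq": i, "ts": time.time()})

sizes = []
while True:
    batch = buf.drain_batch()
    if not batch:
        break
    sizes.append(len(batch))

print(sizes)  # → [3, 3, 1]
```

In the real integration, the append side would be a Chronicle Queue appender and the drain side a bulk write into kdb+, but the batching trade-off is the same: fewer, larger writes in exchange for a small buffering delay.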

Lawrey explains that integrating with third-party platforms such as kdb+ is one of three main focus areas in the company’s 2021 growth strategy. “We recognise that there is always work to be done by clients when integrating with third-party products. So part of our strategy this year is to provide some pre-built connectors for the most common third-party products we see our clients using.” Potential upcoming integration projects include the Spring Boot framework, Eclipse’s Vert.x toolkit and Apache’s Camel open-source integration framework, Lawrey says.

Another area of focus is helping firms break down their monolithic applications into microservices for easier migration to the cloud. “Firms are using Queue to implement their microservices infrastructure, because it’s a very low latency, persistent messaging framework,” he says. “One of the problems with microservices is the overhead in breaking up a monolith into multiple services. We keep that overhead very low and improve performance by breaking up the microservices into much more manageable and tunable chunks.”
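The persistent-messaging style described above can be illustrated with a toy append-only log. This is a hypothetical sketch, not Chronicle Queue’s actual API: one service appends events to a durable file, another tails it from its own position, so services stay decoupled and any consumer can replay from any point.

```python
import json
import os
import tempfile

class AppendOnlyLog:
    """Toy persistent log illustrating the messaging style described
    above (hypothetical; not Chronicle Queue's API)."""

    def __init__(self, path):
        self.path = path

    def append(self, event):
        # Writers append; nothing is ever overwritten, so history is replayable.
        with open(self.path, "a") as f:
            f.write(json.dumps(event) + "\n")

    def tail(self, from_index=0):
        # Each reader keeps its own position, like a queue tailer, so
        # multiple consumers can read at their own pace.
        with open(self.path) as f:
            for i, line in enumerate(f):
                if i >= from_index:
                    yield i + 1, json.loads(line)

path = os.path.join(tempfile.mkdtemp(), "orders.log")
log = AppendOnlyLog(path)
log.append({"type": "new_order", "id": 1})
log.append({"type": "fill", "id": 1})

position = 0
events = []
for position, event in log.tail(position):
    events.append(event["type"])
print(events)  # → ['new_order', 'fill']
```

Because every message is persisted, a consumer that restarts simply resumes tailing from its last saved position — which is part of why a persistent queue keeps the overhead of splitting a monolith into services low.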

The third focus area for Chronicle this year is visualisation. “What we do can be pretty abstract,” says Lawrey. “For example, we care about latency, and often firms can’t see visually how fast their systems are running. Adding visualisation tools will allow our clients to construct and manage latency monitoring services across multiple machines via a GUI, without having to write code.”
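As a rough illustration of what such latency monitoring visualises: trading systems typically track percentiles rather than averages, because rare slow outliers (the 99th percentile and beyond) are what hurt. The sketch below, using a simple nearest-rank percentile, is illustrative only and is not Chronicle’s monitoring tooling.

```python
def percentile(samples, p):
    """Nearest-rank percentile of a list of latency samples."""
    s = sorted(samples)
    # Clamp the rank into valid index range.
    k = max(0, min(len(s) - 1, round(p / 100 * len(s)) - 1))
    return s[k]

# Hypothetical per-message latencies in microseconds; note how two
# outliers barely move the median but dominate the tail.
latencies_us = [12, 15, 11, 14, 250, 13, 12, 16, 13, 900]
print("p50:", percentile(latencies_us, 50))  # → p50: 13
print("p99:", percentile(latencies_us, 99))  # → p99: 900
```

A monitoring GUI of the kind Lawrey describes would compute figures like these continuously per machine and plot them, so operators can spot tail-latency regressions without writing code.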
