About A-Team Marketing Services
The knowledge platform for the financial technology industry

A-Team Insight Blogs

Chronicle Integrates Latency-Optimised Messaging Framework with KX’s kdb+

London-based low-latency technology specialist Chronicle Software has developed an off-the-shelf integration between its Chronicle Queue messaging framework and KX’s kdb+ high-performance database in partnership with consultancy firm AquaQ.

According to Chronicle CEO Peter Lawrey, over 80% of the top 100 banks globally use the company’s Enterprise or Open Source products. Many of these banks are also kdb+ users. The new offering provides a ready-made integration between the two products, something that firms have previously had to build and maintain themselves.

A common use case is where firms need to speed up kdb+ writes, says Lawrey. “They want to be able to firehose data to kdb+ and then have kdb+ pick up that data as quickly as it can. kdb+ processes data a lot quicker if it’s batched, rather than just taking a message at a time. Essentially, clients are using Queue as a big, high-performance buffer, so they can process their data quicker within kdb+”.
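The buffering pattern Lawrey describes can be sketched in plain Java. This is a hypothetical illustration using only the standard library's `ArrayBlockingQueue`, not Chronicle Queue's actual API (which persists messages to memory-mapped files) and not a real kdb+ client: producers firehose ticks into a buffer without blocking, and the database-side consumer drains them in batches so each write carries many rows instead of one.

```java
import java.util.ArrayList;
import java.util.List;
import java.util.concurrent.ArrayBlockingQueue;
import java.util.concurrent.BlockingQueue;

// Sketch of the high-performance buffer pattern: fast producers enqueue
// individual messages; a slower consumer drains them in batches so the
// downstream store (e.g. kdb+) receives bulk inserts rather than
// one-row-at-a-time writes.
public class BatchedWriter {
    private final BlockingQueue<String> buffer = new ArrayBlockingQueue<>(65536);
    private final int maxBatch;

    public BatchedWriter(int maxBatch) {
        this.maxBatch = maxBatch;
    }

    // Producer side: enqueue a tick without blocking the feed handler.
    public boolean offer(String tick) {
        return buffer.offer(tick);
    }

    // Consumer side: drain whatever has accumulated, up to maxBatch rows,
    // and hand the batch to the store as a single bulk insert.
    public int drainBatch(List<String> out) {
        return buffer.drainTo(out, maxBatch);
    }

    public static void main(String[] args) {
        BatchedWriter writer = new BatchedWriter(100);
        for (int i = 0; i < 250; i++) {
            writer.offer("tick-" + i);
        }

        int batches = 0;
        int rows = 0;
        List<String> batch = new ArrayList<>();
        while (true) {
            batch.clear();
            int n = writer.drainBatch(batch);
            if (n == 0) break;
            batches++;          // in practice: one kdb+ insert per batch
            rows += n;
        }
        System.out.println(batches + " batches, " + rows + " rows");
        // prints "3 batches, 250 rows"
    }
}
```

The 250 queued messages are consumed as three bulk batches (100, 100, 50) rather than 250 individual writes, which is the behaviour that lets kdb+ "process data a lot quicker" when it is batched.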

Lawrey explains that integrating with third-party platforms such as kdb+ is one of three main focus areas in the company’s 2021 growth strategy. “We recognise that there is always work to be done by clients when integrating with third-party products. So part of our strategy this year is to provide some pre-built connectors for the most common third-party products we see our clients using.” Potential upcoming integration projects include the Spring Boot framework, Eclipse’s Vert.x toolkit and Apache’s Camel open-source integration framework, Lawrey says.

Another area of focus is in helping firms break down their monolithic applications into microservices, for easier migration to the cloud. “Firms are using Queue to implement their microservices infrastructure, because it’s a very low latency, persistent messaging framework,” he says. “One of the problems with microservices is the overhead in breaking up a monolith into multiple services. We keep that overhead very low and improve performance by breaking up the microservices into much more manageable and tunable chunks.”

The third focus area for Chronicle this year is visualisation. “What we do can be pretty abstract,” says Lawrey. “For example, we care about latency, and often firms can’t see visually how fast their systems are running. Adding visualisation tools will allow our clients to construct and manage latency monitoring services across multiple machines via a GUI, without having to write code.”
