
StreamBase Lowers Latency, Speeds Up Development


With the release of version 7 of its complex event processing (CEP) offering, StreamBase Systems is taking a swipe at latency while addressing usability to make creation of applications faster and easy enough for a business user to tackle.

Overall performance has apparently been improved by 20%, together with the introduction of fine-grained control over that age-old trade-off: reducing latency at the expense of throughput, and vice versa. That’s important for low-latency applications, where proprietary code is often written to get a performance edge over packaged applications and code generators.
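The knob itself is easy to picture: batching events before they leave an operator amortises per-call overhead and lifts throughput, while flushing each event immediately keeps its waiting time to a minimum. A minimal Java sketch of that trade-off, using an invented flushThreshold parameter rather than any actual StreamBase setting:

```java
import java.util.ArrayList;
import java.util.List;
import java.util.function.Consumer;

// Illustrative only: a hypothetical event batcher showing how one knob
// (flushThreshold) trades latency against throughput. Not StreamBase code.
public final class EventBatcher<T> {
    private final int flushThreshold;           // 1 = lowest latency, larger = higher throughput
    private final Consumer<List<T>> downstream; // whatever consumes the batch
    private final List<T> buffer = new ArrayList<>();

    public EventBatcher(int flushThreshold, Consumer<List<T>> downstream) {
        this.flushThreshold = flushThreshold;
        this.downstream = downstream;
    }

    public void onEvent(T event) {
        buffer.add(event);
        if (buffer.size() >= flushThreshold) {
            flush();                            // small threshold: event leaves almost immediately
        }
    }

    public void flush() {
        if (!buffer.isEmpty()) {
            downstream.accept(new ArrayList<>(buffer)); // one downstream call per batch amortises overhead
            buffer.clear();
        }
    }

    public static void main(String[] args) {
        EventBatcher<String> lowLatency =
                new EventBatcher<>(1, batch -> System.out.println("sent " + batch));
        EventBatcher<String> highThroughput =
                new EventBatcher<>(64, batch -> System.out.println("sent " + batch.size() + " events"));

        lowLatency.onEvent("tick-1");                                     // flushed immediately
        for (int i = 0; i < 64; i++) highThroughput.onEvent("tick-" + i); // flushed once, as a batch
    }
}
```

Setting the threshold to 1 is the low-latency extreme; raising it buys throughput at the cost of each event sitting in the buffer a little longer.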

There have also been “microsecond-level” optimisations on the connectivity side, as well as a new FIX protocol handler (currently in beta) that has benchmarked at 50 to 75 microseconds for sending messages, depending on size. That performance is some 75% faster than other FIX engines, says the company, though it will continue to support alternatives and does not plan to compete head on in that space. That said, it knows it is on to a good thing, and is pricing its FIX handler at a subscription of $50K per year, a considerable premium to the other connectivity handlers that it offers.
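Benchmark figures of that kind are something a desk can verify in-house: wrap the send call in a nanosecond timer and look at the latency distribution rather than the average. A hedged sketch of such a harness, with FixSender standing in as a hypothetical interface rather than StreamBase’s actual handler API:

```java
import java.util.Arrays;

// A minimal micro-benchmark sketch for a FIX send path. FixSender is a
// hypothetical stand-in, not StreamBase's handler API.
public final class FixSendBenchmark {

    interface FixSender {
        void send(byte[] rawFixMessage);
    }

    public static void main(String[] args) {
        // Stand-in sender: in a real test this would be the engine under measurement.
        FixSender sender = msg -> { /* hand off to the wire */ };

        // Illustrative FIX 4.4 NewOrderSingle fragment, with SOH delimiters.
        byte[] newOrderSingle = "8=FIX.4.4|35=D|55=EUR/USD|54=1|38=1000000|40=2|44=1.0850|"
                .replace('|', '\u0001').getBytes();

        int iterations = 100_000;
        long[] samples = new long[iterations];

        for (int i = 0; i < iterations; i++) {
            long start = System.nanoTime();
            sender.send(newOrderSingle);
            samples[i] = System.nanoTime() - start;   // per-message send latency in nanoseconds
        }

        Arrays.sort(samples);
        System.out.printf("median %.1f us, 99th pct %.1f us%n",
                samples[iterations / 2] / 1_000.0,
                samples[(int) (iterations * 0.99)] / 1_000.0);
    }
}
```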

And speaking of connectivity, the new version has added 22 handlers, including for messaging platforms from Informatica (29West), NYSE Technologies (everyone still calls it Wombat), Solace Systems and Tervela. And responding to the increase in algo trading outside of equities, it’s also added 10 handlers for FX market sources, including alternative trading systems and pricing services from individual banks.

The rapid deployment – and ongoing maintenance – of applications has also been targeted, with an overhaul of the StreamBase Studio visual development environment. Officially, the new interface “dramatically increases the speed and quality of application innovation,” while StreamBase CEO Mark Palmer – citing Thomas Edison no less – puts it as: “I make more mistakes than anyone else I know, and sooner or later, I patent them.” His point is that it makes it easy to prototype new trading ideas, back-test them, and tweak them to perfection.

Without getting into the nitty gritty of the new development interface, it essentially allows for the graphical layout of code modules and the connections between components, with straightforward configuration of technical options, allowing non-technical staff to “read the code.” As such, it’s likely to be popular among those designing algorithmic trading, order routing and pre-trade risk functionality.
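The graphical model maps onto the familiar dataflow idea: operators wired into a pipeline, each box doing one job. A rough Java sketch of that shape, with the operators and record types invented for illustration rather than taken from StreamBase:

```java
import java.util.List;
import java.util.function.Consumer;
import java.util.function.Function;
import java.util.function.Predicate;

// Illustrative dataflow pipeline: filter -> enrich -> sink, wired like boxes
// on a canvas. Names are invented for the sketch, not StreamBase operators.
public final class DataflowSketch {

    record Quote(String symbol, double bid, double ask) {}
    record PricedQuote(String symbol, double mid) {}

    public static void main(String[] args) {
        Predicate<Quote> filter = q -> q.ask() > q.bid();                  // drop crossed quotes
        Function<Quote, PricedQuote> enrich =
                q -> new PricedQuote(q.symbol(), (q.bid() + q.ask()) / 2); // compute a mid price
        Consumer<PricedQuote> sink = p -> System.out.println(p);           // e.g. feed an algo or risk check

        List<Quote> feed = List.of(
                new Quote("EUR/USD", 1.0848, 1.0850),
                new Quote("GBP/USD", 1.2702, 1.2700));                     // crossed, filtered out

        feed.stream().filter(filter).map(enrich).forEach(sink);
    }
}
```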

In fact, one customer – SunGard – is using StreamBase to add pre-trade controls to its Valdi trade management system. “The usability of StreamBase Studio helped us develop a new Valdi product; we used it to describe requirements collaboratively with IT, and because we were working with the actual code, we made better architectural decisions early that made for a better first release,” says Chris Lees, vice president of SunGard’s global trading business.
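Pre-trade controls of the sort SunGard describes typically boil down to a short list of checks applied before an order is released: size limits, price collars and so on. A hedged sketch of that logic, with the limits and field names invented for illustration rather than drawn from Valdi:

```java
// Illustrative pre-trade risk check: reject orders that breach a size limit
// or stray outside a price collar. Limits are invented for the sketch.
public final class PreTradeCheck {

    record Order(String symbol, long quantity, double limitPrice) {}

    private static final long MAX_ORDER_QTY = 100_000;
    private static final double MAX_PRICE_DEVIATION = 0.05; // 5% collar around a reference price

    static boolean accept(Order order, double referencePrice) {
        if (order.quantity() > MAX_ORDER_QTY) {
            return false;                                    // fat-finger size check
        }
        double deviation = Math.abs(order.limitPrice() - referencePrice) / referencePrice;
        return deviation <= MAX_PRICE_DEVIATION;             // price collar check
    }

    public static void main(String[] args) {
        double reference = 100.0;
        System.out.println(accept(new Order("ABC", 500, 101.0), reference));     // true
        System.out.println(accept(new Order("ABC", 500_000, 100.0), reference)); // false: too large
        System.out.println(accept(new Order("ABC", 500, 120.0), reference));     // false: outside collar
    }
}
```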
