The knowledge platform for the financial technology industry

A-Team Insight Blogs

BMLL Moves into Historical Level 1 and Level 2 Data, Building on Level 3 Granularity


Historical market data and analytics provider BMLL, a specialist in the provision of Level 3 data, is now offering historical Level 1 and Level 2 data and analytics, built from the ground up on its extensive Level 3 datasets.

Level 3 data comprises the full order book, and includes every individual message sent to and received from the venue. Level 2 data is less granular, aggregating the order book at each price level. Level 1 is narrower still, aggregating volumes at the best bid and offer (i.e. ‘top of book’). Levels 1 and 2 also include matched trades.

BMLL is now drawing upon its Level 3 data engineering expertise, working with the most granular data available, captured at packet level with zero loss, to build its new offerings using a distinctive ‘bottom up’ method. This contrasts sharply with vendors that take a ‘top down’ approach, basing their historical Level 1 and Level 2 products on data captured at those levels for their real-time services. By their nature, such feeds are less granular and more prone to dropped packets, resulting in gaps and flaws in the historical record.
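To make the ‘bottom up’ idea concrete, here is a minimal illustrative sketch (not BMLL’s implementation) of how per-order Level 3 messages can be replayed into an order book and then aggregated into Level 2 and Level 1 views. The message format (`add`/`cancel` dicts with hypothetical fields) is an assumption for illustration only:

```python
from collections import defaultdict

def replay_level3(messages):
    """Replay Level 3 (per-order) messages into a resting-order book.

    Each message is a hypothetical dict, e.g.:
      {"type": "add", "id": 1, "side": "bid", "price": 99.0, "qty": 100}
      {"type": "cancel", "id": 1}
    """
    book = {}  # order_id -> (side, price, qty)
    for msg in messages:
        if msg["type"] == "add":
            book[msg["id"]] = (msg["side"], msg["price"], msg["qty"])
        elif msg["type"] == "cancel":
            book.pop(msg["id"], None)
    return book

def to_level2(book):
    """Aggregate resting orders by price level: the Level 2 view."""
    levels = {"bid": defaultdict(int), "ask": defaultdict(int)}
    for side, price, qty in book.values():
        levels[side][price] += qty
    return levels

def to_level1(levels):
    """Reduce Level 2 to best bid and offer: the Level 1 ('top of book') view."""
    best_bid = max(levels["bid"]) if levels["bid"] else None
    best_ask = min(levels["ask"]) if levels["ask"] else None
    return {
        "bid": (best_bid, levels["bid"][best_bid]) if best_bid is not None else None,
        "ask": (best_ask, levels["ask"][best_ask]) if best_ask is not None else None,
    }

msgs = [
    {"type": "add", "id": 1, "side": "bid", "price": 99.0, "qty": 100},
    {"type": "add", "id": 2, "side": "bid", "price": 99.0, "qty": 50},
    {"type": "add", "id": 3, "side": "ask", "price": 100.0, "qty": 75},
    {"type": "cancel", "id": 2},
]
l1 = to_level1(to_level2(replay_level3(msgs)))
# l1 == {"bid": (99.0, 100), "ask": (100.0, 75)}
```

Because the Level 1 and Level 2 views are derived from the full message stream rather than sampled snapshots, any gap in the underlying capture is immediately visible, which is the crux of the quality argument made below.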

“What we do is complex, it’s not something you can build overnight,” BMLL’s CEO Paul Humphrey tells TradingTech Insight. “We take the raw packet capture from nearly 100 venues around the world; every message, every cancellation, every correction, every trade, every order ID, every time stamp, you name it, we’ve collected it in its raw form. And when you think about it, that data contains every trading intention since the middle of last decade. We then perform the heavy lifting by stitching it all together, nanosecond by nanosecond, without losing any individual field from any exchange. And then on top of that, we build the type of analytics that you can only construct with full depth data.”

Humphrey goes on to describe the industry challenge that BMLL’s new offering is designed to address.

“There’s a bit of a problem in the Level 1 and Level 2 world,” he says. “All the suppliers of Level 1 and Level 2 data are real-time providers. They capture that real-time top of book data, store it, wrap it into a package, and sell it as historical data the next day. But when you’re gathering historical data from the real-time feed, it will have gaps, spikes, packet losses, all of those things. We’re addressing those challenges by taking a different approach, building our data from the bottom of the book, not from the top of the book. So we don’t have any missing gaps. And therefore, our Level 1 and Level 2 data, and associated analytics, are pristine.”

Humphrey believes that BMLL is the only provider on the market offering comprehensive Level 1 & 2 data and analytics built on Level 3 data in this way, and that there is a definite market demand for the level of quality this service offers. “With this ever-increasing army of quants across our industry, quality is going to be the differentiator,” he says. “In an increasingly quantitative algo-driven world, quality will be everything. And we’re the only company out there that can offer this level of quality, because we’ve been building this over the last ten years.”

BMLL is now looking to extend its strategic partnerships, following last September’s partnership announcements with Exegy and Snowflake, which also made a strategic investment in the company. Those announcements came after BMLL’s $26 million Series B investment from Nasdaq Ventures, FactSet, and IQ Capital in Q4 2022.

