About a-team Marketing Services
The knowledge platform for the financial technology industry

A-Team Insight Blogs

DASH Adds Dark Liquidity Aggregation Algo

DASH Financial Technologies continues to innovate with SENSOR Dark, a next-generation dark liquidity aggregation algorithm designed to provide traders with optimal levels of transparency, performance and control as they attempt to minimise their footprint when seeking dark liquidity in a fragmented market. The algo sources dark liquidity from significant venues across the US equity market, including independent venues, broker-operated pools, exchange hidden liquidity and conditional pools.

In line with all DASH execution solutions, users can customise their own SENSOR Dark execution strategies to meet specific performance and workflow goals. They can also view, measure, refine and visualise their activity using DASH’s web-delivered transparency solution, DASH360, which provides real-time analytics and visualisation to bring an order to life.

Stino Milito, head of electronic trading sales and co-chief operating officer at DASH, says: “While simple dark aggregation tools have been available in the market for some time, most were developed at a time when the liquidity landscape looked much different than it does today. SENSOR Dark has been designed with the functionality and real-time analytics necessary to effectively source dark liquidity today.”

With SENSOR Dark, traders can:

- Use a data-driven approach to customise routing selection by liquidity, block execution, price reversion/stability or any custom measurement
- Benefit from ‘block react’, a workflow solution that reacts to block executions with immediate and dynamic reallocation
- Add ‘alpha seek’, a performance-enhancing workflow that dynamically changes the available venue selections to take advantage when a symbol is outperforming versus the arrival price
- Define minimum fill size on a per-order and venue-by-venue basis, and set a minimum first fill to ensure a minimum-size dark print at the outset of the order
- Gain price flexibility by sourcing liquidity anywhere within the National Best Bid and Offer, including the midpoint or far touch, i.e. the offer when buying or the bid when selling.
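The NBBO pricing flexibility described above can be illustrated with a minimal sketch. This is not DASH's implementation (which is proprietary); the function name and interface here are hypothetical, and it simply shows how a limit price at the midpoint, near touch or far touch is derived from the national best bid and offer:

```python
# Illustrative sketch only, not DASH's actual logic: deriving a limit price
# within the National Best Bid and Offer (NBBO) for a dark order.
# "Far touch" is the side you cross to: the offer when buying, the bid
# when selling; "near touch" is your own side of the book.

def price_within_nbbo(best_bid: float, best_offer: float,
                      side: str, mode: str) -> float:
    """Return a limit price inside the NBBO.

    side: 'buy' or 'sell'
    mode: 'midpoint', 'far_touch', or 'near_touch'
    """
    if best_bid > best_offer:
        raise ValueError("crossed market: bid above offer")
    if mode == "midpoint":
        return round((best_bid + best_offer) / 2, 4)
    if mode == "far_touch":
        # Cross the spread: pay the offer to buy, hit the bid to sell.
        return best_offer if side == "buy" else best_bid
    if mode == "near_touch":
        # Rest passively on your own side of the book.
        return best_bid if side == "buy" else best_offer
    raise ValueError(f"unknown mode: {mode}")

# With the NBBO at 10.00 x 10.02, a buyer's midpoint price is 10.01
# and the far touch is the 10.02 offer.
print(price_within_nbbo(10.00, 10.02, "buy", "midpoint"))
print(price_within_nbbo(10.00, 10.02, "buy", "far_touch"))
```

In practice a dark aggregator would apply such a price bound per venue alongside the minimum-fill constraints described above, but those details are venue-specific and beyond this sketch.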
