
Market Data Distribution Parity: Redefining Fairness

By Scott Schweitzer, Independent Consultant, LDA Technologies.

Electronic exchanges play a vital role in the financial industry, providing a robust and trusted forum for trading and execution. Even so, the technology available to exchanges has traditionally led to discrepancies in data distribution, ranging from nanoseconds to microseconds, which can be critical for latency-sensitive strategies. This is especially true for High-Frequency Trading (HFT), where the focus has already shifted from nanoseconds to picoseconds and every clock cycle and inch of wire can be the difference between making a profit and missing an opportunity.

For this reason, market participants and regulators have been pushing exchanges to address data distribution fairness directly. Examples include NYSE cable equalization and Regulation NMS, which aim to support competition and efficiency in equities markets, and the Eurex exchange, which continuously works to ensure equal cabling for every market participant. Trading firms naturally seek every advantage, and exchanges do everything they can to ensure clients receive data simultaneously. However, most technologies used in market data distribution introduce some level of jitter.

Historical Approaches to Data Fairness

In the early 2000s, as exchanges transitioned from human-centric open-outcry trading to electronic trading, they began using User Datagram Protocol (UDP) multicast to manage subscription-based data feeds to clients. Clients subscribed only to the multicast group addresses of interest to them, reducing the volume of trading data they needed to process within their infrastructure. While this makes it easier for exchanges to manage data distribution to clients based on these subscriptions, it also leaves the door open to increased jitter, depending on the implementation method.
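To make the subscription mechanism concrete, here is a minimal sketch of a client joining one multicast feed. The group address and port are hypothetical; real exchanges publish their actual group and port assignments in connectivity specifications.

```python
import socket
import struct

# Hypothetical multicast group and port for one instrument's feed.
FEED_GROUP = "239.1.1.10"
FEED_PORT = 30001

# Create a UDP socket and bind to the feed port.
sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM, socket.IPPROTO_UDP)
sock.setsockopt(socket.SOL_SOCKET, socket.SO_REUSEADDR, 1)
sock.bind(("", FEED_PORT))

# Join the multicast group: the host signals its membership via IGMP,
# and the network forwards only the groups this host has joined.
mreq = struct.pack("4s4s", socket.inet_aton(FEED_GROUP), socket.inet_aton("0.0.0.0"))
sock.setsockopt(socket.IPPROTO_IP, socket.IP_ADD_MEMBERSHIP, mreq)

while True:
    packet, source = sock.recvfrom(2048)
    # Only packets addressed to FEED_GROUP:FEED_PORT arrive here;
    # feeds that were not subscribed to never reach this client.
    print(f"{len(packet)}-byte update from {source}")
```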

Around the same time, exchanges also began offering a new service: co-location. This allowed firms to minimize latency by placing their servers physically close to the exchange's matching engine. However, as these data centers expanded, firms found themselves farther and farther from the matching engine, leading to increased latency (roughly 5 nanoseconds per meter of cable). Because latency is a function of cable length, fairness regulations pushed exchanges to resolve the issue by equalizing cable lengths, so that the nearest and farthest racks saw the same latency. Of course, trading firms quickly realized that the straighter the cable, the quicker the data flowed through it, and began requesting the racks farthest from the matching engine so that their cables could run fully straight. This followed concerns that light bouncing around inside long, coiled cable runs added latency compared to straight wires.
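As a rough illustration of why equalization matters, the short calculation below applies the article's figure of roughly 5 ns per meter of cable to two hypothetical rack positions; the distances are invented purely for illustration.

```python
# Back-of-the-envelope propagation delay for co-location cabling.
NS_PER_METER = 5.0  # signal propagation in fiber, roughly 2/3 the speed of light

# Hypothetical rack positions inside a co-location hall.
near_rack_m = 30    # cable run from the nearest rack to the matching engine
far_rack_m = 150    # cable run from the farthest rack

near_delay_ns = near_rack_m * NS_PER_METER   # 150 ns
far_delay_ns = far_rack_m * NS_PER_METER     # 750 ns
print(f"Unequalized gap: {far_delay_ns - near_delay_ns:.0f} ns")  # 600 ns

# Cable equalization: every client gets the length of the longest run,
# with the excess coiled, so all racks see the same propagation delay.
equalized_delay_ns = far_rack_m * NS_PER_METER
print(f"Equalized delay for every rack: {equalized_delay_ns:.0f} ns")
```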

Last spring, MOSAIC, a French firm, published a white paper on Corrupted Speculative Triggering and then reached out to the EU Commission regarding unfair behavior at Eurex. According to the complaint and the white paper, some firms were sending corrupted PCS (Physical Coding Sublayer) codes in network packets to gain an unfair advantage of at least 3.2 ns, and in some instances even more. Techniques like these need to be examined as they are reported, and the Commission's and exchanges' rules need to be reviewed to determine whether any changes to policies, processes, and architectures are required to ensure fairness.

Both exchanges and regulators regularly work to eliminate unfair advantages between market participants at their connection to the exchange, thereby sustaining a competitive market. This means that innovation often happens on the customer's side of the transceiver connected to the exchange. In an industry focused on shaving every last nanosecond of latency, firms pursuing an ultra-low latency edge concentrate their hardware capital investments on advantages such as optimized cabling, speed-graded FPGAs, custom trading chips (ASICs), and intelligent traffic routing within their own infrastructure. Regulators therefore see exchanges as the primary gatekeepers of market fairness.

The New Focus on Data Fairness

From a market perspective, fairness and transparency are essential to maximize market confidence and attract new players. There is a market-wide effort, spanning regulators, exchanges, and trading firms, to minimize unintended advantages across market participants. Market participants have previously expressed concern that exchanges should further reduce latency discrepancies across their connectivity to ensure fairness; one such example is found in IOSCO's Market Data in the Secondary Equity Market consultation paper from 2022.

Why Now?

Global regulators have long monitored exchanges' ability to provide fair access; relevant regulations include the Securities Exchange Act of 1934 in the US and MiFID II in Europe. These rules exist to ensure a level playing field and create a more robust economy. Exchanges that ensure fair data distribution will also gain a competitive advantage, signaling innovative thinking and market integrity.

However, technology needs to catch up for markets to be fair for all. Traditional Layer 1 and Layer 3 devices introduce jitter, that is, variation in the delay seen across client ports. Layer 3 switches, traditionally used by exchanges, employ more logic, are nondeterministic, and exhibit significantly higher latencies than Layer 1 devices, giving some firms an unintended advantage. For the faster Layer 1 switches, the port-to-port latency discrepancy can be as large as three to five nanoseconds, but it is deterministic, meaning the delay between two ports is fixed and never changes. Layer 1 devices face another problem: because they replicate data at the physical layer, they disseminate everything to all clients regardless of subscription, further loading clients' tech stacks and slowing them down.
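One way to picture the deterministic-versus-nondeterministic distinction is the toy simulation below: a Layer 1 style fan-out modeled as a fixed per-port skew, versus a Layer 3 style path modeled as a higher base latency plus random queueing jitter. The latency figures are illustrative assumptions, not measurements of any particular device.

```python
import random

def layer1_delivery(t_publish_ns: float, port_skew_ns: float) -> float:
    """Layer 1 replication: a fixed, deterministic per-port skew."""
    return t_publish_ns + port_skew_ns

def layer3_delivery(t_publish_ns: float, base_latency_ns: float) -> float:
    """Layer 3 switching: higher base latency plus nondeterministic queueing jitter."""
    queueing_jitter_ns = random.uniform(0.0, 50.0)  # varies packet to packet
    return t_publish_ns + base_latency_ns + queueing_jitter_ns

publish = 0.0

# Layer 1: clients A and B are always exactly 3 ns apart, on every packet.
a1 = layer1_delivery(publish, port_skew_ns=0.0)
b1 = layer1_delivery(publish, port_skew_ns=3.0)

# Layer 3: the gap between the two clients changes from packet to packet.
a3 = layer3_delivery(publish, base_latency_ns=300.0)
b3 = layer3_delivery(publish, base_latency_ns=300.0)

print(f"Layer 1 gap (fixed):       {abs(a1 - b1):.1f} ns")
print(f"Layer 3 gap (this packet): {abs(a3 - b3):.1f} ns")
```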

The New Standard in Data Distribution

As exchanges move to support 25 GbE, new challenges arise, such as eliminating jitter at even higher transmission speeds. Addressing these issues requires technological innovation on problems previously deemed unsolvable. Although complete fairness may seem out of reach, such innovation is bringing the industry closer to it. Today, technology exists that can synchronize data distribution, eliminating timing discrepancies and ensuring that all market participants receive data simultaneously, so adjusting cable lengths will no longer be necessary. The same technology also supports 25 GbE, which is increasingly accepted as the coming standard for exchange connectivity. Furthermore, the non-deterministic lag, or jitter, between market participants can be reduced from multiple nanoseconds to 50 picoseconds.
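To put those figures in perspective, the short calculation below works out the on-the-wire time of a single bit and of a minimum-size Ethernet frame at 10 GbE and 25 GbE, using nominal data rates and ignoring line-encoding overhead.

```python
# One bit occupies 1 / line_rate seconds on the wire.
for label, gbps in [("10 GbE", 10e9), ("25 GbE", 25e9)]:
    bit_time_ps = 1e12 / gbps                   # picoseconds per bit
    frame_time_ns = 64 * 8 * bit_time_ps / 1e3  # smallest Ethernet frame, 64 bytes
    print(f"{label}: one bit = {bit_time_ps:.0f} ps, "
          f"a 64-byte frame = {frame_time_ns:.1f} ns")

# 10 GbE: one bit = 100 ps. 25 GbE: one bit = 40 ps, the same order of
# magnitude as the ~50 ps residual jitter quoted above.
```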

For exchanges, investing in fairness isn't just about compliance; it is also a competitive advantage, positioning them as innovators and trusted leaders. Exchanges that consistently invest in solutions that minimize unintended advantages will naturally enhance their reputation, attract more market players, appease regulators, and be prepared for the future regulatory landscape. Even gains measured in picoseconds can directly influence trading outcomes. As a result, fair data distribution with picosecond precision is emerging as the natural next step in the evolution of markets, and it is becoming a broader topic of interest within the industry. It translates into increased trust between exchanges and their clients, resulting in a fairer market. When jitter is reduced to a single-bit delay between all market participants, it becomes hard to argue for even greater fairness, as no meaningful advantage can be gained from a single bit.
