Corvil Webinar Discusses the Problems of Packet Loss and Some Potential Solutions

By Zoe Schiff

Packet loss caused by network congestion or packet corruption can undermine trading strategies and incur unwanted costs, but these problems can be resolved using tools that provide visibility into packet loss and its effects.

A recent webinar presented by James Wiley, director of technical product marketing at network data analytics specialist Corvil, set out the problems of packet loss and its effects on user experience, applications and latency, as well as some solutions. Wiley noted that packet loss is particularly evident when applications run at sluggish rates, video streams buffer, websites take forever to load, timeouts and reconnections occur, and voice over IP handsets produce robotic-sounding audio.

Real-time, interactive media applications are the most susceptible to packet loss due to their reliance on the User Datagram Protocol (UDP), but packet loss can be equally disruptive in a real-time trading environment, upsetting trading strategies and costing money rather than earning it.
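To make the UDP point concrete, the following minimal sketch (in Python, with a placeholder loopback address and port) shows why loss goes unnoticed at the transport layer: a UDP sender receives no acknowledgement, so a dropped datagram simply vanishes unless the application tracks delivery itself.

    import socket

    # Minimal sketch: UDP is fire-and-forget. The sender receives no
    # acknowledgement, so a lost datagram simply disappears; detecting
    # the gap is left to the application (e.g. via sequence numbers).
    # The address and port below are illustrative placeholders.
    sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    for seq in range(100):
        payload = seq.to_bytes(4, "big")  # tag each probe with a sequence number
        sock.sendto(payload, ("127.0.0.1", 9999))  # returns immediately, delivered or not
    sock.close()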

Wiley noted some examples of packet delays, saying: “Google estimates that half a second of extra delay results in 20% fewer searches, while Yahoo estimates that 0.4 seconds of delay results in 5-9% less page traffic. Amazon statistics stand out, estimating that an increase in delay of one-tenth of a second can translate into a 1% drop in sales.”

There are two primary causes of packet loss: congestion and corruption. Congestion occurs when there is a bandwidth restriction, either physical or imposed. Physical bandwidth restrictions on a network link during times of peak activity cause bandwidth exhaustion, leading to buffer overflow and packet loss. Imposed restrictions, used to improve quality of service, throttle certain traffic flows so that higher-priority flows can access the network link. Corruption results from problems such as poor quality cabling and faulty hardware, which corrupt parts of a packet during transmission, with the result that the packet is discarded.
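The congestion mechanism can be illustrated with a toy queue simulation: when the arrival rate exceeds the link's service rate, a finite egress buffer fills and further packets are tail-dropped. The rates and buffer size below are illustrative assumptions, not figures from the webinar.

    import random
    from collections import deque

    BUFFER_SIZE = 10        # egress buffer capacity (packets)
    SERVICED_PER_TICK = 2   # link capacity per tick

    buffer, dropped, sent = deque(), 0, 0
    for tick in range(1000):
        # Offered load averages 3 packets/tick, exceeding capacity
        for _ in range(random.randint(0, 6)):
            if len(buffer) < BUFFER_SIZE:
                buffer.append(tick)
            else:
                dropped += 1          # buffer overflow -> packet loss
        for _ in range(min(SERVICED_PER_TICK, len(buffer))):
            buffer.popleft()
            sent += 1

    total = sent + dropped
    print(f"serviced={sent} dropped={dropped} loss={dropped / max(total, 1):.1%}")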

Monitoring and diagnostics tools can help to identify these types of problems, and packet loss can be minimised once its cause is known. Corruption is relatively straightforward to address, as failing hardware can be fixed or replaced. Congestion is more complex, but there are solutions. Increasing buffering is a low-cost approach, but it introduces more latency, making it sub-optimal for latency-sensitive applications. Another approach is to increase bandwidth, which is likely to be more costly but will decrease serialisation delay. The downside here is traffic elasticity: bandwidth-hungry consumers such as file transfers open large Transmission Control Protocol (TCP) windows, devouring the added bandwidth at the expense of other applications.
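Pairing the hypothetical sender sketched above with a sequence-aware receiver gives the simplest form of the visibility discussed here: counting gaps in the received sequence numbers reveals how many probes were lost in transit. This sketch ignores reordering and duplicate delivery, and the address, port and timeout are again placeholders.

    import socket

    # Minimal loss-visibility sketch: count gaps in received sequence
    # numbers. Assumes the sender sketch shown earlier; ignores
    # reordering and duplicates for simplicity.
    sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    sock.bind(("0.0.0.0", 9999))
    sock.settimeout(5.0)  # stop once probes dry up

    expected, received = 0, 0
    try:
        while True:
            data, _ = sock.recvfrom(64)
            seq = int.from_bytes(data[:4], "big")
            if seq >= expected:
                received += 1
                expected = seq + 1   # any skipped numbers were lost
    except socket.timeout:
        pass
    finally:
        sock.close()

    lost = expected - received
    if expected:
        print(f"received={received} lost={lost} loss={lost / expected:.1%}")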

While these solutions have pros and cons, the crucial component of any solution is summed up by Wiley, who concludes: “Visibility is the key to understanding both packet loss and its effects.”
