
A-Team Insight Blogs

Low Latency Summit: Low Latency Gives Way to Big Data and Analytics


The pressure to achieve ever lower latency in trading has not disappeared in the six or so years since low latency became an industry buzzword, but it is beginning to ease as the pure need for speed gives way to growing interest in smart analytics and big data.

Opening this week’s A-Team Group Low Latency Summit in London, Pete Harris, editor and publisher of LowLatency.com, set the scene for a day of panel discussions and workshops on the development of low latency and its shift into a wider trading environment including big data and analytics. Harris noted the use of low latency to move data quickly in support of trade executions, high frequency trading, arbitrage and market making, but also increasing interest in low latency data processing, focusing on application logic, in-memory storage or pre- and post-trade analytics, in support of intelligent trading.

He explained: “Speed is no longer a differentiator. It is important to trade fast, but it is increasingly important to make the right trades. The need is for data-driven analytics using big, fast data.”

Having set the scene, Harris kicked off the first panel discussion of the day, entitled ‘Low Latency: No Longer a Strategy’. He asked panel members which strategies are still considered to be very latency sensitive. Terry Keene, CEO and president of iSys Capital Technologies, said: “With latency down to hundreds of nanoseconds or single microseconds it’s a tough market, so how do firms differentiate? The need is to take other data and incorporate it into your own data in real time to deliver intelligent trading. The new world is not dependent only on low latency, but also on how well you can process data at speed.”

Nick Idelson, technical director at TraderServe, added: “For some time, speed has been achieved using hardware, but we are seeing a push into hybrid platforms that use software for calculations and field programmable gate arrays to trigger trades. Trades never go to the central processing unit and the outcome is low latency and smart trading.”

With a focus on analytics, James Davies, chief operating officer of Global Markets Exchange Group, commented that the quality of analytics in products is not as high as it could be. By way of example, he described swap execution facilities in the US that deal with interest rate swaps. If they rely on arbitrage, prices are unlikely to be correct, so they need pre- and post-trade risk checks. “The quickest pre-trade platform for risk is jointly owned and used by a group of general clearing members. The latency to check risk is six milliseconds. That is a lot of delay and leaves a lot of room for an arms race,” he said.
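Davies’ point becomes concrete if you consider what a pre-trade risk check actually does: a handful of limit tests that must complete before an order is released, where every microsecond of checking adds to the six milliseconds he describes. The sketch below is purely illustrative — the limits, field names and figures are hypothetical, not drawn from any platform mentioned here:

```python
import time

# Hypothetical position limits for a toy pre-trade risk check.
LIMITS = {"max_notional": 10_000_000, "max_order_qty": 5_000}

def pre_trade_check(order, current_notional):
    """Return True if the order passes two simple limit tests."""
    if order["qty"] > LIMITS["max_order_qty"]:
        return False
    if current_notional + order["qty"] * order["price"] > LIMITS["max_notional"]:
        return False
    return True

# Time a single check on a toy order.
order = {"qty": 1_000, "price": 101.5}
start = time.perf_counter()
ok = pre_trade_check(order, current_notional=2_000_000)
elapsed_us = (time.perf_counter() - start) * 1e6
print(ok, f"checked in {elapsed_us:.1f} microseconds")
```

Even this trivial in-process check takes on the order of microseconds; the six milliseconds Davies cites reflects the network hops and shared infrastructure a real clearing-member platform sits behind, which is where the room for an arms race lies.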

Considering analytics and big data, Harris questioned the importance and use of these elements for trading. Martin Ducháček, head of development at RSJ, answered: “Technology is at the edge of performance. Next year, we will see analytics taking a bigger role in trading and market making as data growth is enormous and makes processing more difficult. It will become a must to incorporate better smart filtering of data if better decisions are to be made. In a few years’ time, every trading strategy will include complex analytics. Just as low latency has been a must-have, smart analytics will be a must-have for the next two or three years.”

Keene concurred, adding: “Twenty years ago the cycle took 24 hours, now firms want to trade in real time, and that is a real challenge. Technology is moving in the right direction, but people running market trades are only just beginning to think about how analytics can be used with big data. The need is for technology and a change in mindset.”

On the issue of whether investment is shifting from low latency to big data and analytics, panel opinion suggested it is shifting, but not without close scrutiny of the data. Idelson explained: “First, it is important to make sure data is good and clean. With data coming from many sources, the question is whether it matches up and whether there is any jitter. The science comes first and then the data can be put into a solution.” Davies commented: “Decisions on the trading floor follow trends, so there will be spending on big data. Pre-trade risk will be a focus, too, as it is included in many forthcoming regulations.”
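Idelson’s point about clean data and jitter can be illustrated with a toy check. The sketch below — hypothetical feed timestamps and deliberately crude metrics, not any firm’s actual method — measures inter-arrival jitter within one feed and timestamp skew between two feeds carrying the same instrument:

```python
import statistics

# Hypothetical tick timestamps (seconds) from two feeds for the same
# instrument; real feeds would carry millions of ticks.
feed_a = [0.000, 0.010, 0.021, 0.030, 0.041]
feed_b = [0.001, 0.011, 0.020, 0.032, 0.040]

def inter_arrival_jitter(timestamps):
    """Standard deviation of the gaps between ticks — a crude jitter measure."""
    gaps = [b - a for a, b in zip(timestamps, timestamps[1:])]
    return statistics.stdev(gaps)

def max_feed_skew(a, b):
    """Largest absolute timestamp difference between matched ticks."""
    return max(abs(x - y) for x, y in zip(a, b))

print(f"feed A jitter: {inter_arrival_jitter(feed_a) * 1e3:.3f} ms")
print(f"max skew between feeds: {max_feed_skew(feed_a, feed_b) * 1e3:.1f} ms")
```

Checks of this kind — does the data from different sources match up, and how much does its timing wobble — are the “science comes first” step Idelson describes, before the data is trusted by any analytics.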

In terms of return on investment, Ducháček suggested: “Big data is coming and if you take advantage of big data and are smart you can make money. But what does smart mean? Until now, smart has been all about low latency, but it will be all about analytics, analytics on big data and measures of speed of compute on bigger and bigger data. This is similar to low latency, but more complex as it must be smart. This will be a great space for smart people and smart analytics.”

Returning to Davies’ mention of regulation, Harris questioned the influence of regulation on trading. Davies responded: “Regulation is dominant. Firms have to partake of swap execution facilities and, beyond the most mature listed stocks and derivatives, regulation is shaping the industry.” While this is true, Davies pointed out a widely held industry view that regulators are implementing European Market Infrastructure Regulation (EMIR) to its utmost degree, beyond its initial intention and in a way that restricts business.

Commenting on this and the potential inclusion of a 500 millisecond resting time in EMIR, Idelson said: “Technology will be needed to trade very quickly at the end of the resting time. Without it, you will be toast or won’t trade in Europe any more.” Ducháček agreed, concluding: “This solves nothing. If it is implemented, market makers and active traders will migrate to non-regulated markets.”
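To see what trading “very quickly at the end of the resting time” would demand, consider the toy sketch below. It waits out a hypothetical 500 millisecond resting period and reports how far past the deadline it actually fires; on a general-purpose machine the overshoot from the operating system scheduler alone illustrates why Idelson argues dedicated technology would be needed:

```python
import time

RESTING_TIME_S = 0.5  # hypothetical 500 ms resting period

def fire_at_rest_end(quote_posted_at):
    """Block until the resting period expires, then return the overshoot
    in seconds — how far past the deadline we actually 'fired'."""
    deadline = quote_posted_at + RESTING_TIME_S
    remaining = deadline - time.monotonic()
    if remaining > 0:
        time.sleep(remaining)
    return time.monotonic() - deadline

posted = time.monotonic()
overshoot = fire_at_rest_end(posted)
print(f"fired {overshoot * 1e6:.0f} microseconds after the deadline")
```

A sleep-based timer typically overshoots by hundreds of microseconds to milliseconds — an eternity at the latencies discussed earlier in the day, which is exactly the gap specialised hardware would be bought to close.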

