Big Data – The Other Side of Low Latency

We write a lot here about the latency of moving data from point A to point B. But latency is also inherent in the processing of that data at points A and B. Most likely, the processing of data is a more complex undertaking than transporting it. And that’s where big data comes in.

For sure, big data (or Big Data, as other publications refer to it) is the tech buzzword – and investment focus – of the moment, much as low latency has been in the past. Also like low latency, the definition of big data is fluid and often bent to suit a particular vendor’s business. We’re no exception, so here’s the definition we use at BigDataForFinance.com:

“Datasets whose characteristics – size, data type and frequency – are beyond efficient processing, storage and extraction by traditional database management tools.”

Note that it’s not all about size, and that’s especially true for the financial markets, where even many years of time series tick data does not come close to the data volumes processed by the likes of Google and Facebook. But what the financial markets might lack in data size, they make up for in complexity and frequency – and in the need for accuracy and precision.

One of the more common big data applications relates to the storage of, and analytics on, tick-by-tick time series data. Here, the need is to capture data arriving at rates of up to several million updates per second for markets such as North American options. This generally requires specialised, in-memory approaches, since even massively parallel processing (MPP) databases – think EMC Greenplum, IBM Netezza and ParAccel – are not going to keep up. Vendors such as OneMarketData and Kx Systems are likely to be called upon for such storage, the latter perhaps paired with Kove’s RAM-based storage appliance.
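To make the in-memory idea concrete, here is a minimal, single-process sketch of an append-only tick buffer built on a pre-allocated NumPy column store. It is purely illustrative: it is not the OneMarketData or Kx Systems API, and the field layout and capacity are assumptions. It simply shows why columnar, memory-resident storage suits high-rate capture and time-window queries.

```python
# Minimal sketch of an in-memory tick buffer. Illustrative only: not the
# OneMarketData or Kx Systems API; field layout and capacity are assumed.
import numpy as np

TICK_DTYPE = np.dtype([
    ("ts_ns", "int64"),     # exchange timestamp, nanoseconds since epoch
    ("symbol", "S8"),       # fixed-width symbol code
    ("price", "float64"),
    ("size", "int32"),
])

class TickBuffer:
    """Append-only, pre-allocated column store for one trading session."""

    def __init__(self, capacity: int = 10_000_000):
        self._data = np.empty(capacity, dtype=TICK_DTYPE)
        self._count = 0

    def append(self, ts_ns: int, symbol: bytes, price: float, size: int) -> None:
        # No allocation on the hot path: writes go into pre-allocated memory.
        self._data[self._count] = (ts_ns, symbol, price, size)
        self._count += 1

    def window(self, start_ns: int, end_ns: int) -> np.ndarray:
        """Return all ticks whose timestamp falls in [start_ns, end_ns)."""
        ticks = self._data[:self._count]
        mask = (ticks["ts_ns"] >= start_ns) & (ticks["ts_ns"] < end_ns)
        return ticks[mask]

# Usage: capture a few ticks and query a time window.
buf = TickBuffer(capacity=1_000)
buf.append(1_000, b"AAPL", 150.25, 200)
buf.append(2_000, b"AAPL", 150.27, 100)
print(buf.window(0, 1_500))   # -> the first tick only
```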

Time series applications include creation and back-testing of quantitative trading models, some pre-trade risk checks, and transaction cost analysis (TCA). In the future, time series could support more complex execution algorithms too.
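As a simple illustration of the TCA use case, the sketch below computes an order’s slippage against its arrival price from captured ticks. The order, fill and tick structures are hypothetical, chosen only to show the shape of the calculation, not any vendor’s methodology.

```python
# Hypothetical TCA sketch: slippage of the volume-weighted fill price versus
# the arrival price, in basis points. Data structures are illustrative.
from dataclasses import dataclass
from typing import List, Tuple

@dataclass
class Fill:
    ts_ns: int
    price: float
    size: int

def arrival_price(ticks: List[Tuple[int, float]], order_ts_ns: int) -> float:
    """Last observed price at or before the order's arrival time (ticks sorted by time)."""
    last = None
    for ts, price in ticks:
        if ts > order_ts_ns:
            break
        last = price
    if last is None:
        raise ValueError("no tick observed before order arrival")
    return last

def slippage_bps(fills: List[Fill], arrival: float, side: str) -> float:
    """Signed slippage of the volume-weighted fill price, in basis points."""
    total_size = sum(f.size for f in fills)
    vwap = sum(f.price * f.size for f in fills) / total_size
    signed = (vwap - arrival) if side == "buy" else (arrival - vwap)
    return 1e4 * signed / arrival

# Usage: a buy order arriving at t=2,200ns, filled in two clips.
ticks = [(1_000, 100.00), (2_000, 100.02), (3_000, 100.05)]
fills = [Fill(2_500, 100.03, 300), Fill(3_500, 100.06, 200)]
print(round(slippage_bps(fills, arrival_price(ticks, 2_200), "buy"), 2))  # ~2.2 bps
```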

Another increasingly common big data application is the processing of natural language text, applying analytics to real-time and historical textual information in order to derive trading signals from news sources and social media. As an example, so-called Sentiment Analysis might process messages related to a company or market segment from Twitter, building a view on whether the subject of the tweets is being referred to in a positive or negative manner. This sentiment – combined with other inputs – can be used to make trading decisions ahead of systems that key off price changes in the market.
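The sketch below shows the idea at its most basic: a toy lexicon-based scorer that rates tweets mentioning a ticker and averages them into a rolling signal. The word lists and the sample tweets are invented for illustration; production sentiment engines use far more sophisticated natural language models and much larger inputs.

```python
# Toy lexicon-based sentiment sketch. The keyword lists and example tweets
# are illustrative assumptions, not a real sentiment model.
from typing import List

POSITIVE = {"beat", "upgrade", "strong", "growth", "bullish", "record"}
NEGATIVE = {"miss", "downgrade", "weak", "lawsuit", "bearish", "recall"}

def score_tweet(text: str) -> int:
    """+1 per positive keyword, -1 per negative keyword."""
    words = {w.strip(".,!?#$").lower() for w in text.split()}
    return sum(w in POSITIVE for w in words) - sum(w in NEGATIVE for w in words)

def rolling_sentiment(tweets: List[str]) -> float:
    """Average score across a window of tweets; >0 suggests positive tone."""
    if not tweets:
        return 0.0
    return sum(score_tweet(t) for t in tweets) / len(tweets)

# Usage: score a small window of tweets about a hypothetical ticker $XYZ.
window = [
    "$XYZ posts record quarter, analysts issue upgrade",
    "Supply issues look weak for $XYZ, possible recall ahead",
    "Strong guidance from $XYZ, bullish on the name",
]
print(rolling_sentiment(window))   # ~0.67 for this toy window
# On its own this signal is far too noisy; it would be combined with
# price, volume and other inputs before driving any trading decision.
```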

The processing of big data generally requires grid or cluster infrastructure, with networks connecting 10s, 100s or 1,000s of servers and processing nodes. No wonder, then, that many of the network and middleware vendors engaged in low-latency connectivity also have a ‘story’ for big data. Names such as Informatica, Tibco Software, Cisco Systems, Arista Networks, Solarflare Communications, Mellanox Technologies, Tervela and Solace Systems all spring to mind. A couple of those – Informatica and Tibco – also offer more traditional big data analysis applications.

One technology often mentioned in the context of big data is Hadoop, an open source implementation of the MapReduce framework for processing very large datasets, such as searching for data patterns. It too is a multi-server parallel approach, but one that is historically batch-oriented. One direction for it is to become more real-time, introducing in-memory and low-latency middleware to boost performance. A real-time Hadoop approach would lend itself to driving electronic trading and on-demand determination of risk across an enterprise.
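For readers unfamiliar with the programming model, the following single-process sketch walks through the map, shuffle and reduce phases that Hadoop distributes across a cluster, here counting tick messages per symbol. It is not Hadoop code itself, and the record format is an assumption; it only illustrates the pattern.

```python
# Single-process illustration of the MapReduce pattern that Hadoop implements.
# Hadoop runs these phases across many servers over distributed storage;
# this sketch only shows the programming model. Record format is assumed.
from collections import defaultdict
from typing import Dict, Iterable, List, Tuple

def map_phase(records: Iterable[str]) -> Iterable[Tuple[str, int]]:
    """Map: emit (symbol, 1) for each raw tick record 'symbol,price,size'."""
    for record in records:
        symbol = record.split(",")[0]
        yield symbol, 1

def shuffle(pairs: Iterable[Tuple[str, int]]) -> Dict[str, List[int]]:
    """Shuffle: group intermediate values by key."""
    grouped = defaultdict(list)
    for key, value in pairs:
        grouped[key].append(value)
    return grouped

def reduce_phase(grouped: Dict[str, List[int]]) -> Dict[str, int]:
    """Reduce: sum the counts for each symbol."""
    return {key: sum(values) for key, values in grouped.items()}

# Usage: count messages per symbol across a handful of raw tick records.
raw = ["AAPL,150.25,200", "MSFT,310.10,100", "AAPL,150.27,50"]
print(reduce_phase(shuffle(map_phase(raw))))   # {'AAPL': 2, 'MSFT': 1}
```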

As automated trading systems move to leverage cloud-based infrastructure, accessing cloud-based big data services will be a natural route to take, leading to faster implementation and less ongoing management.

As we have suggested before, for many trading firms, tapping into the convergence of low latency, cloud and big data technologies will be the way to go. You’ll be hearing more on this architectural approach – so stay tuned!
