
Big Data – The Other Side of Low Latency

We write a lot here about the latency of moving data from point A to point B. But latency is also inherent in the processing of that data at points A and B. More often than not, processing the data is a more complex undertaking than transporting it. And that’s where big data comes in.

For sure, big data (or Big Data, as other publications refer to it) is the tech buzzword – and investment focus – of the moment, much as low latency has been in the past. Also like low latency, the definition of big data is fluid, and is often bent to suit a particular vendor’s business. We’re no exception, so here’s the definition we use at BigDataForFinance.com:

“Datasets whose characteristics – size, data type and frequency – are beyond efficient processing, storage and extraction by traditional database management tools.”

Note that it’s not all about size, and that’s especially true for the financial markets, where even many years of time series tick data does not come close to the data volumes processed by the likes of Google and Facebook. But what the financial markets might lack in data size, they make up for in complexity and frequency – and in the need for accuracy and precision.

One of the more common big data applications relates to the storage of, and analytics on, tick-by-tick time series data. Here, the need is to capture data arriving at rates of up to several million updates per second for markets such as North American options. This generally requires specialised, in-memory approaches, since even massively parallel processing (MPP) databases – think EMC Greenplum, IBM Netezza and ParAccel – are not going to keep up. Vendors such as OneMarketData and Kx Systems are likely to be called upon for such storage, the latter perhaps paired with Kove’s RAM-based storage appliance.
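
To make the in-memory approach concrete, here is a minimal Python sketch – a toy, not kdb+, OneTick or any other vendor product – of a columnar, append-only tick store that answers a time-range query with a binary search over the timestamp column, all without touching disk. The fields and update rates are illustrative.

from bisect import bisect_left, bisect_right

class TickStore:
    """Toy in-memory, columnar tick store: one append-only list per field."""
    def __init__(self):
        self.ts, self.price, self.size = [], [], []

    def append(self, ts, price, size):
        # Ticks are assumed to arrive in timestamp order.
        self.ts.append(ts)
        self.price.append(price)
        self.size.append(size)

    def window(self, start, end):
        # Binary search the sorted timestamp column for a time-range slice.
        lo, hi = bisect_left(self.ts, start), bisect_right(self.ts, end)
        return list(zip(self.ts[lo:hi], self.price[lo:hi], self.size[lo:hi]))

store = TickStore()
for i in range(1_000_000):                       # stand-in for a high-rate feed
    store.append(i, 100.0 + (i % 50) * 0.01, 100)
print(len(store.window(250_000, 250_010)))       # 11 ticks in that window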

Time series applications include creation and back-testing of quantitative trading models, some pre-trade risk checks, and transaction cost analysis (TCA). In the future, time series could support more complex execution algorithms too.
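
Transaction cost analysis is perhaps the simplest of those to illustrate. The short Python sketch below compares the average fill price of an order against a VWAP benchmark built from market trades over the same interval; all prices and sizes are invented for the example.

ticks = [(100.00, 500), (100.02, 300), (100.01, 200)]    # market trades: (price, size)
fills = [(100.03, 400), (100.04, 100)]                    # our execution: (price, size)

vwap = sum(p * s for p, s in ticks) / sum(s for _, s in ticks)
avg_fill = sum(p * s for p, s in fills) / sum(s for _, s in fills)
slippage_bps = (avg_fill - vwap) / vwap * 10_000          # for a buy order, positive = cost
print(f"VWAP {vwap:.4f}, avg fill {avg_fill:.4f}, slippage {slippage_bps:.1f} bps")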

Another increasingly common big data application is the processing of natural language text – and analytics on real-time and historical textual information – in order to derive trading signals from news sources and social media. As an example, so-called Sentiment Analysis might process text messages related to a company or market segment from Twitter, building a view on whether the subject of the tweets is being referred to in a positive or negative manner. This sentiment – combined with other inputs – can be used to make trading decisions ahead of systems that key off price changes in the market.
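
To give a flavour of the idea – and only a flavour – here is a deliberately naive Python sketch in which a hand-rolled word list scores a couple of invented tweets and emits a crude long/short signal. Real sentiment engines use far richer language models, entity resolution and weighting; nothing below reflects any particular vendor’s approach.

POSITIVE = {"beat", "upgrade", "strong", "record", "bullish"}   # illustrative word lists
NEGATIVE = {"miss", "downgrade", "weak", "lawsuit", "bearish"}

def score(text):
    # Count positive words minus negative words in a message.
    words = text.lower().split()
    return sum(w in POSITIVE for w in words) - sum(w in NEGATIVE for w in words)

tweets = [
    "ACME posts record quarter, analysts upgrade to strong buy",
    "ACME facing lawsuit over accounting, outlook weak",
]
net = sum(score(t) for t in tweets)
signal = "long" if net > 0 else "short" if net < 0 else "flat"
print(net, signal)    # net sentiment of +1 -> a (very crude) long signal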

The processing of big data generally requires grid or cluster infrastructure, with networks connecting tens, hundreds or thousands of servers and processing nodes. No wonder, then, that many of the network and middleware vendors engaged in low-latency connectivity also have a ‘story’ for big data. Names such as Informatica, Tibco Software, Cisco Systems, Arista Networks, Solarflare Communications, Mellanox Technologies, Tervela and Solace Systems all spring to mind. A couple of those – Informatica and Tibco – also offer more traditional big data analysis applications.

One technology often mentioned in the context of big data is Hadoop, an open source implementation of the MapReduce framework for processing very large datasets – searching for data patterns, for example. It too is a multi-server parallel approach, but one that is historically batch-oriented. One direction for it is to make it more real-time, introducing in-memory and low-latency middleware to boost performance. A real-time Hadoop approach would lend itself to driving electronic trading and on-demand determination of risk across an enterprise.
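
For readers unfamiliar with MapReduce, the toy Python sketch below shows the shape of the model – map each record to key/value pairs, shuffle by key, then reduce each group – using per-symbol update counts over a few invented tick records. A real Hadoop job would express the same two functions against the Hadoop APIs (typically in Java) and run them across many nodes.

from collections import defaultdict

records = [("ACME", 100.01), ("FOO", 20.5), ("ACME", 100.02), ("FOO", 20.6), ("BAR", 5.0)]

def map_phase(record):
    symbol, _price = record
    yield symbol, 1                        # emit one (key, value) pair per record

def reduce_phase(key, values):
    return key, sum(values)                # combine every value seen for a key

# "Shuffle": group mapped pairs by key, as the framework would do across nodes.
grouped = defaultdict(list)
for rec in records:
    for k, v in map_phase(rec):
        grouped[k].append(v)

print(dict(reduce_phase(k, vs) for k, vs in grouped.items()))   # {'ACME': 2, 'FOO': 2, 'BAR': 1}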

As automated trading systems move to leverage cloud-based infrastructure, accessing cloud-based big data services will be a natural route to take, offering faster implementation and less ongoing management.

As we have suggested before, for many trading firms, tapping into the convergence of low latency, cloud and big data technologies will be the way to go. You’ll be hearing more on this architectural approach – so stay tuned!
