A-Team Insight Blogs

Big Data – The Other Side of Low Latency

We write a lot here about the latency of moving data from point A to point B. But latency is also inherent in the processing of that data at points A and B. Indeed, processing data is often a more complex undertaking than transporting it. And that’s where big data comes in.

For sure, big data (or Big Data, as other publications refer to it) is the tech buzzword – and investment focus – of the moment, much as low latency has been in the past. Also like low latency, the definition of big data is fluid, and is often bent to suit a particular vendor’s business. We’re no exception, so here’s the definition we use at BigDataForFinance.com:

“Datasets whose characteristics – size, data type and frequency – are beyond efficient processing, storage and extraction by traditional database management tools.”

Note that it’s not all about size, and that’s especially true in the financial markets, where even many years of time series tick data does not come close to the data volumes processed by the likes of Google and Facebook. But what the financial markets might lack in data size, they make up for in complexity and frequency – and in the need for accuracy and precision.

One of the more common big data applications relates to the storage of, and analytics on, tick-by-tick time series data. Here, the need is to capture data arriving at rates of up to several million updates per second for markets such as North American options. This generally requires pretty specialised, in-memory approaches, since even massively parallel processing (MPP) databases – think EMC Greenplum, IBM Netezza and ParAccel – are not going to keep up. Vendors such as OneMarketData and Kx Systems are likely to be called upon for such storage, the latter perhaps paired with Kove’s RAM-based storage appliance.
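
To make that concrete, here’s a toy sketch in Python of the columnar, in-memory layout such tick databases are built around – illustrative only, and not the actual API of OneMarketData, Kx or any other product:

```python
import numpy as np
from collections import defaultdict

# Illustrative in-memory, columnar tick store -- not any vendor's real API.
class TickStore:
    def __init__(self):
        # One list per field, per symbol; a columnar layout keeps scans cache-friendly.
        self._ticks = defaultdict(lambda: {"ts": [], "price": [], "size": []})

    def append(self, symbol, ts_ns, price, size):
        cols = self._ticks[symbol]
        cols["ts"].append(ts_ns)
        cols["price"].append(price)
        cols["size"].append(size)

    def series(self, symbol):
        # Materialise as numpy arrays for vectorised analytics.
        cols = self._ticks[symbol]
        return (np.array(cols["ts"], dtype="int64"),
                np.array(cols["price"], dtype="float64"),
                np.array(cols["size"], dtype="int64"))

store = TickStore()
store.append("AAPL", 1_000_000_000, 150.25, 100)  # hypothetical ticks
store.append("AAPL", 1_000_000_500, 150.26, 200)
ts, price, size = store.series("AAPL")
print(price.mean())  # vectorised stats over the captured ticks
```

Production systems layer compression, persistence tiers and a query language on top, but the column-per-field layout is what makes scans over billions of ticks tractable.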

Time series applications include creation and back-testing of quantitative trading models, some pre-trade risk checks, and transaction cost analysis (TCA). In the future, time series could support more complex execution algorithms too.
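
As a flavour of the TCA case, a minimal slippage calculation against an order’s arrival price might look like the following sketch – the function and all numbers are hypothetical, and real TCA adds benchmarks, venue breakdowns and market impact models:

```python
import numpy as np

# Simplified TCA: slippage of an order's fills against its arrival price,
# expressed in basis points. All inputs below are hypothetical.
def slippage_bps(arrival_price, fill_prices, fill_sizes, side="buy"):
    fill_prices = np.asarray(fill_prices, dtype=float)
    fill_sizes = np.asarray(fill_sizes, dtype=float)
    vwap = (fill_prices * fill_sizes).sum() / fill_sizes.sum()
    sign = 1.0 if side == "buy" else -1.0  # buys lose when fills print above arrival
    return sign * (vwap - arrival_price) / arrival_price * 10_000

# A buy order arriving at 100.00, filled in three slices as the price drifts up:
print(slippage_bps(100.00, [100.01, 100.03, 100.05], [500, 300, 200]))  # ~2.4 bps
```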

Another growing big data application is the processing of natural language text – analytics on real-time and historical textual information in order to derive trading signals from news sources and social media. As an example, so-called sentiment analysis might process messages related to a company or market segment from Twitter, building a view on whether the subject of the tweets is being referred to in a positive or negative manner. This sentiment – combined with other inputs – can be used to make trading decisions ahead of systems that key off price changes in the market.
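
A deliberately naive word-list scorer shows the basic shape of the idea – text in, signed score out – though commercial sentiment engines run far richer NLP than this sketch, and the word lists and messages here are invented:

```python
import re

# Toy word-list sentiment scorer -- far cruder than commercial NLP engines,
# but it shows the shape: text in, signed score out, aggregated per subject.
POSITIVE = {"beat", "beats", "upgrade", "strong", "growth", "record"}
NEGATIVE = {"miss", "downgrade", "weak", "lawsuit", "recall"}

def tweet_score(text):
    words = set(re.findall(r"[a-z]+", text.lower()))
    return len(words & POSITIVE) - len(words & NEGATIVE)

tweets = [  # hypothetical messages mentioning one company
    "Record quarter and strong growth for $XYZ",
    "$XYZ hit with lawsuit, analysts downgrade",
]
sentiment = sum(tweet_score(t) for t in tweets) / len(tweets)
print(sentiment)  # > 0 reads bullish, < 0 bearish; one input among many
```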

The processing of big data generally requires grid or cluster infrastructure, with networks connecting 10s, 100s or 1000s of servers and processing nodes. No wonder, then, that many of the network and middleware vendors engaged in low-latency connectivity also have a ‘story’ for big data. Names such as Informatica, Tibco Software, Cisco Systems, Arista Networks, Solarflare Communications, Mellanox Technologies, Tervela and Solace Systems all spring to mind. A couple of those – Informatica and Tibco – offer more traditional big data analysis applications too.

One technology often mentioned in the context of big data is Hadoop, an open source implementation of the MapReduce framework for processing very large datasets – searching for patterns in data, for example. It too is a multi-server parallel approach, but one that has historically been batch oriented. One direction for it is to become more real-time, introducing in-memory processing and low-latency middleware to boost performance. A real-time Hadoop approach would lend itself to driving electronic trading and to on-demand determination of risk across an enterprise.
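
For readers who haven’t met MapReduce, here’s the pattern in miniature – a single-process Python sketch of the map, shuffle and reduce phases that Hadoop distributes across a cluster, applied to a made-up job counting ticker mentions in headlines:

```python
from collections import defaultdict
from itertools import chain

# Single-process sketch of the MapReduce pattern that Hadoop distributes
# across a cluster: map records to (key, value) pairs, shuffle by key, reduce.
def map_phase(record):
    for word in record.split():
        if word.startswith("$"):           # emit one pair per symbol mention
            yield word, 1

def shuffle(pairs):
    groups = defaultdict(list)
    for key, value in pairs:               # in Hadoop, this step crosses the network
        groups[key].append(value)
    return groups

def reduce_phase(key, values):
    return key, sum(values)

records = ["$AAPL beats estimates", "$MSFT and $AAPL rally", "$MSFT flat"]
pairs = chain.from_iterable(map_phase(r) for r in records)
print(dict(reduce_phase(k, v) for k, v in shuffle(pairs).items()))
# {'$AAPL': 2, '$MSFT': 2}
```

Hadoop’s value is that the same three-phase job runs unchanged whether the records number in the thousands or the billions; the batch orientation comes from writing intermediate results to disk between phases, which is exactly what the in-memory variants aim to remove.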

As automated trading systems move to leverage cloud-based infrastructure, accessing cloud-based big data services will be a natural route to take, offering faster implementation and less ongoing management.

As we have suggested before, for many trading firms, tapping into the convergence of low latency, cloud and big data technologies will be the way to go. You’ll be hearing more on this architectural approach – so stay tuned!
