About a-team Marketing Services
The knowledge platform for the financial technology industry

A-Team Insight Blogs

Researchers Harness Supercomputers for Game Changing Financial Markets Analysis

Researchers from a number of U.S. universities, tapping into supercomputing power and optimising code, believe they have made a “game changing” step in the speedy analysis of financial markets, reducing operations that previously took hours or even days to just minutes.

The researchers – from the University of Illinois, the University of Pittsburgh and the University of California, San Diego – are making use of parallel processing supercomputer capacity provided via XSEDE – the Extreme Science and Engineering Discovery Environment – to run their analytics, and have optimised their code to achieve as much as a 126x speedup over their previous analysis, undertaken in 2010.

That performance boost, which the researchers reported at last month’s XSEDE conference in San Diego, will allow them to analyse market phenomena with nanosecond time granularity across the entire Nasdaq equities market in just a couple of hours. This includes the impact of high frequency and other low-latency trading strategies, and could be used to detect whether frowned-upon strategies, such as quote stuffing, have been deployed.
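Detecting something like quote stuffing at nanosecond granularity essentially reduces to counting messages in sliding time windows and flagging abnormal bursts. The minimal Python sketch below illustrates that idea only – the function name, window length and threshold are illustrative assumptions, not the researchers’ actual method:

```python
from collections import deque

def flag_quote_stuffing(timestamps_ns, window_ns=1_000_000_000, threshold=5_000):
    """Flag bursts of quote activity for one security.

    timestamps_ns: sorted nanosecond timestamps of quote messages.
    Returns the start timestamp of each sliding window (window_ns wide)
    whose message count exceeds the threshold.
    """
    window = deque()
    flagged = []
    for t in timestamps_ns:
        window.append(t)
        # Evict messages that have fallen out of the trailing window.
        while window[0] <= t - window_ns:
            window.popleft()
        if len(window) > threshold:
            flagged.append(window[0])
    return flagged
```

A sparse stream of a few messages per second would return no flags, while thousands of quote updates packed into a single second would trip the threshold.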

XSEDE provides the research community with access to a variety of IT resources, including 16 supercomputers as well as visualisation and data analysis systems. The supercomputers used by this particular research comprise:

* Blacklight at the Pittsburgh Supercomputing Center. Blacklight is an Intel-based SGI shared memory system intended for applications that require a large memory space for computational tasks.
* Gordon at the San Diego Supercomputer Center. Gordon is a Flash-based supercomputer, also incorporating Intel chips, designed in partnership with Appro (now Cray) for data intensive workloads.
* Stampede at the Texas Advanced Computing Center, University of Texas at Austin. Stampede – currently cited as the sixth most powerful supercomputer in the world – was designed in collaboration with Dell, and includes Intel multi-core and Xeon Phi many-core processors for highly parallel computational processing.

Code optimisation has thus far focused on several areas required to build a limit order book for each security, including the input and output of data and its pre-processing to convert it into suitable formats for computation.
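As a rough illustration of the book-building step, the sketch below replays already pre-processed messages into per-side price levels. The tuple format and field names are assumptions made for illustration; real feeds such as Nasdaq TotalView-ITCH are binary and considerably richer, which is why the pre-processing stage matters:

```python
from collections import defaultdict

def build_order_book(messages):
    """Replay pre-processed order messages into a price-level book.

    messages: iterable of (action, order_id, side, price, size) tuples,
    already converted from the raw feed format. side is "B" (bid) or
    "S" (ask). Returns {"B": {price: total_size}, "S": {...}}.
    """
    orders = {}  # order_id -> (side, price, size), for cancels/executions
    book = {"B": defaultdict(int), "S": defaultdict(int)}
    for action, oid, side, price, size in messages:
        if action == "add":
            orders[oid] = (side, price, size)
            book[side][price] += size
        elif action in ("cancel", "execute"):
            s, p, sz = orders.pop(oid)
            book[s][p] -= sz
            if book[s][p] == 0:
                del book[s][p]  # drop empty price levels
    return book
```

Once the book exists, analytics such as spread, depth or imbalance measures are simple reads against the price-level dictionaries.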

That optimised code is then executed in parallel for each security in the market, with only a small subset requiring a lengthy order book construction phase. Once an order book has been constructed for each security, then analytics can be run.
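Because each security’s order book is independent of every other’s, the per-security work is embarrassingly parallel. The sketch below fans that work out with Python’s standard multiprocessing module – the trivial “analytics” (counting distinct price levels) is a stand-in for the real computation, and the function names are assumptions:

```python
from multiprocessing import Pool

def analyze_security(args):
    """Build-and-analyse step for one security (placeholder analytics)."""
    symbol, messages = args
    # messages are (action, order_id, side, price, size) tuples;
    # here we just count distinct price levels touched.
    price_levels = len(set(m[3] for m in messages))
    return symbol, price_levels

def run_parallel(messages_by_symbol, workers=4):
    """Map the per-security job across a pool of worker processes."""
    with Pool(workers) as pool:
        return dict(pool.map(analyze_security, messages_by_symbol.items()))
```

On a supercomputer the same fan-out would be done across nodes (e.g. via MPI or a batch scheduler) rather than a single machine’s process pool, but the decomposition – one independent task per security – is the same.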

Previous research by members of the team and others – using just Blacklight – into the impact on overall volume of the non-reporting of ‘odd lot’ trades, which are commonly executed by HFT strategies, suggested that nearly 5% of volume was going unreported. Partly as a result of that research, regulators plan to introduce reporting of such trades this October.

The deployment of supercomputing technology – especially parallel processing and data intensive/in-memory computing – is likely to become more common for such functions as back-testing and strategy construction, as trading firms adopt intelligent approaches that are not simply dependent on low-latency execution.
