The knowledge platform for the financial technology industry

A-Team Insight Blogs

Chicago’s Mini-Prop Shop Boom Creates a Boon for Low-Latency Providers


The Volcker Rule, part of the larger Dodd-Frank Wall Street Reform and Consumer Protection Act, is an attempt to bar banks from proprietary trading – making trades for their own benefit – with customer funds. Many banks have pushed back against the proposed rule, citing a threat to some of their most profitable activities. In response, the rule has ballooned to almost 300 pages of exemptions and loopholes. After initial fears that the lobbying would delay its completion, a target date of July 2012 has now been set.

In preparation for the new deadline, banks are already disposing of their proprietary trading arms. Cities like Chicago, in particular, are seeing a boom in the creation of new, independent prop shops in the face of regulation. The region is heavily focused on fixed income and commodity derivatives, and market-making is a driving force in the local financial landscape. Add to this an abundance of top schools turning out math, science and engineering students who want to stay in the area and have a proclivity toward investment strategy and you have the perfect climate for a new cottage industry.

New Firms; New Low-Latency Initiatives

In order to compete with established players like Wolverine Trading and Spot Trading, the first challenge for new firms staffed with refugees from the big banks will be making the transition from defining strategies for a large bank to defining them for a small firm. Is high-frequency trading, algorithmic trading or another kind of strategy the way to go? And how much do network performance and computing speed factor into success?

The speed required to be successful in the high-frequency trading space makes it a game reserved for holders of the fastest technology. Some say that anything over 500 microseconds is subpar, and that to be truly competitive, a system needs to be in and out of the marketplace within 80 microseconds. However, a trade process is only as fast as its slowest component. So if, for example, your matching engine is running at sub-130 nanoseconds but your database can’t run your algorithms fast enough to make real-time trading decisions, there is no advantage to be gained.
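The slowest-component point can be made concrete with a simple latency-budget sketch. The stage names and all figures below are hypothetical (only the 80-microsecond budget and the sub-130-nanosecond matching engine come from the discussion above); the point is that end-to-end tick-to-trade latency is the sum of every stage, so one slow stage erases gains made elsewhere.

```python
# Hypothetical tick-to-trade pipeline stages with per-stage latency
# in microseconds. Only the matching-engine figure and the 80 us
# budget are drawn from the text; everything else is illustrative.
pipeline_us = {
    "market_data_feed_handler": 5.0,
    "matching_engine": 0.13,           # sub-130 ns, as cited above
    "strategy_database_lookup": 95.0,  # the slow component in this sketch
    "order_gateway": 4.0,
}

COMPETITIVE_BUDGET_US = 80.0  # "in and out within 80 microseconds"

# End-to-end latency is the sum of all stages; the bottleneck is the
# single slowest stage.
total = sum(pipeline_us.values())
bottleneck = max(pipeline_us, key=pipeline_us.get)

print(f"end-to-end: {total:.2f} us (budget {COMPETITIVE_BUDGET_US} us)")
print(f"bottleneck: {bottleneck} at {pipeline_us[bottleneck]} us")
# Even with a nanosecond-class matching engine, the database lookup
# pushes the round trip well past the 80 us budget.
```

Under these assumed numbers, the nanosecond-class matching engine contributes almost nothing to the total; shaving the database stage is the only change that would bring the round trip under budget.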

The underlying trading strategies have to be as fast as the equipment and the only way to make money with high-frequency trading is to be first to market. Nanoseconds must be shaved off wherever they can be and new low-latency trading vendors continue to emerge with products they claim will coax more speed, better performance and increased reliability from your computing platforms.

Why Here, Why Now?

This wave of new firms, like Akuna Capital and GGT Trading, is not naïve for starting this kind of new business in a questionable economy. These individuals are well-versed in electronic trading and they are simply replicating their models for success in a different environment. They are made up of confident investors and talented people from top universities who are capitalising on the local climate, and on the number of financial firms moving to the area for business continuity reasons or to augment their existing connectivity options, whether through proximity to CME Group and its related venues or through additional data centers of their own.

They have also tried and tested many of the best latency-lowering solutions available courtesy of the giant budgets of their former employers, so they are very qualified to make technology purchasing decisions and know how to optimise their architectures to gain competitive advantage.

Historically, Chicago’s proprietary trading shops have leaned more heavily toward options trading than those in other regions. This is, in part, due to derivatives traders in Chicago who left the floor of the exchange to set up their own electronic trading shops. Most of the existing proprietary trading firms in Chicago today started out this way. Some were purchased and rolled into bigger institutions, and now, as the Volcker Rule forces banks to disband their proprietary trading groups, we see that wave continuing as these firms go out on their own. They are creating a new wave of innovation and demand for high-volume trade data analysis to help them develop new strategies.

In addition to considering their low-latency architecture carefully, they need to develop their trading strategies and consider how they will process the vast amounts of data their quants will have to manage. When considering data volume, there is a hierarchy based on the asset class. Treasuries require the lowest volumes of data, followed by fixed income, then equities and finally, equity options.
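The asset-class hierarchy above can be sketched as a quick capacity-planning exercise. All the message counts and the average message size below are invented for illustration; only the relative ordering (treasuries lightest, equity options heaviest) comes from the text.

```python
# Hypothetical daily message counts per asset class. The figures are
# invented; only the ordering reflects the hierarchy described above.
daily_messages = {
    "treasuries": 2_000_000,
    "fixed_income": 10_000_000,
    "equities": 500_000_000,
    "equity_options": 4_000_000_000,
}

BYTES_PER_MESSAGE = 64  # assumed average encoded tick size

# Rank asset classes from lightest to heaviest and estimate the raw
# daily storage each would require at the assumed message size.
for asset, count in sorted(daily_messages.items(), key=lambda kv: kv[1]):
    gb_per_day = count * BYTES_PER_MESSAGE / 1e9
    print(f"{asset:>14}: {count:>13,} msgs ~ {gb_per_day:.1f} GB/day")
```

Even with these made-up numbers, the sketch shows why a firm’s choice of asset class dictates its data-processing architecture: an equity-options strategy can face orders of magnitude more data per day than a treasuries strategy.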

Quant Trader as CEO

With traders traditionally in quant roles now finding themselves CEOs, these firms have the unique advantage that the people making the business decisions actually understand the technology. The IT departments at large firms are usually motivated to centralise technology, hardware storage and processing all in one place. In contrast, newer firms aren’t beholden to this scenario so they can make decisions about data processing and storage based on what makes the most sense for each specific trading application. From this vantage point, they can build a more optimised technology infrastructure and act quickly due to the lack of red tape that came with being part of a bigger firm. With this new hybrid C-level marrying the technology savvy of a CTO with the strategic thinking of a quant trader at the helm, these new firms are poised to define strategy for the industry and continue to develop Chicago into a more powerful North American financial base.

