Intelligent trading architecture is emerging as technology developments support a move beyond trading systems based purely on speed and allow a focus on fast yet more intelligent solutions. The challenges and opportunities of implementing an intelligent trading architecture were discussed at last week’s A-Team Group Intelligent Trading Summit in New York, where conference chair Pete Harris was joined by experts from OneMarketData, Tibco Software and Quartet Financial Systems.
Alessandro Petroni, senior principal architect, financial services, at Tibco expanded on his earlier keynote presentation, saying the focus of new trading platforms would be on integrating them across all lines of business. Louis Lovas, director of solutions at OneMarketData, provided an example of this on the asset management side of the industry, where technologies such as predictive analytics that have been used on the quant side for some time are being used in an effort to cut costs.
The transition of functions from end of day to real time was also acknowledged by panelists as essential to future trading architectures. Jean Safar, chief technology officer at Quartet FS, noted that back-office functions had furthest to go in this respect. Looking at the components that are being integrated into new trading systems, the panel agreed that the emphasis must be on toolsets that are used to build models. Lovas said model creation and quant research will be key components to evolving these systems, while Safar said the emphasis needs to be on unifying individual components to reduce not only latency, but also computation issues that reduce the agility of systems as a whole. Petroni noted the need to move towards streaming analytics to support early processing and closer to real-time visualisation of trading.
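Petroni's point about streaming analytics, processing each event as it arrives rather than in an end-of-day batch, can be illustrated with a minimal sketch. The example below is hypothetical and not any panelist's implementation: it maintains a volume-weighted average price (VWAP) incrementally per tick, so the current value is always available for near-real-time visualisation.

```python
class StreamingVWAP:
    """Updates volume-weighted average price incrementally on each tick,
    rather than recomputing it from an end-of-day batch."""

    def __init__(self):
        self.pv = 0.0   # cumulative price * volume
        self.vol = 0    # cumulative volume

    def update(self, price, volume):
        self.pv += price * volume
        self.vol += volume
        return self.pv / self.vol  # VWAP as of this tick

# Usage with illustrative tick data (price, volume)
vwap = StreamingVWAP()
for price, volume in [(100.0, 200), (101.0, 100), (99.5, 300)]:
    current = vwap.update(price, volume)
```

The same idea generalises to other rolling metrics: the state carried between ticks stays small, so the computation keeps pace with the stream instead of deferring work to a batch window.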
Considering the complexity of managing vast quantities of data in the trading environment, Harris asked how firms will transform the data into a more useable form. Safar suggested a feedback loop that allows network data, trades and risk analytics to be integrated so that all pre- and post-trade data is kept together within the system.
Petroni broadened the data management concern, saying that one of the new challenges firms would face would be how to tackle unstructured data, such as LinkedIn posts, Tweets, or Facebook updates that could potentially contain valuable information if firms could filter out other ‘noise’. Lovas said OneMarketData had researched this area and referenced a recent company white paper that noted bank interest in trying to capture social sentiment, but the difficulty of addressing false positives.
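The filtering problem the panel described can be sketched in miniature. The keyword lists and threshold below are purely illustrative assumptions, not anything OneMarketData or the white paper proposed; production systems use trained models, and as Lovas noted, false positives remain the hard part. The threshold is a crude way to discard low-confidence posts as noise.

```python
# Hypothetical keyword-based sentiment scoring for social posts.
# Word lists and threshold are illustrative assumptions only.
POSITIVE = {"beat", "upgrade", "surge", "bullish"}
NEGATIVE = {"miss", "downgrade", "plunge", "bearish"}

def score_post(text):
    """Return a sentiment score in [-1.0, 1.0]; 0.0 means no signal."""
    words = set(text.lower().split())
    pos = len(words & POSITIVE)
    neg = len(words & NEGATIVE)
    total = pos + neg
    if total == 0:
        return 0.0  # no sentiment keywords: treat as noise
    return (pos - neg) / total

def filter_signals(posts, threshold=0.5):
    """Keep only posts whose sentiment clears a confidence threshold,
    trading recall for fewer false positives."""
    return [p for p in posts if abs(score_post(p)) >= threshold]
```

Raising the threshold suppresses more false positives at the cost of missing genuine signals, which is the trade-off the panel identified.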
Delving deeper into the debate and taking questions from the audience, the panel considered the downsides of trading automation. After a moment of humour about the Flash Boys controversy and turning to serious concerns regarding Knight Capital and the 2010 Flash Crash, the panel concurred that the main downside of trading developments is how to mitigate risk. Lovas highlighted firms’ fears of rogue algorithms and noted the pressure to create robust systems that will not be affected by the next catastrophe. Petroni said that, in essence, the problem is that while firms have been putting a lot of effort into engineering faster systems, the effort put into monitoring those systems has been minimal by comparison.
After addressing these concerns, the panel responded to an audience question about the agility of a firm, and whether this is represented by the technology underlying trading platforms, or by a firm’s methodology and the way it structures its teams. Safar said that in either case, the siloed model of doing business was counter-productive to achieving overall agility. In future, firms would need not just better communication among their teams, but also the ability to deploy systems anywhere and access them at any time. Lovas agreed, saying that agility is about how firms respond to competition, integrating new code-writing or toolset developments quickly and back testing them sufficiently before deploying them across a highly scalable infrastructure.
Finally, the experts took a look at the challenges ahead and how intelligent trading might evolve. Safar suggested firms are moving on to evolve their data and computation together, using languages such as Python and running many processing cores in sync to take advantage of system intelligence. Lovas argued that measurable performance would be the focus of many firms, with benchmarks and performance analysis in the trade lifecycle process critical in the face of competition.
With the focus of discussion revolving around the evolution of intelligent trading architectures, Petroni concluded that such architecture would ultimately be able to adapt and morph as quickly as markets move. The challenges, he said, would be in managing complexity and keeping the trading architecture agile and open to future development.