State-of-the-art technologies such as hardware acceleration and in-memory databases increase processing speeds and support high-performance analytics, making them a first choice for some, but not all, trading environments.
Moderating debate about the promise of big data and high performance analytics technologies at the A-Team Group Data Management Summit in London, A-Team chief content officer Andrew Delaney questioned a panel of experts – two from technology vendors and two from technology consumers – about what technologies are available and which are appropriate for data management.
Patrick Scateni, vice president of sales and marketing at Ciara Technologies, described all types of acceleration technology as beneficial, a point with which Stuart Grant, EMEA business development manager for financial services at SAP, agreed. Grant commented: “Any technology that helps the acceleration of decision making is useful, perhaps in-memory techniques, better communication and collapsed workflows.”
Turning to the technology consumers, Delaney asked them about their approach to big data. Tony Chau, executive director and lead architect, IB CTO at UBS Investment Bank, provided a use case for big data management in the form of Basel Committee regulation BCBS 239. He said: “For BCBS 239, high performance computing is needed and we are using graphics processing units (GPUs) to calculate risk quickly. Using in-memory technology we can also slice and dice data to see many views of risk.”
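The “slice and dice” idea Chau describes can be illustrated with a minimal sketch: once position-level risk data sits in memory, the same records can be re-aggregated along any dimension without reloading them. The data, field names and figures below are hypothetical, not drawn from UBS systems.

```python
from collections import defaultdict

# Hypothetical position-level risk records held in memory.
positions = [
    {"desk": "rates",  "region": "EMEA", "var": 1.2},
    {"desk": "rates",  "region": "APAC", "var": 0.8},
    {"desk": "credit", "region": "EMEA", "var": 2.1},
]

def risk_by(dimension, records):
    """Aggregate value-at-risk along one dimension -- one 'view' of risk."""
    totals = defaultdict(float)
    for record in records:
        totals[record[dimension]] += record["var"]
    return dict(totals)

# The same in-memory dataset yields different views on demand.
print(risk_by("desk", positions))    # risk viewed by trading desk
print(risk_by("region", positions))  # risk viewed by region
```

The point of the sketch is that each “view” is just a re-aggregation of data already in memory, which is why in-memory technology makes this cheap compared with re-querying a disk-based store.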
Richard Bell, fixed income trading manager with responsibility for latency and performance at BNP Paribas, said the term ‘big data’ is, in itself, unhelpful, but added: “There are large unstructured databases such as Cassandra and Mongo that could be described as big data technologies.” Proposing possibilities for managing ever-increasing volumes of data, he forecast the coexistence of Cassandra, Mongo and Hadoop with traditional databases, but warned: “Don’t rush to new technologies just because they are shiny and remember it is not technology but data that is important.”
On the question of high speed technologies, Bell said trading is all about speed, but in different contexts. He explained: “Consider scale of speed. For example, risk reporting for an exotics portfolio is a struggle to complete every day, but a large deal capture platform could achieve speeds down to microseconds. The need is to identify appropriate speed and invest accordingly.” Chau added: “Speed is important, but equally important is throughput so that large volumes of data can be analysed on the fly.”
In terms of technology provision and use, Scateni noted a proliferation of hardware acceleration devices in the market, some deployed to accelerate poorly written code, and others based on field programmable gate arrays (FPGAs). Grant said that over the past few years, in-memory technology has become economically viable for larger business problems. He explained: “Trading transactions and analytics can now be brought together in in-memory solutions. Beyond this, we are looking at FPGAs as a means of moving data to in-memory processing more quickly.”
From a user perspective, Chau said: “We have a number of toys in our toy box. We use GPUs for high performance computing and we also use hardware, software and data technologies from the web to deliver massive parallel computing. Since the crisis, there has been more flow-based tracking. The edge here is analytics, so money spent on analytics is money well spent.”
At BNP Paribas, Bell is looking less at hardware developments and more at solving problems caused by poorly written code. He explained: “When we profile a piece of code, we use tools to see what it does. If it is doing well on input/output and network connections, we might want to speed it up with hardware acceleration. Technology is about the right tools for the job and presenting use cases for investment that will improve the business.”
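The profiling step Bell describes can be sketched with Python’s standard tools: run a workload under a profiler, then rank functions by time spent to see where the hotspot actually is before arguing for hardware investment. The workload function below is a hypothetical stand-in, not BNP Paribas code.

```python
import cProfile
import io
import pstats

def slow_sum(n):
    """Hypothetical compute-bound workload standing in for real trading code."""
    total = 0
    for i in range(n):
        total += i * i
    return total

# Profile the workload to see what it actually does.
profiler = cProfile.Profile()
profiler.enable()
slow_sum(200_000)
profiler.disable()

# Rank functions by cumulative time; the top entries are the hotspots
# that would justify (or rule out) spending on acceleration.
stream = io.StringIO()
pstats.Stats(profiler, stream=stream).sort_stats("cumulative").print_stats(5)
print(stream.getvalue())
```

A profile dominated by a compute loop like this one suggests faster hardware or better code; a profile dominated by I/O or network calls points to a different fix, which is the distinction Bell draws.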
You can listen to a full recording of the session here.