
Data Management Summit: Trending Technologies


State-of-the-art technologies such as hardware acceleration and in-memory databases increase processing speeds and support high performance analytics, making them a first choice for some, but not all, trading environments.

Moderating debate about the promise of big data and high performance analytics technologies at the A-Team Group Data Management Summit in London, A-Team chief content officer Andrew Delaney questioned a panel of experts – two from technology vendors and two from technology consumers – about what technologies are available and which are appropriate for data management.

Patrick Scateni, vice president of sales and marketing at Ciara Technologies, described all types of acceleration technology as beneficial, a point echoed by Stuart Grant, EMEA business development manager for financial services at SAP. Grant commented: “Any technology that helps the acceleration of decision making is useful, perhaps in-memory techniques, better communication and collapsed workflows.”

Turning to the technology consumers, Delaney asked them about their approach to big data. Tony Chau, executive director and lead architect, IB CTO at UBS Investment Bank, provided a use case for big data management in the form of Basel Committee regulation BCBS 239. He said: “For BCBS 239, high performance computing is needed and we are using graphics processing units (GPUs) to calculate risk quickly. Using in-memory technology we can also slice and dice data to see many views of risk.”

Richard Bell, fixed income trading manager with responsibility for latency and performance at BNP Paribas, said the term ‘big data’ is, in itself, unhelpful, but added: “There are large unstructured databases such as Cassandra and Mongo that could be described as big data technologies.” Proposing possibilities for managing ever-increasing volumes of data, he forecast the coexistence of Cassandra, Mongo and Hadoop with traditional databases, but warned: “Don’t rush to new technologies just because they are shiny and remember it is not technology but data that is important.”
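
The coexistence Bell forecasts is essentially polyglot persistence: well-structured reference data stays on a traditional relational database while semi-structured, fast-changing data lands in a document or wide-column store. The snippet below is a minimal, hypothetical sketch of that split in Python, assuming a local MongoDB instance and using SQLite as the stand-in relational store; it is not drawn from BNP Paribas’s architecture.

```python
# Hypothetical sketch of "coexistence": reference data in a relational store,
# flexible event documents in MongoDB. Assumes a MongoDB server on localhost.
import sqlite3
from pymongo import MongoClient

# Traditional relational store for well-structured reference data.
rdb = sqlite3.connect(":memory:")
rdb.execute("CREATE TABLE instrument (isin TEXT PRIMARY KEY, name TEXT)")
rdb.execute("INSERT INTO instrument VALUES (?, ?)", ("XS0000000001", "Example 5Y bond"))
rdb.commit()

# Document store for semi-structured, high-volume data.
events = MongoClient("mongodb://localhost:27017")["demo"]["trade_events"]
events.insert_one({
    "isin": "XS0000000001",
    "type": "quote",
    "payload": {"bid": 99.82, "ask": 99.90, "venue": "ExampleMTF"},
})

# Each store answers the queries it is good at.
name = rdb.execute(
    "SELECT name FROM instrument WHERE isin = ?", ("XS0000000001",)
).fetchone()
latest = events.find_one({"isin": "XS0000000001"}, sort=[("_id", -1)])
print(name, latest["payload"])
```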

On the question of high speed technologies, Bell said trading is all about speed, but in different contexts. He explained: “Consider scale of speed. For example, risk reporting for an exotics portfolio is a struggle to complete every day, but a large deal capture platform could achieve speeds down to microseconds. The need is to identify appropriate speed and invest accordingly.” Chau added: “Speed is important, but equally important is throughput so that large volumes of data can be analysed on the fly.”

In terms of technology provision and use, Scateni noted a proliferation of hardware acceleration devices in the market, some put in place to accelerate poor coding, and some being field programmable gate arrays (FPGAs). Grant said that over the past few years, in-memory technology has become economically viable for larger business problems. He explained: “Trading transactions and analytics can now be brought together in in-memory solutions. Beyond this, we are looking at FPGAs as a means of moving data to in-memory processing more quickly.”

From a user perspective, Chau said: “We have a number of toys in our toy box. We use GPUs for high performance computing and we also use hardware, software and data technologies from the web to deliver massive parallel computing. Since the crisis, there has been more flow-based tracking. The edge here is analytics, so money spent on analytics is money well spent.”

At BNP Paribas, Bell is looking less at hardware developments and more at solving problems caused by poorly written code. He explained: “When we profile a piece of code, we use tools to see what it does. If it is doing well on input/output and network connections, we might want to speed it up with hardware acceleration. Technology is about the right tools for the job and presenting use cases for investment that will improve the business.”
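
Bell did not name specific tools, but the workflow he describes, profile first, then decide between fixing the code and buying faster hardware, can be illustrated with Python’s built-in cProfile. The example below is generic, not a representation of BNP Paribas’s environment.

```python
# Minimal profiling sketch (not BNP Paribas's tooling): profile a function,
# then inspect where time is spent before deciding whether faster hardware
# or better code is the right fix.
import cProfile
import io
import pstats


def slow_report(n=200_000):
    # Deliberately naive aggregation standing in for a "poorly written" hot path.
    totals = {}
    for i in range(n):
        key = i % 50
        totals[key] = totals.get(key, 0) + i * 0.0001
    return totals


profiler = cProfile.Profile()
profiler.enable()
slow_report()
profiler.disable()

# Print the ten most expensive calls by cumulative time.
out = io.StringIO()
pstats.Stats(profiler, stream=out).sort_stats("cumulative").print_stats(10)
print(out.getvalue())
```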

You can listen to a full recording of the session here.

