
Talking Intelligent Trading with Andrew Delaney: The Convergence Continues

More evidence this week that market practitioners are embracing intelligence in trading with news that a major global investment bank has implemented a risk and data management platform that appears to make a significant step toward the Holy Grail of incorporating enterprise analytics into the pre-trade decision-making process.

This is precisely the kind of thing we’ll be talking about at our Intelligent Trading Summit in New York on May 13. Indeed, keynotes from Fidessa’s Steve Grob and Tibco’s Alessandro Petroni will offer views on best approaches. And the day’s kickoff panel session will look at the challenges of Implementing an Intelligent Trading Architecture.

This latest use-case example, though, appears to point to even further convergence: a kind of melding of the front- and middle-/back-office data architectures to bring the full might of the bank’s internal intelligence to bear on trading decisions.

While we don’t know the identity of the bank in question, we do know that the enabling data management platform makes use of in-memory database technology from Quartet FS. The implementation of Quartet FS’s ActivePivot analytical and transactional processing system is yielding new insight into key data points, allowing more accurate and timely trade decision-making.

According to Georges Bory, managing director at Quartet FS, the bank’s existing risk management and analytics platforms were being stretched by massive volumes of data – 20 billion data points over a 10-day period – with the result that managers could access only fully aggregated reports generated by inflexible queries. In the wake of trading scandals at the likes of JP Morgan and Societe Generale, the worry was that these reports could be ‘gamed’ by those who understood the parameters of the queries, potentially exposing the bank to fraudulent activity.

With more than 20 billion figures processed over a 10-day period, volumes were high enough to cause difficulties for even the most advanced analytics engines. Market risk analysis at the bank was restricted to what the existing technology could cope with: a fully aggregated, inflexible reporting mechanism that hampered real-time analysis.

According to Quartet FS, the key challenges were:

  • Create a market risk analysis system that lets analysts not only see current aggregated risk in an easily consumable format, but also retain enough granular detail to support identification of anomalies in single data streams.
  • Create a pinpoint/alert mechanism covering all metrics that highlights unplanned events, so that a manual search through the data is not needed when a figure breaches its normal fluctuation limits (see the sketch after this list).
  • Give analysts the ability to drill into the data around an error to trace the origin of the anomaly, without being limited in the scope of the analysis or in the level of detail.
  • Introduce the ability to create new metrics on the fly as the analysis of an anomaly progresses.
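
To make the alerting idea concrete, here is a minimal, hypothetical Python sketch of that kind of pinpoint/alert mechanism. ActivePivot is a proprietary platform, so nothing here reflects its actual API; the window size and sigma threshold are invented for illustration. A metric stream is checked against rolling fluctuation limits, and breaches are flagged automatically rather than hunted for by hand.

```python
import statistics

def flag_anomalies(values, window=20, n_sigmas=4):
    """Flag points that deviate from recent fluctuation limits.

    Returns (index, value) pairs falling outside mean +/- n_sigmas
    standard deviations of the trailing window: a stand-in for the
    automatic pinpoint/alert mechanism described above.
    """
    alerts = []
    for i in range(window, len(values)):
        recent = values[i - window:i]
        mu = statistics.fmean(recent)
        sigma = statistics.stdev(recent)
        if sigma and abs(values[i] - mu) > n_sigmas * sigma:
            alerts.append((i, values[i]))
    return alerts

# Example: a quiet metric stream with one injected shock.
stream = [100 + (i % 5) * 0.3 for i in range(60)]
stream[45] = 180.0
print(flag_anomalies(stream))  # -> [(45, 180.0)]
```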

The bank’s solution, according to Bory, was to deploy ActivePivot as a pre-aggregation system that uses a scoring system to identify potential problems before they reach existing risk and trade-processing systems. The rules of the scoring system are set by the internal controller on an ad hoc basis, rather than relying on standard queries, which can be gamed by anyone with knowledge of the system.
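
The scoring idea is easy to illustrate. The sketch below is a toy Python stand-in, not Quartet FS code: the rule contents and the review threshold are invented. The point it demonstrates is the one Bory makes: because the controller supplies rules as plain functions that can be swapped at any time, there is no fixed query shape for an insider to game.

```python
from dataclasses import dataclass

@dataclass
class Trade:
    trader: str
    notional: float
    booked_after_hours: bool

# Ad hoc rules supplied by the controller: each maps a trade to a score.
# Hypothetical examples; real rules would reflect the bank's own controls.
rules = [
    lambda t: 50 if t.notional > 10_000_000 else 0,
    lambda t: 30 if t.booked_after_hours else 0,
]

REVIEW_THRESHOLD = 60

def score(trade: Trade) -> int:
    """Accumulate the trade's score across all active rules."""
    return sum(rule(trade) for rule in rules)

trades = [
    Trade("desk_a", 2_000_000, False),
    Trade("desk_b", 25_000_000, True),
]
for t in trades:
    verdict = "hold for review" if score(t) >= REVIEW_THRESHOLD else "pass through"
    print(t.trader, score(t), verdict)
```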

The result is a data analysis and management engine that can present fast-moving streams of data at any level of granularity, enabling a complete view of all positions held, from full portfolio level down to a single transaction. Analysts can also create limit-breach alerts to monitor for unexpected events.
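
In spirit, that drill-down capability is just re-aggregation of the same transaction-level records at whatever level the analyst asks for. A rough pandas sketch with an invented schema makes the point; it mimics the effect (one granular dataset, any aggregation level on demand), not ActivePivot’s actual cube model.

```python
import pandas as pd

# Transaction-level records kept at full granularity (invented schema).
txns = pd.DataFrame({
    "portfolio": ["rates", "rates", "fx", "fx"],
    "trader":    ["alice", "bob",   "bob", "carol"],
    "exposure":  [1.2e6,   -0.4e6,  2.5e6, 0.8e6],
})

# The same data viewed at three levels of one hierarchy: no
# pre-aggregation step locks the analyst into a single report shape.
print(txns["exposure"].sum())                       # firm-wide total
print(txns.groupby("portfolio")["exposure"].sum())  # per portfolio
print(txns)                                         # single transactions
```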

The impetus for the project, Bory says, came from board level, although the methodology of the implementation was set by the risk department. The workflow now in place allows the bank to kill several birds with one stone: financial controllers and compliance officers get the validation they need to ensure the bank is playing by the rules set by regulators, and business users get more, higher-quality data for use in their trading decision-making. All exceptions are flagged and dealt with before information is passed along to business users.

Bory reckons ActivePivot’s use of in-memory technology was a key factor in winning the deal: the huge quantities of data involved needed to be fed through complex risk calculations very rapidly.

Traders working on different portfolios may use different Greeks, or sensitivity analyses, to measure and manage risk, and the bank needed to be sure they were working with consistent data across these different models. Ultimately, the bank decided that alternative approaches such as NoSQL databases weren’t up to the job, given the need to process these calculations as fast as possible.
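
The consistency point is worth spelling out: if every desk derives its sensitivities from the same in-memory snapshot, two desks quoting different Greeks are at least disagreeing about models, not about inputs. A hypothetical Python sketch of that pattern follows; the finite-difference delta and the two toy positions are purely illustrative.

```python
# One shared in-memory snapshot of market data; every desk's Greeks are
# derived from it, so differences between desks reflect model choice,
# not stale or divergent inputs. (Illustrative only, not ActivePivot.)
snapshot = {"EURUSD": 1.0850}

def delta(price_fn, spot: float, bump: float = 1e-4) -> float:
    """Finite-difference delta of a position with respect to spot."""
    return (price_fn(spot + bump) - price_fn(spot - bump)) / (2 * bump)

def desk_linear(s: float) -> float:
    return 1_000_000 * s                    # cash FX position

def desk_option(s: float) -> float:
    return max(s - 1.08, 0.0) * 5_000_000   # intrinsic-value proxy

spot = snapshot["EURUSD"]
print(delta(desk_linear, spot))  # ~1,000,000
print(delta(desk_option, spot))  # ~5,000,000 (option is in the money)
```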

This all makes great grist for the mill for the debate on May 13. I hope you can join us.
