The knowledge platform for the financial technology industry

A-Team Insight Blogs

Talking Intelligent Trading with Andrew Delaney: The Convergence Continues


More evidence this week that market practitioners are embracing intelligence in trading, with news that a major global investment bank has implemented a risk and data management platform that appears to take a significant step toward the Holy Grail of incorporating enterprise analytics into the pre-trade decision-making process.

This is precisely the kind of thing we’ll be talking about at our Intelligent Trading Summit in New York on May 13. Indeed, keynotes from Fidessa’s Steve Grob and Tibco’s Alessandro Petroni will offer views on best approaches. And the day’s kickoff panel session will look at the challenges of Implementing an Intelligent Trading Architecture.

This latest use-case example, though, appears to point to even further convergence: a kind of melding of the front- and middle-/back-office data architectures to bring the full might of the bank’s internal intelligence to bear on trading decisions.

While we don’t know the identity of the bank in question, we do know that the enabling data management platform makes use of in-memory database technology from Quartet FS. The implementation of Quartet FS’s ActivePivot analytical and transactional processing system is yielding new insight into key data points, allowing more accurate and timely trade decision-making.

According to Georges Bory, managing director at Quartet FS, the bank’s existing risk management and analytics platforms were being stretched by massive volumes of data – 20 billion data points over a 10-day period – with the result that managers could access only fully aggregated reports that were generated by inflexible inquiries. In the wake of trading scandals at the likes of JP Morgan and Societe Generale, the worry was that these reports could be ‘gamed’ by those who understood the parameters of the inquiries, potentially exposing the bank to fraudulent activity.

Such volumes were high enough to cause difficulties for even the most advanced analytics engines. Market risk analysis at the bank was therefore restricted to what the existing technology could cope with: a fully aggregated, inflexible reporting mechanism that hampered real-time analysis.

According to Quartet FS, the key challenges were:

  • Create a market risk analysis system that lets analysts not only see current aggregated risk in an easily consumable format, but also retain enough granular detail to support identification of anomalies in single data streams.
  • Create a pinpoint/alert mechanism that highlights unplanned events across all metrics, eliminating the need for a manual search through the data when a value deviates from its normal fluctuation limits.
  • Give analysts the ability to drill into the data around an error to find the origin of the anomaly, without being limited in the scope of the analysis or in the level of detail.
  • Introduce the ability to create new metrics on-the-fly as analysis into anomalies progresses.
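The alerting requirement above can be illustrated with a minimal sketch. This is not Quartet FS code; it simply assumes that "normal fluctuation limits" are defined by a rolling mean-and-standard-deviation band over recent values, so that deviations surface automatically rather than through a manual search:

```python
from statistics import mean, stdev

def flag_anomalies(stream, window=5, n_sigmas=3.0):
    """Flag points that break out of a rolling band of normal fluctuation.

    Hypothetical illustration only: each new value is compared against
    the mean +/- n_sigmas * stdev of the preceding `window` values.
    Returns a list of (index, value) pairs that breached the band.
    """
    alerts = []
    for i in range(window, len(stream)):
        history = stream[i - window:i]
        mu, sigma = mean(history), stdev(history)
        if abs(stream[i] - mu) > n_sigmas * sigma:
            alerts.append((i, stream[i]))
    return alerts

# A single spike in an otherwise stable stream is flagged automatically.
alerts = flag_anomalies([100, 101, 99, 100, 102, 250, 101])
```

In a production system the band would be one controller-defined rule among many, but the principle is the same: the system pinpoints the deviation so the analyst starts from the anomaly rather than the raw data.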

The bank’s solution, according to Bory, was to deploy ActivePivot as a pre-aggregation system that uses a scoring system to identify potential problems before they reach existing risk and trade-processing systems. The rules of the scoring system are set by the internal controller on an ad-hoc basis, rather than relying on standard queries, which can be gamed by anyone with knowledge of the system.
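The shape of such a scoring step can be sketched as follows. The rule set, weights, and field names here are illustrative assumptions, not the bank's or Quartet FS's actual rules; the point is that the controller can swap rules in and out ad hoc, so no fixed query exists for an insider to game:

```python
# Hypothetical pre-aggregation scoring sketch: each trade accumulates a
# score from controller-defined rules, and high-scoring trades are held
# for review before reaching downstream risk and processing systems.

def score_trade(trade, rules):
    """Sum the weights of every rule predicate the trade trips."""
    return sum(weight for predicate, weight in rules if predicate(trade))

# Rules are set ad hoc by the internal controller (illustrative examples).
rules = [
    (lambda t: t["notional"] > 1_000_000, 5),            # unusually large ticket
    (lambda t: t["book"] not in t["trader_books"], 10),  # trade outside own books
]

def triage(trades, rules, threshold=5):
    """Split trades into (passed, held_for_review) by score."""
    passed, held = [], []
    for t in trades:
        (held if score_trade(t, rules) >= threshold else passed).append(t)
    return passed, held
```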

The result is a data analysis and management engine that can present fast moving streams of data at any level of granularity, enabling a complete view of all positions held from full portfolio level down to a single transaction. Analysts can also create limit breach alerts to monitor for unexpected events.
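That "any level of granularity" idea amounts to rolling the same transaction-level records up through a hierarchy on demand. A minimal sketch, with invented field names and a toy P&L measure standing in for the bank's real risk metrics:

```python
from collections import defaultdict

# Hypothetical drill-down sketch: the same transaction-level records can
# be aggregated at any level of a (portfolio, desk, trade) hierarchy,
# from full portfolio view down to a single transaction.

transactions = [
    {"portfolio": "Rates", "desk": "Swaps",   "trade_id": "T1", "pnl": 120.0},
    {"portfolio": "Rates", "desk": "Swaps",   "trade_id": "T2", "pnl": -40.0},
    {"portfolio": "Rates", "desk": "Futures", "trade_id": "T3", "pnl": 15.0},
]

def rollup(records, *levels):
    """Aggregate pnl at the requested levels of the hierarchy."""
    totals = defaultdict(float)
    for r in records:
        key = tuple(r[level] for level in levels)
        totals[key] += r["pnl"]
    return dict(totals)

# Start at portfolio level, then drill down a level when something looks off.
by_portfolio = rollup(transactions, "portfolio")
by_desk = rollup(transactions, "portfolio", "desk")
```

An in-memory engine such as ActivePivot keeps every granularity reachable without re-running a batch job, which is what makes this kind of interactive drill-down practical at 20-billion-point scale.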

The impetus for the project, Bory says, came from board level, although the methodology of the implementation was set by the risk department. The workflow put in place allows the bank to kill several birds with one stone: financial controllers and compliance officers get the validation they need to ensure the bank is playing by the rules set by regulators, and business users get more, higher-quality data for use in their trading decision-making. All exceptions are flagged and dealt with before information is passed along to business users.

Bory reckons ActivePivot’s use of in-memory technology was a key factor in winning the deal, since the huge quantities of data involved needed to be applied to complex risk calculations very rapidly.

Traders working on different portfolios may be using different Greeks, or sensitivity analyses, to measure and manage risk, and the bank needed to be sure they were working with consistent data across these different models. Ultimately, the bank decided that alternative solutions like NoSQL weren’t up to the job, given the need to process these calculations as fast as possible.

This all makes great grist for the mill for the debate on May 13. I hope you can join us.

