
Talking Intelligent Trading with Andrew Delaney: The Convergence Continues


More evidence this week that market practitioners are embracing intelligence in trading, with news that a major global investment bank has implemented a risk and data management platform that appears to take a significant step toward the Holy Grail of incorporating enterprise analytics into the pre-trade decision-making process.

This is precisely the kind of thing we’ll be talking about at our Intelligent Trading Summit in New York on May 13. Indeed, keynotes from Fidessa’s Steve Grob and Tibco’s Alessandro Petroni will offer views on best approaches. And the day’s kickoff panel session will look at the challenges of Implementing an Intelligent Trading Architecture.

This latest use-case example, though, appears to point to even further convergence: a kind of melding of the front- and middle-/back-office data architectures to bring the full might of the bank’s internal intelligence to bear on trading decisions.

While we don’t know the identity of the bank in question, we do know that the enabling data management platform makes use of in-memory database technology from Quartet FS. The implementation of Quartet FS’s ActivePivot analytical and transactional processing system is yielding new insight into key data points, allowing more accurate and timely trade decision-making.

According to Georges Bory, managing director at Quartet FS, the bank’s existing risk management and analytics platforms were being stretched by massive volumes of data – 20 billion data points over a 10-day period – with the result that managers could access only fully aggregated reports generated by inflexible queries. In the wake of trading scandals at the likes of JP Morgan and Societe Generale, the worry was that these reports could be ‘gamed’ by those who understood the parameters of the queries, potentially exposing the bank to fraudulent activity.

With over 20 billion figures processed in a 10-day period, the volumes were high enough to cause difficulties for even the most advanced analytics engines. Market risk analysis in the bank was restricted to what the existing technology could cope with: a fully aggregated and inflexible reporting mechanism that hampered real-time analysis.

According to Quartet FS, the key challenges were:

  • Create a market risk analysis system that not only lets analysts see current aggregated risk in an easily consumable format, but also holds enough granular detail to support identification of anomalies in single data streams.
  • Create a pinpoint/alert mechanism that highlights unplanned events across all metrics, so that a manual search through the data is not needed when a deviation from normal fluctuation limits occurs.
  • Give analysts the ability to drill into the data around an anomaly to look for its origin, without being limited in the scope of the analysis or the level of detail.
  • Introduce the ability to create new metrics on the fly as analysis of anomalies progresses (a plain-Python illustration of these ideas follows this list).
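
To make those requirements concrete, here is a minimal plain-Python sketch of the general pattern – not Quartet FS or ActivePivot code: trade-level records held in memory can be rolled up on demand, drilled into behind any aggregated figure, and extended with a metric defined on the fly. All field names and figures are invented for illustration.

```python
from collections import defaultdict

# Hypothetical trade-level records held in memory; the schema is illustrative only.
trades = [
    {"desk": "rates",  "book": "GBP-swaps", "trade_id": "T1", "notional": 5e6, "pv": 12_500.0},
    {"desk": "rates",  "book": "GBP-swaps", "trade_id": "T2", "notional": 2e6, "pv": -4_300.0},
    {"desk": "rates",  "book": "EUR-swaps", "trade_id": "T3", "notional": 8e6, "pv": 21_000.0},
    {"desk": "credit", "book": "CDS-index", "trade_id": "T4", "notional": 3e6, "pv": 1_900.0},
]

def aggregate(records, level, measure):
    """Roll records up to the requested level ('desk' or 'book') for a given measure."""
    totals = defaultdict(float)
    for r in records:
        totals[r[level]] += measure(r)
    return dict(totals)

def drill_down(records, level, key):
    """Return the underlying trade-level rows behind one aggregated cell."""
    return [r for r in records if r[level] == key]

# The fully aggregated view (where the old reports stopped)...
print(aggregate(trades, "desk", lambda r: r["pv"]))

# ...but the granular rows remain available behind every number.
print(drill_down(trades, "book", "GBP-swaps"))

# A new metric defined on the fly, without reloading or re-querying the data.
pv_per_million = lambda r: r["pv"] / (r["notional"] / 1e6)
print(aggregate(trades, "book", pv_per_million))
```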

The bank’s solution, according to Bory, was to deploy ActivePivot as a pre-aggregation system that uses a scoring system to identify potential problems before they reach existing risk and trade-processing systems. The rules of the scoring system are set by the internal controller on an ad-hoc basis, rather than relying on standard queries, which can be gamed by anyone with knowledge of the system.
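
To illustrate the idea, the sketch below shows what such ad-hoc pre-screening might look like in miniature: a controller registers scoring rules, each incoming trade is scored against all of them, and anything over a threshold is held back as an exception before it reaches downstream systems. The rule names, thresholds and fields are assumptions made for illustration, not details of the bank’s implementation.

```python
# Hypothetical ad-hoc scoring rules a controller might register; names, weights
# and thresholds are invented for illustration.
rules = [
    ("large_notional",     lambda t: 2.0 if t["notional"] > 50e6 else 0.0),
    ("off_hours_booking",  lambda t: 1.5 if t["booked_hour"] < 6 or t["booked_hour"] > 20 else 0.0),
    ("amended_many_times", lambda t: 1.0 if t["amendments"] > 3 else 0.0),
]

def score(trade):
    """Sum the contribution of every active rule for one trade."""
    return sum(weight(trade) for _, weight in rules)

def pre_screen(incoming, threshold=2.0):
    """Split incoming trades into flagged exceptions and a clean pass-through set."""
    flagged, clean = [], []
    for t in incoming:
        (flagged if score(t) >= threshold else clean).append(t)
    return flagged, clean

incoming = [
    {"id": "T9",  "notional": 80e6, "booked_hour": 23, "amendments": 1},
    {"id": "T10", "notional": 1e6,  "booked_hour": 11, "amendments": 0},
]
exceptions, pass_through = pre_screen(incoming)
print(exceptions)    # reviewed by controllers before anything moves downstream
print(pass_through)  # flows on to the existing risk and trade-processing systems
```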

The result is a data analysis and management engine that can present fast-moving streams of data at any level of granularity, enabling a complete view of all positions held, from full portfolio level down to a single transaction. Analysts can also create limit breach alerts to monitor for unexpected events.
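
The alerting side can be sketched just as simply: incremental position updates are applied to in-memory totals, and any book that breaches its configured limit raises an alert. The books, limits and updates below are invented for illustration.

```python
# Minimal sketch of a limit-breach monitor over a stream of position updates;
# limits and book names are illustrative assumptions, not the bank's configuration.
limits = {"GBP-swaps": 10e6, "EUR-swaps": 15e6}

positions = {}
alerts = []

def on_update(book, delta_exposure):
    """Apply one incremental update and raise an alert if the book breaches its limit."""
    positions[book] = positions.get(book, 0.0) + delta_exposure
    limit = limits.get(book)
    if limit is not None and abs(positions[book]) > limit:
        alerts.append((book, positions[book], limit))

stream = [("GBP-swaps", 4e6), ("GBP-swaps", 7e6), ("EUR-swaps", 3e6)]
for book, delta in stream:
    on_update(book, delta)

print(alerts)  # [('GBP-swaps', 11000000.0, 10000000.0)]
```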

The impetus for the project, Bory says, came from board level, although the methodology of the implementation was set by the risk department. The workflow put in place allows the bank to kill several birds with one stone: financial controllers and compliance officers get the validation they need to ensure the bank is playing by the rules set by regulators; and business users get more and higher-quality data for use in their trading decision-making. All exceptions are flagged and dealt with before information is passed along to business users.

Bory reckons ActivePivot’s use of in-memory technology was a key factor in winning the deal. The huge quantities of data involved needed to be applied to complex risk calculations very rapidly.

Traders working on different portfolios may be using different Greeks, or sensitivity analyses, to measure and manage risk, and the bank needed to be sure they were working with consistent data across these different models. Ultimately, the bank decided that alternative solutions like NoSQL weren’t up to the job, given the need to process these calculations as fast as possible.
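
As a rough illustration of why that consistency matters, the toy sketch below computes different sensitivities by bump-and-revalue against a single shared in-memory market snapshot, so every portfolio reads the same inputs. The pricing function, bump sizes and numbers are purely illustrative and bear no relation to the bank’s actual models.

```python
# One shared in-memory market snapshot; every sensitivity below reads the same inputs.
snapshot = {"spot": 100.0, "vol": 0.20, "rate": 0.03}

def toy_price(spot, vol, rate, notional=1e6):
    """Stand-in pricing function; a real book would call its own models."""
    return notional * (0.5 * vol + 0.01 * spot - rate)

def bump_and_revalue(market, field, bump, **trade):
    """Finite-difference sensitivity of toy_price to one market field."""
    bumped = dict(market, **{field: market[field] + bump})
    return (toy_price(**bumped, **trade) - toy_price(**market, **trade)) / bump

# Two portfolios asking for different Greeks, but against the same snapshot.
portfolio_a_delta = bump_and_revalue(snapshot, "spot", 0.01, notional=5e6)
portfolio_b_vega = bump_and_revalue(snapshot, "vol", 0.001, notional=2e6)
print(portfolio_a_delta, portfolio_b_vega)
```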

This all makes great grist for the mill for the debate on May 13. I hope you can join us.

