About a-team Marketing Services
The knowledge platform for the financial technology industry

A-Team Insight Blogs

Learning From Mistakes: Applying Big Data to Realtime Risk Analysis

by Daniel Wakefield, IMGROUP
www.imgroup.com

As the financial sector once again reels under yet more allegations of huge losses (this time at JPMorgan Chase), there is a more than mild sense of déjà vu here at IMGROUP. Yet again, questions are being asked about how one desk, perhaps even one trader, can lose such a significant amount of money without anybody appearing to notice. The jury is still out on whether we are talking about illegal activity and deliberately hidden losses, or a case of ignorance caused by the complexity of the system. In some ways, however, this is irrelevant. The fact remains that huge losses still seem to be easy to hide – wilfully or not.

The complexity of managing a bank’s risk position across multiple transactions of this nature is clear, so it is not surprising that the bank was unable to keep track of the risks it was running. It is surprising, however, that it was allowed to happen. This is where the regulator and the bank have to work together to solve the problem.

It is no coincidence that near real-time risk analytics is one of the fastest-growing areas for IMGROUP. I say ‘near’ because true real-time risk analytics is virtually impossible, given the huge volumes of data, the multiple sources and the speed with which things change. The problem for banks is that while the need for greater transparency and accuracy around trades grows – especially in the area of risk – the amount of data and the number of sources delivering it are growing almost exponentially. Add to this the sheer complexity of the IT infrastructure in a global investment bank and you have a near-perfect storm.

The issue of complexity is right at the heart of the problems that these banks face. Traditionally, banks have simply layered new systems and applications over old ones. Most are designed in-house, and in some cases over half of what exists is redundant. Navigating this complex web of systems to try to understand which bits of information are where, which are important and where they need to go is a major headache for CIOs and IT teams across the sector. As regulations demand information on a bank’s position faster and faster, the strain on the technology becomes immense. Add to this the fact that traders and analysts build their own enormous and highly complex technology toolkits, unintelligible to all but the person who designed them, and you can see how billions of pounds can be lost with no one noticing.

When you break it down, there are essentially four main areas that have to be addressed if situations like the one at JPMorgan are to be prevented:

1. Data management cannot just be about accuracy; it also needs to be about visibility.  Who gets to see the information, and what they get to see (the history of where it has been, who else has seen it, what comments have been added and, ultimately, what changes have been made), is as important as the information itself.  As front and back offices become physically more distributed but, from a business perspective, more closely integrated, how do you make sure that everyone sees what they need to see, when they need to see it?
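The visibility described in point 1 amounts to keeping an audit trail alongside each piece of data. As a minimal sketch – the `TradeRecord` and `AuditEvent` names, fields and actions are purely illustrative, not any particular bank’s data model – each record could carry its own history of who viewed, commented on or amended it:

```python
from dataclasses import dataclass, field
from datetime import datetime, timezone

@dataclass
class AuditEvent:
    """One entry in a record's history: who did what, and when."""
    user: str
    action: str          # e.g. "viewed", "commented", "amended"
    detail: str = ""
    timestamp: datetime = field(
        default_factory=lambda: datetime.now(timezone.utc))

@dataclass
class TradeRecord:
    """A trade plus the full lineage of who has touched it."""
    trade_id: str
    notional: float
    history: list = field(default_factory=list)

    def record(self, user: str, action: str, detail: str = "") -> None:
        # Append-only: the history is never overwritten, only extended.
        self.history.append(AuditEvent(user, action, detail))

    def seen_by(self) -> set:
        return {e.user for e in self.history if e.action == "viewed"}

    def amendments(self) -> list:
        return [e for e in self.history if e.action == "amended"]

# Demonstration: the record itself can answer "who has seen this,
# and what has changed?" without consulting a separate system.
trade = TradeRecord("T-1001", 5_000_000.0)
trade.record("alice", "viewed")
trade.record("bob", "amended", "notional corrected")
```

The key design choice is that the lineage travels with the record, so the questions raised above – who has seen it, and what changes have been made – can be answered from the data itself.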

2. The ability to ‘connect the dots’ across multiple, siloed systems.  One of the biggest issues for banks in understanding their exposure to risk is that different products and trading desks operate independently.  You cannot have an accurate view across the bank unless you have a system in place that connects them, and the analytical tools to give you a view across all of them.  This in turn means you have to have a central platform for risk calculation. The added benefit is that this also reduces the impact on the user and improves both efficiency and agility.
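The value of a central risk platform is easiest to see in miniature. In the hypothetical sketch below – the desk names, products and exposure figures are invented for illustration, and real desk systems would be queried through their own interfaces rather than an in-memory dict – the same product held on two desks only shows its true combined exposure once the silos are rolled up:

```python
from collections import defaultdict

# Hypothetical per-desk feeds: each desk reports (product, exposure)
# pairs from its own siloed system.
desk_feeds = {
    "rates":    [("IR_SWAP", 120.0), ("BOND_FUT", -35.0)],
    "credit":   [("CDS_INDEX", 80.0), ("IR_SWAP", 40.0)],
    "equities": [("EQ_OPT", -15.0)],
}

def aggregate_exposure(feeds: dict) -> dict:
    """Roll siloed desk positions up into one firm-wide view."""
    total = defaultdict(float)
    for desk, positions in feeds.items():
        for product, exposure in positions:
            total[product] += exposure
    return dict(total)

firm_view = aggregate_exposure(desk_feeds)
# IR_SWAP is held on both the rates and credit desks; neither desk
# alone sees the combined 160.0 exposure – only the central view does.
```

No individual desk feed contains the firm-wide IR_SWAP number; it only exists once a central platform connects the silos, which is precisely the point above.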

3. Usability is not an afterthought.  The effectiveness of any technology comes down to the people who use it.  If systems are too complex and hard to use, people won’t use them.  All too often, architects design to their own specifications: they view the system through their own eyes, but what is acceptable to a system architect is usually very different from what a trader, or even an analyst, would happily use on a daily basis.

4. Complexity reduction has to be a priority.  For too long, banks have simply added layer upon layer of functionality and applications to an infrastructure already creaking under the strain.  This means that all too often the IT department itself has only limited knowledge of what exists and who is using what.  In an environment this complex, getting anything done quickly or accurately – let alone both – is very difficult.

Banks need to take on these challenges now, or JPMorgan won’t be the last to hit the headlines.
