A-Team Insight Blogs

Learning From Mistakes: Applying Big Data to Realtime Risk Analysis

by Daniel Wakefield, IMGROUP

As the financial sector reels under yet more allegations of huge losses (this time at JPMorgan Chase), there is a more than mild sense of déjà vu here at IMGROUP. Yet again, questions are being asked about how one desk, perhaps even one trader, can lose such a significant amount of money without anybody appearing to notice. The jury is still out as to whether we are talking about illegal activity and losses being deliberately hidden, or whether it is a case of ignorance caused by the complexity of the system. In some ways, however, this is irrelevant. The fact remains that huge losses still seem to be easy to hide – wilfully or not.

The huge complexity in managing a bank’s risk position when handling multiple transactions of this nature is clear, so it is not surprising that the bank was unable to keep track of the risks it was running. It is surprising, however, that it was allowed to happen. This is where the regulator and the bank have to work together to solve the problem.

It is no coincidence that near real-time risk analytics is one of the fastest growing areas for IMGROUP. I say ‘near’ because true real-time risk analytics is virtually impossible due to the huge volumes of data, the multiple sources and the speed with which things change. The problem for banks is that while the need for greater transparency and accuracy around trades grows – especially in the area of risk – the amount of data and the number of sources delivering it are growing almost exponentially. Add to this the sheer complexity of the IT infrastructure in a global investment bank and you have a near perfect storm.

The issue of complexity is right at the heart of the problems that these banks face. Traditionally, banks have simply layered new systems and applications over old ones. Most of them are designed in-house, and in some cases over half of what exists is redundant. Navigating this complex web of systems to try to understand which bits of information are where, which are important and where they need to go is a major headache for CIOs and IT teams across the sector. As regulations demand information on a bank’s position faster and faster, the strain on the technology becomes immense. Add to this the fact that traders and analysts themselves build enormous, highly complex technology toolkits, unintelligible to all but the person who designed them, and you can see how billions of pounds can be lost without anyone noticing.

When you break it down there are essentially four main areas that have to be addressed if situations like the JPMorgan one are to be prevented:

1. Data management cannot just be about accuracy; it also needs to be about visibility. Who gets to see the information and what they get to see (the history of where it has been, who else has seen it, what comments have been added and, ultimately, what changes have been made) are as important as the information itself. As front and back offices become physically more distributed but, from a business perspective, more closely integrated, how do you make sure that everyone sees what they need to see when they need to see it?

2. The ability to ‘connect the dots’ across multiple, siloed systems. One of the biggest issues for banks when it comes to understanding their exposure to risk is the fact that different products and trading desks operate independently. You cannot have an accurate view across the bank unless you have a system in place that connects them, and analytical tools that give you a view across all of them. This in turn means that you have to have a central platform for risk calculation. The added benefit is that this also reduces the impact on the user and improves both efficiency and agility.

3. Usability is not an afterthought. The effectiveness of any technology is down to the people who use it. If systems are too complex or hard to use, people won’t use them. All too often, architects design to their own specifications. They view the system through their own eyes, but what is acceptable to a system architect is usually very different from what a trader, or even an analyst, would happily use on a daily basis.

4. Complexity reduction has to be a priority. For too long, banks have just added layer upon layer of functionality and applications to an infrastructure already creaking under the strain. This means that all too often the IT department itself has only limited knowledge of what exists and who is using what. In an environment this complex, getting anything done quickly or accurately, let alone both together, is very difficult.
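To make the visibility point concrete, here is a minimal sketch of a data item that carries its own audit trail, recording who read or changed it and when. Everything here is illustrative – the class name, the users and the figures are hypothetical, not any particular bank's system:

```python
from dataclasses import dataclass, field
from datetime import datetime, timezone

@dataclass
class AuditedValue:
    """A value that records every read and update against it."""
    value: float
    history: list = field(default_factory=list)

    def read(self, user: str) -> float:
        # Log who looked at the figure, and what they saw.
        self.history.append((datetime.now(timezone.utc), user, "read", self.value))
        return self.value

    def update(self, user: str, new_value: float) -> None:
        # Log who changed the figure, and what it was changed to.
        self.history.append((datetime.now(timezone.utc), user, "update", new_value))
        self.value = new_value

pnl = AuditedValue(-120_000.0)
pnl.read("risk_analyst")
pnl.update("trader_x", -95_000.0)
# pnl.history now shows who viewed and who changed the figure, and when.
```

With history attached to the data itself, the questions in point 1 – who has seen it, what changes have been made – can be answered from the item rather than reconstructed from scattered system logs.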
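The ‘connect the dots’ point can also be sketched in miniature. Assuming each siloed desk system can export its positions as simple records in a common currency (the desk names, instruments and notionals below are invented for illustration), the core of a central risk platform reduces to normalising those feeds and netting them into one firm-wide view:

```python
from collections import defaultdict

# Hypothetical exports from three independent desk systems:
# (instrument, signed notional in a common currency).
desk_feeds = {
    "credit_desk":   [("CDX.IG", 5_000_000), ("CDX.HY", -2_000_000)],
    "equities_desk": [("CDX.IG", 1_500_000), ("SPX.FUT", 3_000_000)],
    "rates_desk":    [("CDX.IG", -500_000)],
}

def aggregate_exposure(feeds: dict) -> dict:
    """Net exposure per instrument across all desk silos."""
    totals = defaultdict(float)
    for desk, positions in feeds.items():
        for instrument, notional in positions:
            totals[instrument] += notional
    return dict(totals)

firmwide = aggregate_exposure(desk_feeds)
# No single desk sees the full CDX.IG position; only the aggregated
# view reveals the net 6,000,000 exposure across the bank.
```

The hard part in practice is not the aggregation but the normalisation – getting every silo to export positions in a shared format and currency – which is exactly why a central platform, rather than desk-by-desk reporting, is needed.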

Banks need to take on these challenges now, or JPMorgan won’t be the last to hit the headlines.
