Learning From Mistakes: Applying Big Data to Realtime Risk Analysis

by Daniel Wakefield, IMGROUP
www.imgroup.com

As the financial sector once again reels under yet more allegations of huge losses (this time at JPMorgan Chase), there is a more than mild sense of déjà vu here at IMGROUP. Yet again, questions are being asked about how one desk, perhaps even one trader, can lose such a significant amount of money without anybody appearing to notice. The jury is still out as to whether we are talking about illegal activity and losses being deliberately hidden, or about ignorance caused by the complexity of the system. In some ways, however, this is irrelevant. The fact remains that huge losses still seem to be easy to hide – wilfully or not.

The huge complexity of managing a bank's risk position when handling multiple transactions of this nature is clear, so it is not surprising that the bank was unable to keep track of the risks it was running. What is surprising is that it was allowed to happen. This is where the regulator and the bank have to work together to solve the problem.

It is no coincidence that near real-time risk analytics is one of the fastest-growing areas for IMGROUP. I say 'near' because true real-time risk analytics is virtually impossible given the huge volumes of data, the multiple sources and the speed with which things change. The problem for banks is that while the need for greater transparency and accuracy around trades grows – especially in the area of risk – the amount of data and the number of sources delivering it are growing almost exponentially. Add to this the sheer complexity of the IT infrastructure in a global investment bank and you have a near-perfect storm.
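To make 'near real-time' concrete: one common pattern is to keep a running exposure figure and adjust it incrementally as trade events arrive in small batches, rather than recomputing the whole book on every update. The following is a minimal sketch of that idea in Python; the event fields, desk names and limit figures are illustrative assumptions, not a description of any bank's actual system.

```python
from collections import defaultdict
from dataclasses import dataclass

@dataclass
class TradeEvent:
    desk: str          # originating trading desk (illustrative field)
    instrument: str    # e.g. an index name or internal identifier
    notional: float    # signed: positive = long, negative = short

class RunningExposure:
    """Maintains bank-wide net exposure incrementally.

    Each micro-batch of trade events adjusts the running totals, so the
    cost of an update is proportional to the batch, not to the whole book.
    """
    def __init__(self):
        self.by_instrument = defaultdict(float)

    def apply_batch(self, events):
        for e in events:
            self.by_instrument[e.instrument] += e.notional

    def breaches(self, limits):
        """Return instruments whose net exposure exceeds the given limit."""
        return {i: v for i, v in self.by_instrument.items()
                if abs(v) > limits.get(i, float("inf"))}

# Illustrative usage: two micro-batches arriving moments apart.
book = RunningExposure()
book.apply_batch([TradeEvent("credit", "CDX.NA.IG", 250e6),
                  TradeEvent("rates", "UST10Y", -40e6)])
book.apply_batch([TradeEvent("credit", "CDX.NA.IG", 300e6)])
print(book.breaches({"CDX.NA.IG": 500e6}))  # {'CDX.NA.IG': 550000000.0}
```

The design point is that each update costs time proportional to the batch rather than the book, which is what makes 'near' real time achievable at these volumes.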

The issue of complexity is right at the heart of the problems these banks face. Traditionally, banks have simply layered new systems and applications over old ones. Most are designed in house and, in some cases, over half of what exists is redundant. Navigating this complex web of systems to understand which pieces of information are where, which are important and where they need to go is a major headache for CIOs and IT teams across the sector. As regulations demand information on a bank's position faster and faster, the strain on the technology becomes immense. Add to this the fact that traders and analysts build their own enormous, highly complex technology toolkits, unintelligible to all but the person who designed them, and you can see how billions of pounds can be lost without anyone noticing.

When you break it down, there are essentially four main areas that have to be addressed if situations like the one at JPMorgan are to be prevented:

1. Data management cannot just be about accuracy; it also needs to be about visibility. Who gets to see the information and what they get to see (the history of where it has been, who else has seen it, what comments and, ultimately, what changes have been made) is as important as the information itself. As front and back offices become physically more distributed but, from a business perspective, more closely integrated, how do you make sure that everyone sees what they need to see, when they need to see it? (A minimal sketch of such an auditable record follows this list.)

2. The ability to 'connect the dots' across multiple, siloed systems. One of the biggest issues for banks when it comes to understanding their exposure to risk is the fact that different products and trading desks operate independently. You cannot have an accurate view across the bank unless you have a system in place that connects them, and the analytical tools to give you a view across all of them. This in turn means you need a central platform for risk calculation (sketched after this list). The added benefit is that this also reduces the impact on the user and improves both efficiency and agility.

3. Usability is not an afterthought. The effectiveness of any technology comes down to the people who use it. If systems are too complex and hard to use, people won't use them. All too often, architects design to their own specifications: they view the system through their own eyes, but what is acceptable to a system architect is usually very different from what a trader, or even an analyst, would happily use on a daily basis.

4. Complexity reduction has to be a priority. For too long, banks have simply added layer upon layer of functionality and applications to an infrastructure already creaking under the strain. This means that all too often the IT department itself has only limited knowledge of what exists and who is using what. In an environment this complex, getting anything done quickly or accurately, let alone both, is very difficult.
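On the first point, the visibility described above amounts to attaching an audit trail to every piece of risk data, so a record carries not just a value but who has seen it, what was said about it and how it has changed. The sketch below shows one minimal way to model this in Python; the field names and action vocabulary are illustrative assumptions, not a prescribed schema.

```python
from dataclasses import dataclass, field
from datetime import datetime, timezone

@dataclass
class AuditEvent:
    user: str       # who touched the record
    action: str     # "viewed", "commented" or "amended" (assumed vocabulary)
    detail: str     # comment text or a description of the change
    at: datetime = field(default_factory=lambda: datetime.now(timezone.utc))

@dataclass
class RiskRecord:
    """A data item that carries its own history alongside its value."""
    key: str
    value: float
    history: list = field(default_factory=list)

    def view(self, user):
        self.history.append(AuditEvent(user, "viewed", ""))
        return self.value

    def amend(self, user, new_value, reason):
        self.history.append(
            AuditEvent(user, "amended", f"{self.value} -> {new_value}: {reason}"))
        self.value = new_value

# Illustrative usage: a daily risk figure viewed, then restated with a reason.
rec = RiskRecord("desk7/var_1d", 12.4e6)
rec.view("risk_analyst")
rec.amend("desk_head", 9.8e6, "late hedge booked against yesterday's position")
for ev in rec.history:
    print(ev.at.isoformat(), ev.user, ev.action, ev.detail)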
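Because every read and write appends to the same history, the 'who saw what, and when' question in point 1 becomes a query over the record itself rather than a forensic exercise after the fact.

On the second point, connecting the dots in practice means normalising feeds from independent desk systems into one common shape before any bank-wide calculation is attempted. The sketch below assumes two hypothetical desk feeds with deliberately different formats; real silo interfaces would be far messier, but the aggregation step is the same.

```python
# Hypothetical feeds from two siloed desk systems, in different shapes.
equities_feed = [{"ticker": "VOD.L", "qty": 1_000_000, "px": 1.72}]
credit_feed = [("CDX.NA.IG", 550e6)]  # (index, net notional)

def normalise_equities(rows):
    # Convert the equity desk's rows into (instrument, exposure) pairs.
    return [(r["ticker"], r["qty"] * r["px"]) for r in rows]

def normalise_credit(rows):
    # The credit feed already carries net notional per instrument.
    return list(rows)

def bankwide_exposure(*feeds):
    """The 'central platform' step: one net view across every silo."""
    total = {}
    for feed in feeds:
        for instrument, exposure in feed:
            total[instrument] = total.get(instrument, 0.0) + exposure
    return total

print(bankwide_exposure(normalise_equities(equities_feed),
                        normalise_credit(credit_feed)))
```

The hard work in reality sits in the normalisation functions, one per silo, but once everything speaks the same language the bank-wide view is a straightforward aggregation.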

Banks need to take on these challenges now, or JPMorgan won't be the last to hit the headlines.
