About a-team Marketing Services
The knowledge platform for the financial technology industry

A-Team Insight Blogs

Learning From Mistakes: Applying Big Data to Realtime Risk Analysis


by Daniel Wakefield, IMGROUP
www.imgroup.com

As the financial sector once again reels under yet more allegations of huge losses, this time at JPMorgan Chase, there is a more than mild sense of déjà vu here at IMGROUP. Yet again, questions are being asked about how one desk, perhaps even one trader, can lose such a significant amount of money without anybody appearing to notice. The jury is still out on whether we are talking about illegal activity and losses being deliberately hidden, or a case of ignorance caused by the complexity of the system. In some ways, however, this is irrelevant. The fact remains that huge losses still seem to be easy to hide, wilfully or not.

The complexity of managing a bank's risk position when handling multiple transactions of this nature is clear, so it is not surprising that the bank was unable to keep track of the risks it was running. What is surprising is that it was allowed to happen. This is where the regulator and the bank have to work together to solve the problem.

It is no coincidence that near real-time risk analytics is one of the fastest-growing areas for IMGROUP. I say 'near' because true real-time risk analytics is virtually impossible given the huge volumes of data, the multiple sources and the speed with which things change. The problem for banks is that while the need for greater transparency and accuracy around trades grows, especially in the area of risk, the amount of data and the number of sources delivering it is growing almost exponentially. Add to this the sheer complexity of the IT infrastructure in a global investment bank and you have a near perfect storm.

The issue of complexity is right at the heart of the problems these banks face. Traditionally, banks have simply layered new systems and applications over old ones. Most are designed in house, and in some cases over half of what exists is redundant. Navigating this complex web of systems to work out which pieces of information are where, which are important and where they need to go is a major headache for CIOs and IT teams across the sector. As regulations demand information on a bank's position ever faster, the strain on the technology becomes immense. Add to this the fact that traders and analysts build their own highly complex technology toolkits, unintelligible to all but the person who designed them, and you can see how billions of pounds can be lost without anyone noticing.

When you break it down, there are essentially four main areas that have to be addressed if situations like the one at JPMorgan are to be prevented:

1. Data management cannot just be about accuracy; it also needs to be about visibility. Who gets to see the information and what they get to see (the history of where it has been, who else has seen it, what comments have been added and, ultimately, what changes have been made) are as important as the information itself. As front and back offices become physically more distributed but, from a business perspective, more closely integrated, how do you make sure that everyone sees what they need to see when they need to see it?

2. The ability to 'connect the dots' across multiple, siloed systems. One of the biggest issues for banks in understanding their exposure to risk is that different products and trading desks operate independently. You cannot have an accurate view across the bank unless you have a system that connects them and analytical tools that give you a view across all of them. This in turn means you need a central platform for risk calculation. The added benefit is that this also reduces the impact on the user and improves both efficiency and agility.

3. Usability is not an afterthought. The effectiveness of any technology comes down to the people who use it. If systems are too complex and hard to use, people won't use them. All too often architects design to their own specifications: they view the system through their own eyes, but what is acceptable to a system architect is usually very different from what a trader, or even an analyst, would happily use on a daily basis.

4. Complexity reduction has to be a priority. For too long banks have added layer upon layer of functionality and applications to an infrastructure already creaking under the strain. This means that all too often the IT department itself has only limited knowledge of what exists and who is using what. In an environment this complex, getting anything done quickly or accurately, let alone both, is very difficult.
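The visibility requirement in point 1 can be illustrated with a minimal sketch: a trade record that carries its own append-only audit trail, so every view and amendment is recorded alongside the data itself. All names here (TradeRecord, AuditEvent, the sample users) are hypothetical, and a real system would persist the trail in a tamper-evident store rather than hold it in memory.

```python
from dataclasses import dataclass, field
from datetime import datetime, timezone

@dataclass
class AuditEvent:
    user: str
    action: str   # e.g. "viewed", "commented", "amended"
    detail: str = ""
    at: datetime = field(default_factory=lambda: datetime.now(timezone.utc))

@dataclass
class TradeRecord:
    trade_id: str
    notional: float
    history: list = field(default_factory=list)  # append-only audit trail

    def view(self, user: str) -> float:
        """Reading the data is itself a recorded event."""
        self.history.append(AuditEvent(user, "viewed"))
        return self.notional

    def amend(self, user: str, new_notional: float, reason: str) -> None:
        """Changes record the before/after values and a reason."""
        self.history.append(AuditEvent(
            user, "amended", f"{self.notional} -> {new_notional}: {reason}"))
        self.notional = new_notional

record = TradeRecord("T-001", 10_000_000.0)
record.view("middle_office_analyst")
record.amend("trader_a", 12_500_000.0, "restructured hedge")
print([(e.user, e.action) for e in record.history])
# prints [('middle_office_analyst', 'viewed'), ('trader_a', 'amended')]
```

The point is not the data structure itself but the discipline: who has seen a number, and how it has changed, travels with the number.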

Banks need to take on these challenges now or JPMorgan won’t be the last to hit the headlines.

