A-Team Insight Blogs


Wading through the hysteria to find an intelligent approach to AI

By: Andrew Kouloumbrides, CEO of Xceptor.

Having begun life in sci-fi novels as shorthand for either a utopian dream or a dystopian nightmare, artificial intelligence (AI) is now well and truly finding practical applications and shows no signs of abating. Indeed, the global AI market is expected to reach almost $60 billion by 2025, a stark contrast to 2016, when it was valued at $1.4 billion.

But how much of the AI hype is still more fiction than fact? Or, put another way, can it be used to create tangible, effective change rather than AI for AI’s sake? This article explores how and where to implement AI most effectively with regard to data automation, why it needn’t mean ripping and replacing trusted legacy systems, and why the ability to apply AI to the right operations will be essential for industry success.

Growing a digital backbone

Without doubt, the starting point for any AI-led initiative – or any technology project, for that matter – should be the lifeblood of any firm: the data. With firms facing a data deluge, the first step is data assimilation, where data is collected, extracted, transformed and normalised, then pushed into whatever technology completes the process. Many firms take a technology-first approach at this stage, which almost always overcomplicates matters, building up even more technical debt and costly overhead.
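
The assimilation steps described above (collect, extract, transform, normalise, then push) can be sketched as a simple pipeline. A minimal illustration in Python, assuming hypothetical field names and a generic push target rather than any particular vendor’s API:

```python
# Illustrative data-assimilation pipeline: extract, transform and
# normalise records from heterogeneous sources before handing them
# to downstream technology. Field names are hypothetical examples.

def extract(raw_rows):
    """Pull the fields of interest out of differently shaped raw rows."""
    return [{"id": r.get("ID") or r.get("id"),
             "amount": r.get("Amount") or r.get("amt")}
            for r in raw_rows]

def transform(rows):
    """Coerce values into consistent types."""
    return [{"id": str(r["id"]).strip(), "amount": float(r["amount"])}
            for r in rows]

def normalise(rows):
    """Apply a common convention, here a stable ordering by identifier."""
    return sorted(rows, key=lambda r: r["id"])

def assimilate(raw_rows, push):
    """Run the full pipeline, then push each record downstream."""
    for row in normalise(transform(extract(raw_rows))):
        push(row)

# Example: two source systems naming the same fields differently.
raw = [{"ID": "T-2", "Amount": "100.5"}, {"id": " T-1", "amt": 42}]
sink = []
assimilate(raw, sink.append)
```

Because the push target is just a callable, the same assimilation step can feed any downstream technology, which is the essence of deciding on the data first and the tooling second.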

By taking a data-first approach, firms can redefine their processes, not just replicate them. Only then can it be decided which technology is best deployed, and when. This is truly the key to getting the highest return on digital investment and discovering where true value-add AI can be implemented.

Integrated automation in action

While 90-95% of automation can be achieved through a rules-based approach, the real ROI lies in automating complex processes, so the remaining 5-10% cannot be forgotten. Yet many so-called automation technologies forget exactly that, unable to process PDFs, apply OCR, or scrape data from client websites, for example. Enter integrated automation.

For unstructured data – which makes up over 80% of all organisational data – the ability to extract any type of data becomes the key to success. Firms can then deploy traditional rules, machine learning, natural language processing or a hybrid, whichever best suits the redefined process. What is paramount is that each business can choose what works best for it. Ideally this shouldn’t be a complex, drawn-out process needing deep involvement from data scientists or IT departments; simplicity is the watchword here. End users should have the power to direct these decisions, as they are on the front line and know first-hand what is needed and when.
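
The rules-or-model choice can be expressed as a simple dispatch: apply deterministic rules where they match, and fall back to a statistical classifier for the remainder. A minimal sketch, assuming hypothetical regex rules and a keyword heuristic standing in for a real NLP model:

```python
import re

# Hybrid classification sketch: deterministic rules first, with a
# statistical fall-back for text the rules cannot handle. The rule
# patterns and the toy fall-back are illustrative only.

RULES = {
    "settlement_instruction": re.compile(r"\bSSI\b|\bsettle\b", re.I),
    "margin_call": re.compile(r"\bmargin call\b", re.I),
}

def classify_by_rules(text):
    """Return the first rule label that matches, or None."""
    for label, pattern in RULES.items():
        if pattern.search(text):
            return label
    return None

def classify_by_model(text):
    """Stand-in for an NLP model; a real deployment would call one."""
    return "netting" if "netting" in text.lower() else "unknown"

def classify(text):
    """Rules where possible; model-based fall-back otherwise."""
    return classify_by_rules(text) or classify_by_model(text)
```

The dispatch keeps the cheap, auditable rules path in front, which is typically what lets business users, rather than data scientists, own most of the configuration.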

Fraudsters beware

AI’s ability to constantly learn and improve makes it ideally suited to fraud prevention. This will undoubtedly be a point of interest in an industry where, in the UK alone, criminals stole £1.2 billion through fraud and scams in 2018. Firms must be increasingly cognisant of fraudsters relentlessly attempting to beat their systems by any means possible.

Many firms will already have a host of flexible, configurable rules in place to deal with this; however, overlaying machine learning, or running it side by side, can improve detection rates significantly. This is particularly true where data sets are large and separating genuinely fraudulent payments from false positives demands a greater level of sophistication.

Business rules can produce payment scores, improve fraud detection and limit the number of fraudulent payments. Scoring rules can be refined by the payments team, enabling the bank to respond quickly to new types of fraudulent activity. By combining machine learning with this rules-based approach, both the level of automation and the detection rates for potentially fraudulent payments can be extended further.
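
The combination described can be sketched as a weighted blend of a rules-based score and an overlaid model score. All rules, weights and thresholds below are hypothetical, and the model is a stand-in for a trained classifier:

```python
# Fraud-scoring sketch: blend a deterministic rules score with an
# overlaid model score. Rules, weights and thresholds are
# hypothetical illustrations, not a production configuration.

def rule_score(payment):
    """Business rules contribute a deterministic payment score."""
    score = 0.0
    if payment["amount"] > 10_000:
        score += 0.4
    if payment["new_beneficiary"]:
        score += 0.3
    if payment["country"] in {"XX", "YY"}:  # placeholder watch-list
        score += 0.3
    return score

def model_score(payment):
    """Stand-in for a trained model; a real system would call one."""
    if payment["amount"] > 50_000 and payment["new_beneficiary"]:
        return 0.8
    return 0.1

def blended_score(payment, rule_weight=0.6):
    """Overlay the model on the rules as a weighted combination."""
    return (rule_weight * rule_score(payment)
            + (1 - rule_weight) * model_score(payment))

def flag(payment, threshold=0.5):
    """Flag a payment for review when the blended score is high."""
    return blended_score(payment) >= threshold
```

Keeping the rule weight adjustable lets the payments team retain deterministic, explainable control while the model refines borderline cases.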

Mass email liberation

Another attention-grabber surrounding AI is how it can automate what was previously perceived to be achievable only manually. This will certainly be relevant to those involved in mass email processing. Many banks receive 3,000-5,000 emails every day, many of which contain margin calls, netting and standard settlement instructions. The current solution of hiring staff to review each one manually – including hard-to-decipher, easily misinterpreted shorthand comments that need to be read and then pushed to the relevant team to action – is simply not scalable.

The risk of multiple errors is significant, and valuable staff cannot be deployed on higher-value activities. With every division in banks under pressure to reduce costs and generate revenue, and regulatory initiatives such as IBOR reform, Brexit and SFTR piling on further pressure, the momentum to speed up the settlement process is relentless.

With the right AI logic behind them, algorithms can be trained to assess the intent of each email, classify it, allocate the right priority, and send it to be actioned by the right team. This is a powerful example of how operational staff roles can change as AI replaces one mundane aspect of the job with a higher-skilled one, with staff helping to train new models – augmenting, rather than replacing, existing processes.
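
The assess, classify, prioritise and route flow can be sketched end to end. The intents, keywords, priorities and team names below are assumptions for illustration; a production system would use a trained model for the intent step:

```python
# Email triage sketch: infer intent, assign a priority and route to
# a team. Keywords, priorities and team names are hypothetical.

INTENTS = {
    "margin_call": (["margin call"], 1, "collateral-team"),
    "settlement": (["ssi", "settlement instruction"], 2, "settlements-team"),
    "netting": (["netting"], 3, "netting-team"),
}

def triage(email_body):
    """Return (intent, priority, team) for an incoming email."""
    text = email_body.lower()
    for intent, (keywords, priority, team) in INTENTS.items():
        if any(keyword in text for keyword in keywords):
            return intent, priority, team
    return "other", 9, "manual-review"
```

Emails the triage cannot classify fall through to manual review, which is exactly where staff add value and where their corrections can feed back into model training.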

Flexibility not profligacy

Technological advancements in this field are evident, and it is easy to get caught up in AI mania when it pervades every media outlet you turn to. However, it is simply not feasible to rip out the minefield of incumbent systems that are embedded in, and essential to, running a bank.

Nor is it a prudent exercise. To quote McKinsey: “[Re]building systems in house proves to be a difficult, resource-draining task that often costs organizations from €50 million to €300 million”.* Therefore, in order to unlock near-term ROI and new cost structures from a digital infrastructure, any AI solution selected must be flexible enough to either extend the life of, or straddle, existing technologies.

Pragmatism for the path ahead

AI is by no means a new phenomenon, but as the convergence of new technologies such as cloud computing and big data presents new applications for it, the question now is how and where to implement it most effectively. As the financial services industry continues to evolve and new pressures – the threat of cyber-attack, new regulations, challenger institutions – are put upon firms, the ability to apply AI to the right operations will be essential for industry success.

This requires a well-thought-out plan for accessing and leveraging existing or new data assets; the best returns are driven by a clear definition of the problem and the desired solution. What is evident is that a pragmatic view is always best when choosing the right path and the right partner to guide you into the future.


*McKinsey Digital ‘Overhauling banks’ IT systems’
