You don’t have to scratch far below the surface of the artificial intelligence hype machine to see that many financial institutions are struggling to implement the technology.
Our own Data Management Insight annual January preview of predictions for the year ahead found that vendors and users alike reported a dawning realisation that, for all its promise, AI is not going to be easy to get right. Anecdotal evidence suggests organisations are finding the cost-to-benefit ratio of integration higher than expected and the expertise required to make a success of AI difficult to find.
The biggest challenge they have found, however, is getting their data foundations right to accommodate AI. Putting a sticking plaster on shortcomings in their existing technology stacks and data pipelines is not doing the trick, observers have noted.
Among them is Marc Schröter, chief product officer at SimCorp, which provides investment management technology and data solutions for buy-side firms. According to Denmark-based Schröter, firms are facing pressure from three sources: the need to reduce costs in an increasingly competitive and volatile climate; the need to deliver operational efficiencies; and the imperative to innovate, which is shorthand for improving the effective use of data.
Whole-Structure Review
While that’s encouraged firms to look at outsourcing and consolidation, they haven’t been wholly successful in improving their AI propositions.
“Firms are finding that they are not super well equipped for that,” Schröter tells Data Management Insight. “Of course, some have consolidated their technology stack over the past few years, but it really means that people are taking a step back and embarking on a complete review of their operating model.
“They’re saying, ‘okay, yes, we can do some things, but really, if we want to do all these things, we probably need to review our whole structure. What do we do ourselves? What do we outsource? What does our system landscape look like? But also, how do we get access to the data we need to make better decisions and to create AI models?’”
This holistic approach to AI is particularly important for buy-side firms because of their distinct technology needs and operational challenges.
Volatility has encouraged investors that once focussed on public markets to seek out new and exotic allocations for their capital. Most notably, that has meant a shift in portfolios towards private and alternative markets. Preqin estimates that global investment in private markets could reach $24.5 trillion by 2028.
This has required a rethink of their data strategies, which had hitherto been centred on structured sources that were relatively easy to ingest and manage. Now, they are having to find new ways to pull information from unstructured sources and marry it with their existing data estates.
“We are seeing the rise of private market investments,” says Schröter. “It used to be maybe a niche focus or you had a few products, but now it’s really getting to a size where you have public and you have private market systems and data, and they are largely in separate silos.
“That means that if you want to look across your total portfolio, it’s super difficult to get the data out,” he adds. “If you want to do something more specific with the data, you can try to consolidate and so on, but if you want to do more active scenario management, risk management, and so on, that is a challenge.”
AI Understanding
These challenges were highlighted in a survey conducted by SimCorp, which found that while three-quarters of respondents said they understood the potential benefits of AI, they needed more information to integrate it into their systems.
The use cases they identified as most likely to benefit from the technology are investment analysis, decision making, risk management, data management and client engagement. In response, 67 per cent of the 200 companies questioned said they planned to build more standardised data models and 65 per cent said they would consolidate systems into a common data layer to overcome challenges with their data infrastructure.
Two-fifths said they intended to improve data and operations for multi-asset investment strategies, which chimed with three-fifths of respondents citing the inability to manage multiple assets in one view as their biggest front-office challenge.
Schröter said the challenges are most acute in companies with a fragmented data foundation. This may have resulted from the acquisition of companies that used different data architectures and were never subsequently consolidated. Data silos could also have formed as the buy-side expanded into multiple new asset classes.
Alternatively, adds Schröter, it could have resulted from outsourcing of specific processes and their accompanying datasets.
“A lot of companies find themselves in a situation where they have outsourced part of their business, and this has meant that it’s not just the process but also the data that they have handed to these service providers, meaning they oftentimes need to mirror part of it back into their own systems for the necessary oversight and control,” he says. “If they want to have access to this data when they want to build AI models, it’s not ideal: it’s a break in the data value chain.”
Schröter is quick to extol the benefits of outsourcing as a means of letting buy-side firms focus on what they do best: investing. However, they need to do it with a broad view of their data estate.
“People are now looking to see if they can holistically look at this across all asset classes, all of their data,” he says. “That means that customers can focus on the activities where they can differentiate, such as client services, investments and risk management. Combined with the cost pressure we see in the industry, people are just looking to see what and where they can add value.”