The leading knowledge platform for the financial technology industry

A-Team Insight Blogs

UK Asset Managers Are Aware of Dangers of Not Aggregating Data But Standards Are Lacking, Says SimCorp

The UK asset management community is aware of the dangers posed by failing to adequately consolidate and aggregate their data but underlying problems remain, according to a recent report by risk management platform provider SimCorp and independent consultant Paul Miller. The problems are not to do with technology or organisation, explains Cath Rawcliffe, vice president of sales and marketing at SimCorp, but rather the lack of standardisation in the data space.

SimCorp decided to look into the data aggregation space in response to particular problems being experienced by its clients, says Rawcliffe. “Margins are being squeezed in the asset management sector as assets under management (AUM) and therefore revenues have fallen. Yet requirements for information have increased, especially in the area of risk,” she explains. “At SimCorp, we have encountered prospective customers who expend considerable time and resources aggregating data and who find their processes inflexible – some took literally weeks to determine counterparty exposures after the collapse of Lehman Brothers, for example, yet our own customers, on our single database platform, SimCorp Dimension, had this information in hours. So we thought it would be interesting to try both to understand the issue better and to get a feel for the extent of the problem,” Rawcliffe continues.

According to the respondents to the study, most senior executives at UK asset management firms believe that their ability to undertake new business initiatives is hampered by the need for data aggregation, especially in the area of risk. They therefore recognise the need to pull together data to support key investment, control and reporting functions, but problems remain. Rawcliffe reckons this recognition is an important factor for the future development of the data space.

A large number of respondents to the survey reported ‘high’ or ‘medium’ levels of aggregation process duplication across business functions. Most said their aggregation processes were ‘somewhat automated’ or better, and nearly all firms indicated that they needed to perform aggregation processes at least daily. “It was a surprise, though, that some firms need to aggregate several times a day,” says Rawcliffe. “We had to add an extra answer choice of ‘hourly’ to this question after the first two pilot interviews.”

Data warehouses do not seem to have resolved the problem, says Rawcliffe: whether or not one has been deployed, spreadsheets and small database applications containing ‘private’ stores of data are found everywhere. Moreover, technology and organisation are not the main problems. “Rather, a lack of industry standards – or perhaps more accurately the plethora of so-called ‘standards’ – in the way data is defined, constraints around the timing of the availability of data (for example, output from upstream systems) and dependency on third parties (outsourced service providers and data vendors) are the main sources of difficulty,” she contends.

The results of the study therefore indicate that the majority of firms have a high dependency on spreadsheets and bespoke databases, and even those that have deployed a data warehouse still need to store the results of aggregation processes in multiple locations. The vast majority of respondents also indicated that they feel the number of systems they deploy could be reduced. “This is especially interesting, given our single database approach, in that it also emerged that those with simpler application landscapes tended to find aggregation less onerous,” says Rawcliffe.

Data aggregation systems are proving to be the problem, according to the vendor. “These processes are creating a data straitjacket which stifles business flexibility,” says Rawcliffe. “It is evident that the more complex the operational platform, the more difficult it is to extract useful information from it. Yet even though the vast majority of firms participating in the study said that they could reduce the number of systems they have in use today, many continue to add further complexity in the form of more processes and data stores. This study indicates that the more this approach is followed, the more difficult it becomes to adapt as business requirements change.”

Rawcliffe contends that while these systems appear to solve immediate problems, the continued introduction of new aggregation processes further entangles already complicated operations. “The study indicates that this constitutes a significant operational risk,” she adds. As well as the management of data, the timing of aggregation processes and reliance on third parties to either perform aggregation functions or deliver data to be processed were seen as causing difficulties. Those firms that do not see aggregation as a major issue have deployed simpler application architectures, composed of fewer applications, says SimCorp. Obviously, this supports the vendor’s own approach to the space with regard to its solutions offering.

The vendor held a briefing and roundtable discussion on the morning of 9 July involving some of the people invited to participate in the study, and Rawcliffe indicates that in that group there was consensus that data management is of high and growing importance. “The reasons why no doubt vary by organisation, but in SimCorp’s experience they invariably boil down to mitigating risk, controlling costs and enabling growth,” she says. This means that firms are more willing to spend in the data management space despite the downturn in the financial markets; an issue that many other vendors have been discussing over recent months.

Rawcliffe reckons that the data issue has become much more visible, especially in the area of risk management. “Anything which is likely to have a positive impact on risk mitigation or delivery of risk information, be it operational risk or a market risk like counterparty exposure, should help make a good business case to invest. Apart from risk, the other key driver at the moment is cost reduction, so likewise anything that has a positive impact on that should be enthusiastically received. As ever, business impact is the key to making the case for investment, and if that impact is in current hotspots then so much the better,” she explains.

With regard to the data aggregation challenges remaining, Rawcliffe indicates that there is still a long way to go. From the study it would seem that the biggest difficulties lie in the areas of the data itself (inconsistency in the way it is represented, how it is stored and where) and in other factors outside the direct control of those responsible for aggregation, she says. These factors include the timing of the availability of data and the necessary involvement of third parties in data delivery. “Generally, factors under the control of those responsible for aggregation, such as organisation, operational processes and technology used, are not reckoned to be main challenges,” she adds.

All of this means that the vendor community is in the throes of change as it strives to meet customers’ needs more effectively. “Vendors need to – and do – respond to market demand by providing solutions to help asset management operations departments deliver information where and when it is needed. However, the solutions vary enormously. Some provide tools to duplicate data and move it around in ever larger quantities ever faster. Others provide tools to collect and collate, slice and dice, store and retrieve. Still others, such as SimCorp, offer more strategic solutions to simplify the asset management operation itself and thus reduce the need to move data around and aggregate it in the first place. All these approaches have a place and all continue to evolve. I expect the landscape in five years’ time will include all these elements,” she concludes.
