About a-team Marketing Services
The knowledge platform for the financial technology industry

A-Team Insight Blogs

UK Asset Managers Are Aware of Dangers of Not Aggregating Data But Standards Are Lacking, Says SimCorp


The UK asset management community is aware of the dangers posed by failing to adequately consolidate and aggregate their data but underlying problems remain, according to a recent report by risk management platform provider SimCorp and independent consultant Paul Miller. The problems are not to do with technology or organisation, explains Cath Rawcliffe, vice president of sales and marketing at SimCorp, but rather the lack of standardisation in the data space.

SimCorp decided to look into the data aggregation space in response to particular problems being experienced by its clients, says Rawcliffe. “Margins are being squeezed in the asset management sector as assets under management (AUM) and therefore revenues have fallen. Yet requirements for information have increased, especially in the area of risk,” she explains. “At SimCorp, we have encountered prospective customers who expend considerable time and resources aggregating data and who find their processes inflexible – some took literally weeks to determine counterparty exposures after the collapse of Lehman Brothers, for example, yet our own customers, on our single database platform, SimCorp Dimension, had this information in hours. So we thought it would be interesting to try both to understand the issue better and to get a feel for the extent of the problem,” Rawcliffe continues.

According to the respondents to the study, most senior executives at UK asset management firms believe that their ability to undertake new business initiatives is hampered by the need for data aggregation, especially in the area of risk. They therefore recognise the need to pull together data to support key investment, control and reporting functions, but problems remain. Rawcliffe reckons this recognition is an important factor for the future development of the data space.

A large number of respondents to the survey reported ‘high’ or ‘medium’ levels of aggregation process duplication across business functions. Most said their aggregation processes were ‘somewhat automated’ or better, and nearly all firms indicated that they needed to perform aggregation processes at least daily. “It was a surprise, though, that some firms need to aggregate several times a day,” says Rawcliffe.
“We had to add an extra answer choice of ‘hourly’ to this question after the first two pilot interviews.”

Data warehouses do not seem to have resolved the problem, says Rawcliffe: whether or not one has been deployed, spreadsheets and small database applications containing ‘private’ stores of data are found everywhere. Moreover, technology and organisation are not the main problems. “Rather, a lack of industry standards – or perhaps more accurately the plethora of so-called ‘standards’ – in the way data is defined, constraints around the timing of the availability of data (for example, output from upstream systems) and dependency on third parties (outsourced service providers and data vendors) are the main sources of difficulty,” she contends.

The results of the study therefore indicate that the majority of firms have a high dependency on spreadsheets and bespoke databases, and even those that have deployed a data warehouse still need to store the results of aggregation processes in multiple locations. The vast majority of respondents also indicated that they feel the number of systems they deploy could be reduced. “This is especially interesting, given our single database approach, in that it also emerged that those with simpler application landscapes tended to find aggregation less onerous,” says Rawcliffe.

Data aggregation systems are proving to be the problem, according to the vendor. “These processes are creating a data straitjacket which stifles business flexibility,” says Rawcliffe. “It is evident that the more complex the operational platform, the more difficult it is to extract useful information from it. Yet even though the vast majority of firms participating in the study said that they could reduce the number of systems they have in use today, many continue to add further complexity in the form of more processes and data stores.
This study indicates that the more this approach is followed, the more difficult it becomes to adapt as business requirements change.” Rawcliffe contends that while these systems appear to solve immediate problems, the continued introduction of new aggregation processes further entangles already complicated operations. “The study indicates that this constitutes a significant operational risk,” she adds.

As well as the management of data itself, the timing of aggregation processes and reliance on third parties – either to perform aggregation functions or to deliver data to be processed – were seen as causing difficulties. Those firms that do not see aggregation as a major issue have deployed simpler application architectures composed of fewer applications, says SimCorp. Obviously, this supports the vendor’s own approach to the space with regard to its solutions offering.

The vendor held a briefing and roundtable discussion on the morning of 9 July involving some of the people invited to participate in the study, and Rawcliffe indicates that in that group there was consensus that data management is of high and growing importance. “The reasons why no doubt vary by organisation, but in SimCorp’s experience they invariably boil down to mitigating risk, controlling costs and enabling growth,” she says. This means that firms are more willing to spend in the data management space, despite the downturn in the financial markets – an issue that many other vendors have been discussing over recent months. Rawcliffe reckons that the data issue has become much more visible, especially in the area of risk management. “Anything which is likely to have a positive impact on risk mitigation or delivery of risk information, be it operational risk or a market risk like counterparty exposure, should help make a good business case to invest.
Apart from risk, the other key driver at the moment is cost reduction, so likewise anything that has a positive impact on that should be enthusiastically received. As ever, business impact is the key to making the case for investment, and if that impact is in current hotspots then so much the better,” she explains.

With regard to the data aggregation challenges remaining, Rawcliffe indicates that there is still a long way to go. From the study, it would seem that the biggest difficulties lie in the areas of the data itself (inconsistency in the way it is represented, how it is stored and where) and in other factors outside the direct control of those responsible for aggregation, she says. These factors include the timing of the availability of data and the necessary involvement of third parties in data delivery. “Generally, factors under the control of those responsible for aggregation, such as organisation, operational processes and the technology used, are not reckoned to be the main challenges,” she adds.

All of this means that the vendor community is in the throes of change as it strives to meet customers’ needs more effectively. “Vendors need to – and do – respond to market demand by providing solutions to help asset management operations departments deliver information where and when it is needed. However, the solutions vary enormously. Some provide tools to duplicate data and move it around in ever larger quantities ever faster. Others provide tools to collect and collate, slice and dice, store and retrieve. Still others, such as SimCorp, offer more strategic solutions to simplify the asset management operation itself and thus reduce the need to move data around and aggregate it in the first place. All these approaches have a place and all continue to evolve. I expect the landscape in five years’ time will include all these elements,” she concludes.
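The counterparty exposure scenario described above can be sketched in miniature. The following is a hypothetical Python illustration (the stores, names and figures are invented, and nothing here reflects SimCorp Dimension itself) of why fragmented data stores with no shared identifier standard make even a simple aggregation question laborious: before any totals can be computed, someone must maintain a manual mapping between the different ways each store names the same counterparty.

```python
# Hypothetical illustration: exposure figures scattered across 'private'
# stores (plain dicts standing in for spreadsheets and small databases),
# each using its own naming convention for the same counterparties.
equity_sheet = {"Lehman Bros": 1_200_000, "Barclays PLC": 800_000}
fx_database = {"LEHMAN BROTHERS INC": 450_000, "BARCLAYS": 300_000}
derivatives_log = {"Lehman Brothers": 2_100_000}

# With no industry-standard identifier, a hand-maintained alias map is
# needed -- the fragile glue the study says firms depend on.
ALIASES = {
    "Lehman Bros": "LEHMAN",
    "LEHMAN BROTHERS INC": "LEHMAN",
    "Lehman Brothers": "LEHMAN",
    "Barclays PLC": "BARCLAYS",
    "BARCLAYS": "BARCLAYS",
}

def total_exposure(counterparty: str) -> int:
    """Aggregate exposure to one counterparty across every store."""
    total = 0
    for store in (equity_sheet, fx_database, derivatives_log):
        for name, amount in store.items():
            if ALIASES.get(name) == counterparty:
                total += amount
    return total

print(total_exposure("LEHMAN"))    # 3750000
print(total_exposure("BARCLAYS"))  # 1100000
```

In a consolidated single-database model, the alias map and the per-store loop disappear: every record carries one canonical identifier, so the same question becomes a single lookup, which is the flexibility argument the report makes.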

Subscribe to our newsletter

Related content


Recorded Webinar: How to automate entity data management and due diligence to ensure efficiency, accuracy and compliance

Requesting, gathering, analysing and monitoring customer, vendor and partner entity data is time consuming, and often a tedious manual process. This can slow down customer relationships and expose financial institutions to risk from inaccurate, incomplete or outdated data – but there are solutions to these problems. This webinar will consider the challenges of sourcing and...


Understanding the Value of Global Identifiers in the Fight Against Financial Crime

By Clare Rowley, Head of Business Operations, GLEIF. Money laundering and terrorist financing create significant systemic risks in the global financial system. The intricate webs spun by fraudsters and criminals to evade detection crisscross national borders and legal jurisdictions, commonly exploiting multiple financial institutions and legal entities. In today’s instant digital economy, this is exposing...


AI in Capital Markets Summit London

The AI in Capital Markets Summit will explore current and emerging trends in AI, the potential of Generative AI and LLMs and how AI can be applied for efficiencies and business value across a number of use cases, in the front and back office of financial institutions. The agenda will explore the risks and challenges of adopting AI and the foundational technologies and data management capabilities that underpin successful deployment.


Regulatory Data Handbook – Third Edition

Need to know all the essentials about the regulations impacting data management? Welcome to the third edition of our A-Team Regulatory Data Handbook which provides all the essentials about regulations impacting data management. A-Team’s series of Regulatory Data Handbooks are a great way to see at-a-glance: All the regulations that are impacting data management today...