About a-team Marketing Services
The knowledge platform for the financial technology industry

A-Team Insight Blogs

UK Asset Managers Are Aware of Dangers of Not Aggregating Data But Standards Are Lacking, Says SimCorp


The UK asset management community is aware of the dangers posed by failing to adequately consolidate and aggregate their data but underlying problems remain, according to a recent report by risk management platform provider SimCorp and independent consultant Paul Miller. The problems are not to do with technology or organisation, explains Cath Rawcliffe, vice president of sales and marketing at SimCorp, but rather the lack of standardisation in the data space.

SimCorp decided to look into the data aggregation space in response to particular problems being experienced by its clients, says Rawcliffe. “Margins are being squeezed in the asset management sector as assets under management (AUM) and therefore revenues have fallen. Yet requirements for information have increased, especially in the area of risk,” she explains. “At SimCorp, we have encountered prospective customers who expend considerable time and resources aggregating data and who find their processes inflexible – some took literally weeks to determine counterparty exposures after the collapse of Lehman Brothers, for example, yet our own customers, on our single database platform, SimCorp Dimension, had this information in hours. So we thought it would be interesting to try both to understand the issue better and to get a feel for the extent of the problem,” Rawcliffe continues.

According to the respondents to the study, most senior executives at UK asset management firms believe that their ability to undertake new business initiatives is hampered by the need for data aggregation, especially in the area of risk. They therefore recognise the need to pull together data to support key investment, control and reporting functions, but problems remain. Rawcliffe reckons this recognition is an important factor for the future development of the data space. A large number of respondents to the survey have ‘high’ or ‘medium’ levels of aggregation process duplication across business functions. Most said their aggregation processes were ‘somewhat automated’ or better, and nearly all firms indicated that they needed to perform aggregation processes at least daily. “It was a surprise, though, that some firms need to aggregate several times a day,” says Rawcliffe.
“We had to add an extra answer choice of ‘hourly’ to this question after the first two pilot interviews.”

Data warehouses do not seem to have resolved the problem, says Rawcliffe. Whether or not one has been deployed, spreadsheets and small database applications containing ‘private’ stores of data are found everywhere. Moreover, technology and organisation are not the main problems. “Rather, a lack of industry standards – or perhaps more accurately the plethora of so-called ‘standards’ – in the way data is defined, constraints around the timing of the availability of data (for example, output from upstream systems) and dependency on third parties (outsourced service providers and data vendors) are the main sources of difficulty,” she contends.

The results of the study therefore indicate that the majority of firms have a high dependency on spreadsheets and bespoke databases, and even those that have deployed a data warehouse still need to store the results of aggregation processes in multiple locations. The vast majority of respondents also indicated that they feel the number of systems they deploy could be reduced. “This is especially interesting, given our single database approach, in that it also emerged that those with simpler application landscapes tended to find aggregation less onerous,” says Rawcliffe.

Data aggregation systems are proving to be the problem, according to the vendor. “These processes are creating a data straitjacket which stifles business flexibility,” says Rawcliffe. “It is evident that the more complex the operational platform, the more difficult it is to extract useful information from it. Yet even though the vast majority of firms participating in the study said that they could reduce the number of systems they have in use today, many continue to add further complexity in the form of more processes and data stores.
This study indicates that the more this approach is followed, the more difficult it becomes to adapt as business requirements change.” Rawcliffe contends that while these systems appear to solve immediate problems, the continued introduction of new aggregation processes further entangles already complicated operations. “The study indicates that this constitutes a significant operational risk,” she adds.

As well as the management of data, the timing of aggregation processes and reliance on third parties to either perform aggregation functions or deliver data to be processed were seen as causing difficulties. Those firms that do not see aggregation as a major issue have deployed simpler application systems architectures, composed of fewer applications, says SimCorp. Obviously, this supports the vendor’s own approach to the space with regard to its solutions offering.

The vendor held a briefing and roundtable discussion on the morning of 9 July involving some of the people invited to participate in the study, and Rawcliffe indicates that in that group there was consensus that data management is of high and growing importance. “The reasons why no doubt vary by organisation, but in SimCorp’s experience they invariably boil down to mitigating risk, controlling costs and enabling growth,” she says. This means that firms are more willing to spend in the data management space, despite the downturn in the financial markets, an issue that many other vendors have been discussing in recent months. Rawcliffe reckons that the data issue has become much more visible, especially in the area of risk management. “Anything which is likely to have a positive impact on risk mitigation or delivery of risk information, be it operational risk or a market risk like counterparty exposure, should help make a good business case to invest.
Apart from risk, the other key driver at the moment is cost reduction, so likewise anything that has a positive impact on that should be enthusiastically received. As ever, business impact is the key to making the case for investment, and if that impact is in current hotspots then so much the better,” she explains.

With regard to the data aggregation challenges remaining, Rawcliffe indicates that there is still a long way to go. From the study it would seem that the biggest difficulties lie in the areas of the data itself (inconsistency in the way it is represented, how it is stored and where) and in other factors outside the direct control of those responsible for aggregation, she says. These factors include the timing of the availability of data and the necessary involvement of third parties in data delivery. “Generally, factors under the control of those responsible for aggregation, such as organisation, operational processes and technology used, are not reckoned to be main challenges,” she adds.

All of this means that the vendor community is in the throes of change as it strives to meet customers’ needs more effectively. “Vendors need to – and do – respond to market demand by providing solutions to help asset management operations departments deliver information where and when it is needed. However, the solutions vary enormously. Some provide tools to duplicate data and move it around in ever larger quantities ever faster. Others provide tools to collect and collate, slice and dice, store and retrieve. Still others, such as SimCorp, offer more strategic solutions to simplify the asset management operation itself and thus reduce the need to move data around and aggregate it in the first place. All these approaches have a place and all continue to evolve. I expect the landscape in five years’ time will include all these elements,” she concludes.
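The counterparty exposure scenario Rawcliffe describes, where positions scattered across fragmented ‘private’ data stores take weeks to roll up while a consolidated platform answers in hours, can be illustrated with a minimal sketch. The desk names, record layout and figures below are entirely hypothetical and have no relation to SimCorp Dimension’s actual schema; the point is simply that once positions sit in one queryable structure, a per-counterparty roll-up is a single pass over the data.

```python
from collections import defaultdict

# Hypothetical position records as they might sit in separate 'private'
# data stores (desk-level spreadsheets, bespoke databases). Field names
# and amounts are illustrative only.
equity_desk = [
    {"counterparty": "Lehman Brothers", "exposure": 1_200_000},
    {"counterparty": "Bank A", "exposure": 350_000},
]
fixed_income_desk = [
    {"counterparty": "Lehman Brothers", "exposure": 800_000},
    {"counterparty": "Bank B", "exposure": 500_000},
]

def aggregate_exposures(*sources):
    """Consolidate total exposure per counterparty across all sources."""
    totals = defaultdict(int)
    for source in sources:
        for record in source:
            totals[record["counterparty"]] += record["exposure"]
    return dict(totals)

exposures = aggregate_exposures(equity_desk, fixed_income_desk)
print(exposures["Lehman Brothers"])  # prints 2000000
```

The aggregation itself is trivial; the difficulty the study points to lies upstream, in reconciling inconsistently defined records from many sources into a common representation before such a roll-up is even possible.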


Related content


Upcoming Webinar: Best practices for creating an effective data quality control framework

Date: 8 November 2022
Time: 10:00am ET / 3:00pm London / 4:00pm CET
Duration: 50 minutes

Data quality is critical to capital markets processes, from identifying counterparties to building customer relationships, regulatory reporting, and ultimately improving the bottom line. It can also be extremely difficult to achieve. One solution is a data quality control framework...


Know Your Customer Offers Company Data from Local Registries Across 123 Countries

Know Your Customer has released an expanded version of its Know your Customer/Know Your Business (KYC/KYB) solution that covers company data and official incorporation documents from 123 countries worldwide. The data can be consumed using either Know Your Customer’s user interface or a single API. The expanded service provides real-time access to local company registries...


Data Management Summit New York

Now in its 12th year, the Data Management Summit (DMS) in New York brings together the North American capital markets enterprise data management community to explore the evolution of data strategy and how to leverage data to drive compliance and business insight.


Directory of MiFID II Electronic Trading Venues 2018

The inaugural edition of A-Team Group’s Directory of MiFID II Electronic Trading Venues 2018 offers a guide to the European landscape resulting from new market structure introduced by the January 3, 2018 implementation of Markets in Financial Instruments Directive II (MiFID II). The directory provides detailed profiles of more than 70 venue operators and their...