Alveo, a provider of market and reference data integration and analytics solutions for financial services firms, has designed its technology to optimise data flows for business user self-service. The company provides cloud-native data aggregation and data quality solutions that give clients access to trusted data while maximising their return on investment.
Martijn Groot, vice president of marketing and strategy at Alveo, says the current focus is on refining the company’s delivery mechanism based on a managed services model that extends its scope beyond data management.
“We are doing significantly more data provisioning into reports and business applications for users such as analysts and quants,” he says. “There is much more focus on the uninterrupted wiring of the data, cutting the cost of last-mile integration and providing out-of-the-box analytics. The technology has been completely refreshed over the past few years using open source components wherever possible.”
He acknowledges that acquiring a managed services mindset has been a challenging process. Five years ago the company was focused on building technology; now its default engagement model is managed services, including hosting and application management, which means monitoring applications and fixing elements that are not working well.
“In our services organisation we have gone through quite a shift, from a professional services approach to a situation where we are responsible for the day-to-day operation of the solution, and the conversation is less about feature functionality and more about APIs and SLAs,” says Groot.
Alveo works with a range of institutions including asset managers, banks (mostly wholesale banks), insurance companies, market infrastructure firms such as exchanges and clearing houses, and hedge funds.
In some cases, clients are looking to streamline back-office operations. Buy-side and active manager clients may be motivated by the need to increase their capability to onboard, prepare and process data sets, and provision their data scientists.
Groot observes that regulatory requirements, particularly Basel IV, remain a major demand driver. On the ESG side, the Sustainable Finance Disclosure Regulation (SFDR) requires firms to ensure they have all the data they need to publish their first principal adverse impact (PAI) statement in June 2022 and to cover the first PAI reference period from June 2023, when financial market participants must report PAI indicators for their portfolios for the first time.
“Many firms will have to acquire additional data sets and will need some sort of internal data hub to onboard this data and add their own judgement to it,” says Groot. “There are still a lot of gaps in ESG data coverage, so firms will need to acquire data from a number of sources and combine ratings, expert opinion and their own opinion, and apply internal analytics to come to a full picture.”
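To illustrate the kind of combination Groot describes, the sketch below blends scores from two hypothetical ESG vendors with an internal view, renormalising the weights where a vendor has a coverage gap. The field names, weights and figures are illustrative assumptions, not Alveo's implementation.

```python
# Illustrative only: combining ESG ratings from multiple vendors with an
# internal view. Issuers, column names, weights and scores are hypothetical.
import pandas as pd

# Vendor feeds often have coverage gaps (NaN = issuer not rated).
vendor_a = pd.Series({"ACME": 72.0, "GLOBEX": 55.0, "INITECH": None}, name="vendor_a")
vendor_b = pd.Series({"ACME": 68.0, "GLOBEX": None, "INITECH": 61.0}, name="vendor_b")
internal = pd.Series({"ACME": 70.0, "GLOBEX": 50.0, "INITECH": 65.0}, name="internal")

scores = pd.concat([vendor_a, vendor_b, internal], axis=1)

# Weighted average over whichever sources actually cover each issuer,
# so a coverage gap in one feed does not drag the composite down.
weights = pd.Series({"vendor_a": 0.4, "vendor_b": 0.4, "internal": 0.2})
weighted = scores.mul(weights, axis=1)
covered_weight = scores.notna().mul(weights, axis=1).sum(axis=1)
scores["composite"] = weighted.sum(axis=1) / covered_weight

print(scores)
```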
In terms of significant events at Alveo, the company recently announced an agreement with Cognizant to jointly offer an ESG data management solution that will enable firms to operationalise ESG data and integrate it into business processes for better portfolio management while ensuring regulatory compliance. “This partnership focuses on the need to onboard, prepare and provision ESG data to decision makers and stakeholders through the entire investment management process from research to asset allocation to client and regulatory reporting,” says Groot.
From a product development perspective, the company is working on extending its integration with third-party libraries and adding more commonly used libraries. Data integration is another area of development, building on a collaboration with FactSet to address buy-side clients’ ESG data integration requirements.
“We continue to extend the range of data sets we support ‘out of the box’ and add functionality to our user interface with more pre-packaged dashboards, enabling clients to set up and configure their own workflows to source, prepare and distribute data,” says Groot. Brokers and market makers tend to have specific requirements in terms of how they want to screen external data, ascertain its quality and derive data for portfolio testing, so data usage tracking has become increasingly important.
The data lineage functionality offered by Alveo enables users to trace data back to its source and verify the quality checks that were performed on it. Combined with usage metrics, this lets clients see which sources are most valuable and identify those that are under-used or not used at all.
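In outline, a lineage record can be pictured as metadata that travels with each data point. The following is a minimal, hypothetical sketch of lineage plus usage tracking; the class and method names are invented for illustration and are not Alveo's API.

```python
# Hypothetical sketch of lineage-plus-usage tracking; not Alveo's API.
from dataclasses import dataclass, field

@dataclass
class LineageRecord:
    field_name: str          # e.g. "closing_price"
    source: str              # originating vendor or feed
    checks_passed: list[str] = field(default_factory=list)  # validations applied
    access_count: int = 0    # usage metric: how often consumers read the value

    def record_check(self, check: str) -> None:
        self.checks_passed.append(check)

    def read(self) -> None:
        self.access_count += 1

# A consumer can trace a value back to its source and see its quality checks...
rec = LineageRecord("closing_price", source="VendorFeedX")
rec.record_check("stale-price check")
rec.record_check("tolerance vs. secondary source")
rec.read()

# ...while usage metrics across records reveal under-used or unused sources.
records = [rec, LineageRecord("coupon_rate", source="VendorFeedY")]
print([r.source for r in records if r.access_count == 0])  # ['VendorFeedY']
```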
Looking down the line, Groot expects a lot of post-trade operations to move to a data-as-a-service model over the next few years. He also points to growing interest in embedding artificial intelligence and machine learning in the decision-making process, not just in research and exploration but also in operations. “Then there is the shift to the cloud, with more clients deploying on Google, Azure, AWS or Oracle cloud infrastructure,” he concludes. “Our products are cloud agnostic, containerised and make the most of the flexibility and scalability cloud infrastructure provides.”
Alveo Client Case Study: Aegon
Aegon’s core business revolves around asset liability management, which is based on an economic modelling framework. The economic scenario generator simulates future market dynamics to model liabilities and the asset classes that will best hedge them. It requires current and historical time series market data as input.
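As a simplified picture of what such a generator does, the sketch below calibrates drift and volatility from a historical price series and simulates forward paths under geometric Brownian motion. The model choice and the numbers are purely illustrative assumptions; Aegon's actual framework is considerably richer.

```python
# Toy economic scenario generator: geometric Brownian motion calibrated
# from a historical time series. All inputs are illustrative assumptions.
import numpy as np

rng = np.random.default_rng(seed=42)

# Historical daily prices (stand-in for a real market data feed).
hist = np.array([100.0, 100.4, 99.8, 101.1, 102.0, 101.5, 102.7, 103.1])
log_returns = np.diff(np.log(hist))
mu, sigma = log_returns.mean(), log_returns.std(ddof=1)   # calibration step

# Simulate 1,000 one-year (252-step) scenarios from the last observed price.
n_paths, n_steps = 1000, 252
shocks = rng.normal(mu, sigma, size=(n_paths, n_steps))
paths = hist[-1] * np.exp(np.cumsum(shocks, axis=1))

# Downstream ALM would value assets and liabilities along each path;
# here we just summarise the simulated terminal distribution.
print(f"median terminal level: {np.median(paths[:, -1]):.1f}")
```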
Prior to selecting Alveo’s platform, Aegon’s actuarial, risk, asset management and regulatory reporting teams had individual responsibility for sourcing, cleansing, validating and delivering financial data to downstream applications.
Aegon now has a global high-quality market data infrastructure that has increased speed and efficiency, and reduced risk in the pricing and valuation process for insurance assets and liabilities. This infrastructure forms the bedrock of all other applications within Aegon Global that handle market data or need data to model economic exposures.
Alveo has also enabled the company’s actuaries to synthetically model longer-dated interest rate maturities and properly match deep out-of-the-money instruments with liabilities at the long end of the term spectrum.
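Extrapolating a curve beyond the last liquid maturity typically means fitting a parametric form to the observed points and reading off synthetic long-end yields. The sketch below uses a Nelson-Siegel fit as a stand-in; the functional form and inputs are assumptions for illustration, not Aegon's method (insurers commonly use approaches such as Smith-Wilson under Solvency II).

```python
# Illustrative long-end yield curve extrapolation via a Nelson-Siegel fit.
# Parametric form and inputs are assumptions, not Aegon's actual method.
import numpy as np

def ns_basis(t: np.ndarray, lam: float = 1.8) -> np.ndarray:
    """Nelson-Siegel basis functions evaluated at maturities t (in years)."""
    x = t / lam
    slope = (1 - np.exp(-x)) / x
    return np.column_stack([np.ones_like(t), slope, slope - np.exp(-x)])

# Observed liquid market yields out to 20 years (made-up numbers).
maturities = np.array([1.0, 2.0, 5.0, 10.0, 20.0])
yields = np.array([0.010, 0.013, 0.018, 0.022, 0.025])

# Fit the beta coefficients by least squares on the liquid points.
beta, *_ = np.linalg.lstsq(ns_basis(maturities), yields, rcond=None)

# Extrapolate synthetic yields beyond the last liquid maturity.
long_end = np.array([30.0, 40.0, 50.0])
print(dict(zip(long_end, ns_basis(long_end) @ beta)))
```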
The infrastructure captures how and when data is being overwritten or modelled, introducing the data lineage transparency desired by regulators and auditors.
“The sales process took around four months, with implementation of the first delivery to downstream systems completed within eight weeks,” explains Groot. “The project was overseen by the head of group risk methodology and approved by the chief risk officer and finance department.”