Clearwater Analytics, a Software-as-a-Service (SaaS) buy-side data aggregation and portfolio reporting specialist, this week launches a new machine learning-based information extraction service.
The data extraction solution focuses on data aggregation and normalization, drilling into transactional data to automate the ingestion of many types of data that are traditionally entered manually.
The solution uses advanced AI techniques including natural language processing (NLP) and deep learning to identify key data elements in a variety of document types, then extracts the data and feeds it into Clearwater’s data aggregation engine to be reconciled.
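To illustrate the kind of ingestion step being automated, here is a minimal sketch of pulling structured fields out of free-text documents. The document format, field names, and regex approach are all invented for illustration; Clearwater's actual service uses NLP and deep-learning models rather than hand-written patterns.

```python
import re
from dataclasses import dataclass

@dataclass
class TradeRecord:
    """Normalized output record, ready for a downstream aggregation engine."""
    security: str
    quantity: int
    price: float

# Hypothetical unstructured source document (e.g. an emailed trade confirmation).
DOC = """Trade Confirmation
Security: ACME CORP 4.5% 2030
Quantity: 1,000,000
Price: 99.875"""

def extract_trade(text: str) -> TradeRecord:
    # Locate each key data element in the raw text. A production system
    # would use trained models to handle varied layouts and document types.
    security = re.search(r"Security:\s*(.+)", text).group(1).strip()
    quantity = int(re.search(r"Quantity:\s*([\d,]+)", text).group(1).replace(",", ""))
    price = float(re.search(r"Price:\s*([\d.]+)", text).group(1))
    return TradeRecord(security, quantity, price)

record = extract_trade(DOC)
print(record)
```

The extracted record would then be fed into a reconciliation step, where it is matched against custodian or counterparty data before reporting.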
“We are committed to providing our clients with the most accurate data possible for their reporting needs,” says Warren Barkley, Chief Technology Officer at Clearwater Analytics. “Machine learning-backed data extraction eliminates the need for manual intervention with unstructured data and allows our clients faster access to more accurate information.”
Barkley joined Clearwater Analytics as technology chief just a few months ago from Amazon Web Services, where he focused on cloud computing solutions as a general manager in the AWS machine learning group. He replaced James Price, who recently moved over to take on the role of Chief Quality Officer.
Founded in 2004, the firm claims to report on more than $4 trillion in assets for clients including JP Morgan and Facebook. As it looks to evolve its SaaS automated investment data aggregation, reconciliation, accounting and reporting platform, the new service will be an interesting addition.
But is it in time? According to sources, the private equity-owned Clearwater is currently exploring a sale that could value the business at up to $2 billion, including debt. Welsh, Carson, Anderson & Stowe, the buyout firm which acquired its majority stake in the business in 2016 for an undisclosed amount, has hired an investment bank to review strategic options for Clearwater, Reuters reported in early August.