The knowledge platform for the financial technology industry

A-Team Insight Blogs

Crux Informatics Goes Live with Data Quality/Transformation Offerings as New CEO Freiberg Takes Helm


New York-based financial data delivery platform operator Crux Informatics has launched two new services, Crux Data Protect and Crux Wrangle, to help firms ensure data quality and transform datasets to meet internal requirements. The company has also hired a new CEO, Will Freiberg, replacing founder Philip Brittan, the former head of Thomson Reuters' desktop business, who stepped down in March.

The new leadership and product launches follow Crux's $36 million funding round at the beginning of the year, which included Citi, Goldman Sachs, Morgan Stanley and reference client Two Sigma. Freiberg most recently held leadership positions at cloud technology specialists D2IQ and Mesosphere, as well as a series of advisory and board roles at tech firms in the San Francisco area. Co-founder Brittan remains with the company, focused on partnership and advisory initiatives.

Crux Data Protect allows users to monitor data quality and anomalies across all datasets on the Crux Platform, creating validations and defining thresholds that meet their data needs across a range of categories, including freshness, completeness, accuracy, formats and distributions. Crux Wrangle gives users the ability to transform data pipelines so data is presented exactly how they want it, and includes data transformation functions that allow users to format, fill, filter, calculate, combine and reshape data any way they choose.
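The validation categories described above correspond to familiar data engineering checks. As an illustrative sketch only (Crux's actual API is not documented in this article, and the dataset, field names and thresholds below are hypothetical), freshness, completeness and distribution validations might look like this in pandas:

```python
import pandas as pd

# Hypothetical vendor dataset: daily prices with a few common defects.
df = pd.DataFrame({
    "symbol": ["AAPL", "MSFT", "AAPL", "GOOG"],
    "date": ["2023-01-03", "2023-01-03", "2023-01-04", None],
    "price": [125.07, 239.58, None, 89.70],
})
df["date"] = pd.to_datetime(df["date"])

# Completeness: every row must have all required fields populated.
completeness_ok = df[["symbol", "date", "price"]].notna().all(axis=1)

# Freshness: the most recent date must fall within a defined threshold
# of the expected as-of date.
expected_asof = pd.Timestamp("2023-01-04")
fresh = (expected_asof - df["date"].max()).days <= 1

# Distribution/format: prices must be positive and within a sane range.
price_ok = df["price"].dropna().between(0, 10_000).all()

report = {
    "rows_complete": int(completeness_ok.sum()),
    "rows_total": len(df),
    "fresh": bool(fresh),
    "prices_in_range": bool(price_ok),
}
print(report)
```

An operations team could run checks like these per pipeline and alert on any threshold breach, which is the "data quality-as-a-service" model described below.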

“This is effectively data quality-as-a-service, and data wrangling-as-a-service, something that can be implemented and managed by an operations team rather than an expensive coding team,” says Michael Rude, Head of Go to Market at Crux Informatics. “We already have a really good set of base cases that service 75% of our client demands. Over the course of this year, we’re going to be offering increasingly complex wrangles and data quality validations, such as joining multiple files, aggregating data, and pivoting it in such a way that clients can see a very specific view across a number of data sets.”
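The more complex wrangles Rude describes, joining multiple files, aggregating data and pivoting it into a client-specific view, map onto standard reshaping operations. A minimal hypothetical sketch in pandas (the files, keys and columns are invented for illustration):

```python
import pandas as pd

# Two hypothetical vendor files to be joined on a common key.
prices = pd.DataFrame({
    "symbol": ["AAPL", "AAPL", "MSFT", "MSFT"],
    "date": ["2023-01-03", "2023-01-04", "2023-01-03", "2023-01-04"],
    "price": [125.07, 126.36, 239.58, 229.10],
})
sectors = pd.DataFrame({
    "symbol": ["AAPL", "MSFT"],
    "sector": ["Tech", "Tech"],
})

# Join the two files, then aggregate the average price per symbol.
joined = prices.merge(sectors, on="symbol", how="left")
avg = joined.groupby("symbol", as_index=False)["price"].mean()

# Pivot to one row per date with a column per symbol: a single
# specific view assembled across multiple datasets.
view = prices.pivot(index="date", columns="symbol", values="price")
print(view)
```

The point of offering such transforms as a managed service is that the join keys, aggregation rules and output shape become pipeline configuration rather than bespoke client code.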

The two new services are designed to provide analytics-ready data, allowing users to focus on the data science rather than getting bogged down with data quality checks, normalization and transformation tasks. “Clients are increasingly asking for help with data quality and content validations, not just technical validations,” says Rude. “They need a data hub that provides not just data delivery, but analytics-ready data, personalised data, and data that’s ready to be ingested into applications for a range of use cases.”

Crux’s objective is to provide a unified approach to data engineering, says Rude. “CIOs and CTOs are increasingly looking to outsource data management, but you really cannot outsource and leverage a third-party application without capabilities like these.” Crux Data Protect and Crux Wrangle are now fully operational on any data pipeline delivered via Crux.

