The knowledge platform for the financial technology industry

A-Team Insight Blogs

Databricks Extends Capabilities of Lakehouse Data and AI Platform


Databricks, provider of the Lakehouse data and AI platform, has extended the platform’s capabilities with the addition of advanced data warehousing and governance, data sharing innovations including an analytics marketplace and data clean rooms for data collaboration, automatic cost optimisation for ETL operations, and machine learning (ML) lifecycle improvements.

The company, founded by the creators of open source solutions Delta Lake, Apache Spark and MLflow, works across business sectors including financial services, where its customer base includes the likes of Nasdaq, ABN Amro, Schroders, FIS, and Swedbank.

“Our customers want to be able to do business intelligence, AI and machine learning on one platform, where their data already resides. Databricks Lakehouse Platform gives data teams all of this on a simple, open, and multi-cloud platform,” says Ali Ghodsi, co-founder and CEO at Databricks.

The company’s additional data warehousing capabilities include Databricks SQL Serverless, available in preview on AWS and providing fully managed elastic compute for improved performance at a lower cost; Photon, a query engine for lakehouse systems that will be made generally available on Databricks Workspaces in coming weeks; open source connectors for Go, Node.js, and Python, to make it simpler to access the lakehouse from operational applications; and Databricks SQL CLI, enabling developers and analysts to run queries directly from their local computers.
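The new connectors expose the lakehouse through familiar database-style APIs. As a rough illustration, a helper built on the open source Python connector (`databricks-sql-connector`) might look like the sketch below; the hostname, HTTP path, and token are placeholders to be substituted with your own workspace's values, and the function name is hypothetical:

```python
# Minimal sketch using the open source databricks-sql-connector package
# (pip install databricks-sql-connector). All connection values below are
# placeholders, not real endpoints.

def fetch_rows(server_hostname: str, http_path: str, access_token: str,
               query: str) -> list:
    """Run a query against a Databricks SQL endpoint and return all rows."""
    # Imported lazily so this sketch can be read/loaded without the package.
    from databricks import sql

    with sql.connect(server_hostname=server_hostname,
                     http_path=http_path,
                     access_token=access_token) as connection:
        with connection.cursor() as cursor:
            cursor.execute(query)
            return cursor.fetchall()
```

A call would then look like `fetch_rows("<workspace-host>", "<http-path>", "<token>", "SELECT 1")`, mirroring the standard cursor-based pattern the Go and Node.js connectors also follow.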

Data governance additions include Unity Catalog, which will be made generally available on AWS and Azure, and provides centralised governance for all data and AI assets, with built-in search and discovery, and automated lineage for all workloads.

The company’s marketplace for data and AI will be available later this year, providing a place to package and distribute data and analytics assets. Unlike pure data marketplaces, Databricks’ offering enables data providers to package and monetise assets such as data tables, files, machine learning models, notebooks and analytics dashboards. Clean rooms, also available later this year, will provide a way to share and join data across organisations in a secure, hosted environment, with no data replication required.

ML advancements include MLflow 2.0, which includes MLflow Pipelines to handle the operational setup of ML for users. Instead of setting up orchestration of notebooks, users define the elements of the pipeline in a configuration file and MLflow Pipelines manages execution automatically. Beyond MLflow, Databricks has added serverless model endpoints to directly support production model hosting, as well as model monitoring dashboards to analyse real-world model performance.
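The configuration-driven approach might look something like the sketch below, a hypothetical `pipeline.yaml` loosely modelled on MLflow's templated pipelines; the field names and values here are illustrative assumptions rather than the exact MLflow schema:

```yaml
# pipeline.yaml -- illustrative sketch only; consult the MLflow
# documentation for the actual configuration schema.
template: "regression/v1"          # pipeline template to instantiate
target_col: "fare_amount"          # column the model should predict
steps:
  ingest:
    using: parquet                 # source data format
    location: "data/trips.parquet" # placeholder path
  split:
    split_ratios: [0.75, 0.125, 0.125]  # train / validation / test
  evaluate:
    validation_criteria:
      - metric: root_mean_squared_error
        threshold: 10              # gate promotion on model quality
```

The point of the design is that orchestration, caching, and step ordering are derived from this declaration rather than hand-wired across notebooks.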

Delta Live Tables is an ETL framework using a simple, declarative approach to building data pipelines. Since its introduction earlier this year, Databricks has expanded the framework with a new performance optimisation layer designed to speed up execution and reduce the costs of ETL.
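In the declarative model, a pipeline step is expressed as a table definition rather than imperative job code. A hedged sketch in Delta Live Tables' SQL flavour (table names and the filter condition are invented for illustration):

```sql
-- Illustrative only: declares a cleaned table derived from a raw feed.
-- The framework infers dependencies from the LIVE reference and manages
-- execution and optimisation itself.
CREATE OR REFRESH LIVE TABLE trades_clean
  COMMENT "Validated trades derived declaratively from the raw feed"
AS SELECT *
   FROM live.trades_raw
   WHERE trade_id IS NOT NULL;
```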

Ghodsi concludes: “These new capabilities are advancing our Lakehouse vision to make it faster and easier than ever before to maximise the value of data, both within and across companies.”

