About A-Team Marketing Services
The knowledge platform for the financial technology industry

A-Team Insight Blogs

Databricks Extends Capabilities of Lakehouse Data and AI Platform


Databricks, provider of the Lakehouse data and AI platform, has extended the platform’s capabilities with the addition of advanced data warehousing and governance, data sharing innovations including an analytics marketplace and data clean rooms for data collaboration, automatic cost optimisation for ETL operations, and machine learning (ML) lifecycle improvements.

The company, founded by the creators of open source solutions Delta Lake, Apache Spark and MLflow, works across business sectors including financial services, where its customer base includes the likes of Nasdaq, ABN Amro, Schroders, FIS, and Swedbank.

“Our customers want to be able to do business intelligence, AI and machine learning on one platform, where their data already resides. Databricks Lakehouse Platform gives data teams all of this on a simple, open, and multi-cloud platform,” says Ali Ghodsi, co-founder and CEO at Databricks.

The company’s additional data warehousing capabilities include Databricks SQL Serverless, available in preview on AWS and providing fully managed elastic compute for improved performance at a lower cost; Photon, a query engine for lakehouse systems that will be made generally available on Databricks Workspaces in coming weeks; open source connectors for Go, Node.js, and Python, to make it simpler to access the lakehouse from operational applications; and Databricks SQL CLI, enabling developers and analysts to run queries directly from their local computers.
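The new open source connectors follow Python's standard DB-API 2.0 cursor pattern. As a rough sketch of that pattern (using sqlite3 here as a local stand-in, since connecting to a real Databricks SQL endpoint requires a server hostname, HTTP path, and access token):

```python
import sqlite3  # local stand-in: the Databricks Python connector exposes the same DB-API 2.0 shape

# With the real connector, the connection would instead be created along the lines of:
#   from databricks import sql
#   conn = sql.connect(server_hostname=..., http_path=..., access_token=...)
conn = sqlite3.connect(":memory:")
cur = conn.cursor()

# Table and data are illustrative only
cur.execute("CREATE TABLE trades (symbol TEXT, qty INTEGER)")
cur.execute("INSERT INTO trades VALUES ('NDAQ', 100), ('FIS', 50)")

cur.execute("SELECT symbol, qty FROM trades ORDER BY symbol")
rows = cur.fetchall()
print(rows)  # [('FIS', 50), ('NDAQ', 100)]
conn.close()
```

Because the connectors speak this common interface, an operational application can swap a local database for a lakehouse endpoint with minimal code change.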

Data governance additions include Unity Catalog, which will be made generally available on AWS and Azure, and provides centralised governance for all data and AI assets, with built-in search and discovery, and automated lineage for all workloads.

The company’s marketplace for data and AI will be available later this year, providing a place to package and distribute data and analytics assets. Unlike pure data marketplaces, Databricks’ offering enables data providers to package and monetise assets such as data tables, files, machine learning models, notebooks and analytics dashboards. Cleanrooms, also available later this year, will provide a way to share and join data across organisations with a secure, hosted environment and no data replication required.

ML advancements include MLflow 2.0, which introduces MLflow Pipelines to handle the operational setup of ML for users. Instead of orchestrating notebooks by hand, users define the elements of a pipeline in a configuration file and MLflow Pipelines manages execution automatically. Beyond MLflow, Databricks has added serverless model endpoints to directly support production model hosting, as well as model monitoring dashboards for analysing real-world model performance.
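The configuration-file approach might look roughly like the following hypothetical pipeline file; the step names and keys are illustrative of the idea, not the exact MLflow Pipelines schema:

```yaml
# Hypothetical pipeline.yaml -- illustrative only, not the exact MLflow schema
template: "regression/v1"
target_col: "price"
steps:
  ingest:
    location: "data/training.parquet"
  split:
    split_ratios: [0.75, 0.125, 0.125]
  train:
    estimator: "sklearn.linear_model.LinearRegression"
  evaluate:
    validation_criteria:
      - metric: root_mean_squared_error
        threshold: 10
```

The framework, rather than user-written orchestration code, is then responsible for running the ingest, split, train, and evaluate steps in order.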

Delta Live Tables is an ETL framework using a simple, declarative approach to building data pipelines. Since its introduction earlier this year, Databricks has expanded the framework with a new performance optimisation layer designed to speed up execution and reduce the costs of ETL.
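"Declarative" here means the pipeline author defines each table as a function of other tables and the framework works out execution order. A minimal pure-Python sketch of that idea (the decorator, registry, and table names below are illustrative, not the actual Delta Live Tables API):

```python
# Minimal sketch of a declarative pipeline: tables are functions of other
# tables, and the framework (not the author) resolves execution order.
# Illustrative only -- not the real Delta Live Tables API.
tables = {}  # name -> (dependencies, build function)

def table(*deps):
    """Register a table-building function along with the tables it depends on."""
    def register(fn):
        tables[fn.__name__] = (deps, fn)
        return fn
    return register

@table()
def raw_trades():
    # Stand-in for reading raw data from storage
    return [{"symbol": "NDAQ", "qty": 100}, {"symbol": "FIS", "qty": -5}]

@table("raw_trades")
def clean_trades(raw_trades):
    # Simple data-quality rule: drop non-positive quantities
    return [row for row in raw_trades if row["qty"] > 0]

def run(name, built=None):
    """Build `name`, recursively building its dependencies first."""
    built = {} if built is None else built
    if name not in built:
        deps, fn = tables[name]
        built[name] = fn(*(run(d, built) for d in deps))
    return built[name]

print(run("clean_trades"))  # [{'symbol': 'NDAQ', 'qty': 100}]
```

Because the dependency graph is explicit, a runtime sitting behind such an interface is free to reorder, cache, or optimise execution, which is the kind of leverage the new performance optimisation layer exploits.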

Ghodsi concludes: “These new capabilities are advancing our Lakehouse vision to make it faster and easier than ever before to maximise the value of data, both within and across companies.”

