About A-Team Marketing Services
The knowledge platform for the financial technology industry

A-Team Insight Blogs

Databricks Extends Capabilities of Lakehouse Data and AI Platform


Databricks, provider of the Lakehouse data and AI platform, has extended the platform’s capabilities with advanced data warehousing and governance; data sharing innovations, including an analytics marketplace and data clean rooms for collaboration; automatic cost optimisation for ETL operations; and machine learning (ML) lifecycle improvements.

The company, founded by the creators of open source solutions Delta Lake, Apache Spark and MLflow, works across business sectors including financial services, where its customer base includes the likes of Nasdaq, ABN Amro, Schroders, FIS, and Swedbank.

“Our customers want to be able to do business intelligence, AI and machine learning on one platform, where their data already resides. Databricks Lakehouse Platform gives data teams all of this on a simple, open, and multi-cloud platform,” says Ali Ghodsi, co-founder and CEO at Databricks.

The company’s additional data warehousing capabilities include Databricks SQL Serverless, available in preview on AWS and providing fully managed elastic compute for improved performance at lower cost; Photon, a query engine for lakehouse systems that will be made generally available on Databricks Workspaces in the coming weeks; open source connectors for Go, Node.js, and Python, which make it simpler to access the lakehouse from operational applications; and Databricks SQL CLI, which enables developers and analysts to run queries directly from their local computers.

Data governance additions include Unity Catalog, which will be made generally available on AWS and Azure, and provides centralised governance for all data and AI assets, with built-in search and discovery, and automated lineage for all workloads.

The company’s marketplace for data and AI will be available later this year, providing a place to package and distribute data and analytics assets. Unlike pure data marketplaces, Databricks’ offering enables data providers to package and monetise assets such as data tables, files, machine learning models, notebooks and analytics dashboards. Cleanrooms, also available later this year, will provide a way to share and join data across organisations with a secure, hosted environment and no data replication required.

ML advancements include MLflow 2.0, which includes MLflow Pipelines to handle the operational setup of ML for users. Instead of orchestrating notebooks by hand, users define the elements of the pipeline in a configuration file and MLflow Pipelines manages execution automatically. Beyond MLflow, Databricks has added serverless model endpoints to directly support production model hosting, as well as model monitoring dashboards to analyse real-world model performance.
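The configuration-driven idea described above can be sketched in plain Python. This is an illustrative stand-in, not the actual MLflow Pipelines API: the step names, the step registry, and the run_pipeline helper are all hypothetical.

```python
# Illustrative sketch of a configuration-driven ML pipeline, in the spirit of
# MLflow Pipelines. NOT the MLflow API: the registry and run_pipeline helper
# are hypothetical stand-ins for the real framework.

# The user declares the pipeline in configuration; the framework sequences it.
config = {
    "steps": ["ingest", "transform", "train"],
}

def ingest(state):
    state["data"] = [1, 2, 3, 4]          # stand-in for loading raw data
    return state

def transform(state):
    state["data"] = [x * 2 for x in state["data"]]  # stand-in feature step
    return state

def train(state):
    # Trivial "model": the mean of the transformed data.
    state["model"] = sum(state["data"]) / len(state["data"])
    return state

STEP_REGISTRY = {"ingest": ingest, "transform": transform, "train": train}

def run_pipeline(cfg):
    # The framework, not the user, drives execution of the declared steps.
    state = {}
    for step_name in cfg["steps"]:
        state = STEP_REGISTRY[step_name](state)
    return state

result = run_pipeline(config)
print(result["model"])  # → 5.0
```

The point of the pattern is that reordering or swapping steps is a configuration change, not a rewrite of orchestration code.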

Delta Live Tables is an ETL framework that takes a simple, declarative approach to building data pipelines. Since its introduction earlier this year, Databricks has expanded the framework with a new performance optimisation layer designed to speed up execution and reduce the cost of ETL.
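The declarative style can be illustrated with a minimal, self-contained sketch. The @table decorator and read helper here are hypothetical, not the actual Delta Live Tables API (which runs inside Databricks); they only show the idea of declaring tables in terms of each other and letting the framework infer execution order.

```python
# Minimal illustration of declarative pipeline definition, loosely modelled on
# Delta Live Tables. The @table decorator and read() helper are hypothetical;
# the real framework runs inside Databricks.

_TABLES = {}   # registry of declared table definitions
_CACHE = {}    # memoised results, so each table is built at most once

def table(fn):
    # Register the function as a named table definition instead of running it.
    _TABLES[fn.__name__] = fn
    return fn

def read(name):
    # Materialise a table on demand, recursively building its dependencies.
    if name not in _CACHE:
        _CACHE[name] = _TABLES[name]()
    return _CACHE[name]

@table
def raw_trades():
    return [{"sym": "ABC", "price": 10.0}, {"sym": "XYZ", "price": -1.0}]

@table
def clean_trades():
    # Declared in terms of its upstream table; the order of definitions in
    # the file does not matter, only the declared dependency.
    return [row for row in read("raw_trades") if row["price"] > 0]

print(read("clean_trades"))  # → [{'sym': 'ABC', 'price': 10.0}]
```

Because the pipeline is a set of declarations rather than an imperative script, a framework built this way has a natural seam for inserting an optimisation layer between declaration and execution.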

Ghodsi concludes: “These new capabilities are advancing our Lakehouse vision to make it faster and easier than ever before to maximise the value of data, both within and across companies.”
