
Bloomberg Unveils BloombergGPT: A Large-Language Model Tailored for the Financial Industry


Bloomberg has introduced BloombergGPT, a generative artificial intelligence (AI) model specifically designed to enhance natural language processing (NLP) tasks within the financial sector. Developed using a vast range of financial data, this large language model (LLM) represents a significant step forward in the application of AI technology in the financial industry.

While recent advancements in AI and LLMs have generated promising new applications across various domains, the financial sector’s complexity and unique terminology necessitate a bespoke model. BloombergGPT will help improve existing financial NLP tasks such as sentiment analysis, named entity recognition, news classification, and question answering. Moreover, the model will unlock new possibilities for efficiently utilising the extensive data available on the Bloomberg Terminal, ensuring that customers reap the full benefits of AI in the financial realm.
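
To make the tasks above concrete, the sketch below shows what few-shot financial sentiment classification looks like when posed to a generic causal language model through the Hugging Face transformers pipeline. It is illustrative only: the stand-in model ("gpt2"), the prompt format, and the example headlines are assumptions for this sketch, not Bloomberg's model, prompts, or data.

```python
# Minimal sketch of a financial NLP task of the kind BloombergGPT targets:
# few-shot sentiment classification of market headlines, using a generic
# open causal LM as a stand-in (NOT BloombergGPT, which is not public).
from transformers import pipeline

generator = pipeline("text-generation", model="gpt2")  # stand-in model

prompt = (
    "Classify the sentiment of each headline as Positive, Negative, or Neutral.\n"
    "Headline: Acme Corp beats earnings estimates and raises guidance. Sentiment: Positive\n"
    "Headline: Regulator fines BigBank over reporting failures. Sentiment: Negative\n"
    "Headline: Treasury yields edge higher ahead of Fed minutes. Sentiment:"
)

# Greedy decoding of a few tokens; the continuation should be a sentiment label.
completion = generator(prompt, max_new_tokens=3, do_sample=False)[0]["generated_text"]
print(completion[len(prompt):].strip())
```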

As a pioneer in AI, machine learning, and NLP applications in finance for over a decade, Bloomberg now supports an extensive range of NLP tasks that stand to gain from a finance-aware language model. The company’s researchers took a mixed approach, combining financial data with general-purpose datasets, to train a model that excels on financial benchmarks while remaining competitive on general-purpose LLM benchmarks.

“For all the reasons generative LLMs are attractive – few-shot learning, text generation, conversational systems, etc. – we see tremendous value in having developed the first LLM focused on the financial domain,” commented Shawn Edwards, Bloomberg’s Chief Technology Officer. “BloombergGPT will enable us to tackle many new types of applications, while it delivers much higher performance out-of-the-box than custom models for each application, at a faster time-to-market.”

Bloomberg’s ML Product and Research group joined forces with the AI Engineering team to build one of the largest domain-specific datasets to date, leveraging the company’s existing data creation, collection, and curation resources. Bloomberg’s data analysts have been amassing and managing financial language documents for forty years, and the team utilised this extensive archive to create a comprehensive dataset of 363 billion English-language financial tokens.

The team supplemented this data with a 345-billion-token public dataset, resulting in a training corpus of more than 700 billion tokens. They then trained a 50-billion-parameter, decoder-only causal language model on a portion of this corpus. Validated on existing finance-specific NLP benchmarks, a suite of Bloomberg internal benchmarks, and general-purpose NLP tasks drawn from well-known benchmarks, BloombergGPT outperforms comparable open models on financial tasks by considerable margins while matching or exceeding their performance on general NLP benchmarks.
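
To place these figures in context, the sketch below instantiates a toy decoder-only causal language model with the Hugging Face transformers library and works out the corpus mix implied by the token counts above. Every hyperparameter in the toy config is an assumption chosen purely for illustration; it is not BloombergGPT's architecture or training code, and the real model is three orders of magnitude larger.

```python
# Toy decoder-only causal LM, to illustrate the architecture class described
# in the article. BloombergGPT itself is a 50B-parameter model trained on a
# mixed corpus of roughly 363B financial + 345B public tokens; everything
# below is scaled down and assumed for illustration only.
from transformers import GPT2Config, GPT2LMHeadModel

config = GPT2Config(
    vocab_size=50257,  # illustrative; BloombergGPT uses its own tokenizer
    n_positions=2048,  # context length (assumed for this sketch)
    n_embd=512,        # hidden size, far smaller than a 50B-parameter model
    n_layer=8,
    n_head=8,
)
model = GPT2LMHeadModel(config)  # decoder-only causal language model

n_params = sum(p.numel() for p in model.parameters())
print(f"Toy decoder-only model with {n_params / 1e6:.1f}M parameters")

# Rough corpus mix implied by the figures reported in the article.
financial_tokens = 363e9
public_tokens = 345e9
total = financial_tokens + public_tokens
print(f"Total ≈ {total / 1e9:.0f}B tokens "
      f"({financial_tokens / total:.0%} financial, {public_tokens / total:.0%} public)")
```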

“The quality of machine learning and NLP models comes down to the data you put into them,” said Gideon Mann, Head of Bloomberg’s ML Product and Research team. “Thanks to the collection of financial documents Bloomberg has curated over four decades, we were able to carefully create a large and clean, domain-specific dataset to train a LLM that is best suited for financial use cases. We’re excited to use BloombergGPT to improve existing NLP workflows, while also imagining new ways to put this model to work to delight our customers.”

