
A-Team Insight Blogs

Snowflake Partnership Offers Access to Mistral AI Large Language Models


Snowflake has entered a global partnership with Mistral AI, a provider of AI Large Language Model (LLM) solutions. The multi-year partnership, which includes a parallel investment in Mistral AI’s Series A round from Snowflake Ventures, will give Snowflake customers access to Mistral AI’s newest and most powerful LLM, Mistral Large. The model offers reasoning capabilities, is proficient in code and mathematics, and is fluent in five languages – French, English, German, Spanish and Italian. It can also process hundreds of pages of documents in a single call.

In addition, Snowflake customers can gain access to Mixtral 8x7B, Mistral AI’s open-source model, and Mistral 7B, Mistral AI’s first foundation model, which is optimised for low latency with a low memory requirement and high throughput for its size. The models are available to customers in public preview as part of Snowflake Cortex, Snowflake’s fully managed LLM and vector search service that enables organisations to accelerate analytics and quickly and securely build AI apps with their enterprise data.
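To make this concrete, the sketch below shows how a customer might call one of the Mistral models through Cortex from Python, using the Snowflake connector and the SNOWFLAKE.CORTEX.COMPLETE SQL function. The connection details, the sample prompt and the model identifier (‘mistral-large’) are illustrative assumptions rather than details taken from the announcement.

```python
# Minimal sketch: calling a Mistral model via Snowflake Cortex from Python.
# Connection parameters, prompt and model name are illustrative assumptions.
import snowflake.connector

conn = snowflake.connector.connect(
    account="my_account",      # placeholder account identifier
    user="my_user",            # placeholder user
    password="my_password",    # prefer key-pair auth or SSO in practice
    warehouse="my_warehouse",  # placeholder warehouse
)

try:
    cur = conn.cursor()
    # SNOWFLAKE.CORTEX.COMPLETE runs the chosen LLM inside the Data Cloud,
    # so the prompt and the enterprise data stay within Snowflake's boundary.
    cur.execute(
        "SELECT SNOWFLAKE.CORTEX.COMPLETE("
        "  'mistral-large',"
        "  'Summarise the key points of the following document: ' || %s"
        ")",
        ("<document text pulled from an existing table>",),
    )
    print(cur.fetchone()[0])
finally:
    conn.close()
```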

“By partnering with Mistral AI, Snowflake is putting one of the most powerful LLMs on the market directly in the hands of customers, empowering every user to build cutting-edge, AI-powered apps with simplicity and scale,” says Sridhar Ramaswamy, CEO of Snowflake. “With Snowflake as the trusted data foundation, we’re transforming how enterprises harness the power of LLMs through Snowflake Cortex so they can cost-effectively address new AI use cases within the security and privacy boundaries of the Data Cloud.”

Arthur Mensch, CEO and co-founder of Mistral AI, adds: “With our models available in the Snowflake Data Cloud, we are able to further democratise AI so users can create more sophisticated AI apps that drive value at a global scale.”

At Snowday 2023, Snowflake announced that Snowflake Cortex would support industry-leading LLMs for specialised tasks such as sentiment analysis, translation, and summarisation, alongside foundation LLMs, starting with Meta AI’s Llama 2 model, for use cases including retrieval-augmented generation. The company is continuing to invest in its generative AI efforts by partnering with Mistral AI and expanding the suite of foundation LLMs in Snowflake Cortex, providing organisations with a path to bring generative AI to every part of the business.

To deliver a serverless experience that makes AI accessible to a broad set of users, Snowflake Cortex removes the lengthy procurement cycles and complex management associated with GPU infrastructure. Snowflake has partnered with NVIDIA to deliver a full-stack accelerated computing platform that leverages the NVIDIA Triton Inference Server, among other tools.

With Snowflake Cortex LLM functions now in public preview, Snowflake users can apply AI to their enterprise data across a wide range of use cases. Users with SQL skills can leverage smaller LLMs to cost-effectively address specific tasks such as sentiment analysis, translation, and summarisation in seconds. For more complex use cases, Python developers can go from concept to full-stack AI apps, such as chatbots, in minutes by combining the power of foundation LLMs, including Mistral AI’s LLMs in Snowflake Cortex, with chat elements, soon to be in public preview, within Streamlit in Snowflake.
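As a minimal sketch of that second path, the Python snippet below outlines a chatbot built with Streamlit’s chat elements and a Mistral model served by Snowflake Cortex. The snowflake.cortex import path, the model name (‘mistral-7b’) and the app structure are assumptions for illustration, not details drawn from the announcement.

```python
# Minimal sketch of a Streamlit-in-Snowflake chatbot backed by a Mistral model
# in Snowflake Cortex. Import path, model name and app structure are assumed.
import streamlit as st
from snowflake.cortex import Complete  # Cortex Python API (assumed available)

st.title("Cortex chat demo")

# Keep the running conversation in Streamlit's session state.
if "messages" not in st.session_state:
    st.session_state.messages = []

# Replay prior turns using Streamlit's chat elements.
for msg in st.session_state.messages:
    with st.chat_message(msg["role"]):
        st.write(msg["content"])

# Accept a new user prompt and answer it with a Cortex-hosted Mistral model.
if prompt := st.chat_input("Ask a question about your data"):
    st.session_state.messages.append({"role": "user", "content": prompt})
    with st.chat_message("user"):
        st.write(prompt)

    # Assumes an active Snowflake session, as in Streamlit in Snowflake;
    # a smaller model is used here for low latency.
    answer = Complete("mistral-7b", prompt)
    st.session_state.messages.append({"role": "assistant", "content": answer})
    with st.chat_message("assistant"):
        st.write(answer)
```

Because the model runs inside Snowflake Cortex, the idea is that the prompt and any data used to ground the answer would stay within the security and privacy boundary of the Data Cloud described above.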



Related content

WEBINAR

Recorded Webinar: In data we trust – How to ensure high quality data to power AI

Artificial intelligence is increasingly powering financial institutions’ processes and workflows, encompassing all parts of the enterprise from the front office to the back office. As organisations seek to gain a competitive edge, they are trialling the technology in a variety of ways to streamline and empower multiple use cases. Some are further than others along the path to achieving...

BLOG

Bloomberg Debuts Real-Time Events Data Feed

Bloomberg has broken new ground with the release of its Real-time Events Data solution, which it says will help financial institutions make better decisions faster, based on the most accurate and timely information. The US financial data and technology behemoth has leveraged its real-time streaming API connectivity to provide subscribing clients with data from earnings...

EVENT

AI in Capital Markets Summit New York

The AI in Capital Markets Summit will explore current and emerging trends in AI, the potential of Generative AI and LLMs, and how AI can be applied for efficiencies and business value across a number of use cases in the front and back office of financial institutions. The agenda will also cover the risks and challenges of adopting AI and the foundational technologies and data management capabilities that underpin successful deployment.

GUIDE

AI in Capital Markets: Practical Insight for a Transforming Industry – Free Handbook

AI is no longer on the horizon – it’s embedded in the infrastructure of modern capital markets. But separating real impact from inflated promises requires a grounded, practical understanding. The AI in Capital Markets Handbook 2025 provides exactly that. Designed for data-driven professionals across the trade life-cycle, compliance, infrastructure, and strategy, this handbook goes beyond...