
Snowflake Partnership Offers Access to Mistral AI Large Language Models

Snowflake has formed a global partnership with Mistral AI, a provider of AI Large Language Model (LLM) solutions. The multi-year partnership, which includes a parallel investment in Mistral AI’s Series A round from Snowflake Ventures, will give Snowflake customers access to Mistral AI’s newest and most powerful LLM, Mistral Large, which offers reasoning capabilities, is proficient in code and mathematics, and is fluent in five languages – French, English, German, Spanish and Italian. It can also process hundreds of pages of documents in a single call.

In addition, Snowflake customers gain access to Mixtral 8x7B, Mistral AI’s open-source model, and Mistral 7B, Mistral AI’s first foundation model, which is optimised for low latency with a low memory requirement and high throughput for its size. The models are available to customers in public preview as part of Snowflake Cortex, Snowflake’s fully managed LLM and vector search service that enables organisations to accelerate analytics and quickly and securely build AI apps with their enterprise data.
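
For readers curious how this looks in practice, the minimal sketch below calls Mistral Large through the Cortex COMPLETE function from Snowpark for Python. It is illustrative only: the connection parameters are placeholders and the model identifier string (‘mistral-large’) is an assumption inferred from the product naming above rather than a detail confirmed in this article.

```python
# Minimal sketch (not from the article): calling Mistral Large through
# Snowflake Cortex from Snowpark for Python. Connection parameters are
# placeholders and the model identifier string is assumed.
from snowflake.snowpark import Session

connection_parameters = {
    "account": "<account_identifier>",
    "user": "<user>",
    "password": "<password>",
    "warehouse": "<warehouse>",
    "database": "<database>",
    "schema": "<schema>",
}

session = Session.builder.configs(connection_parameters).create()

# SNOWFLAKE.CORTEX.COMPLETE runs the prompt against the chosen model
# inside the Data Cloud, so enterprise data stays within Snowflake's
# security and governance boundary.
row = session.sql(
    "SELECT SNOWFLAKE.CORTEX.COMPLETE("
    "'mistral-large', "
    "'Summarise the key obligations in this supplier contract: ...'"
    ") AS answer"
).collect()[0]

print(row["ANSWER"])
```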

“By partnering with Mistral AI, Snowflake is putting one of the most powerful LLMs on the market directly in the hands of customers, empowering every user to build cutting-edge, AI-powered apps with simplicity and scale,” says Sridhar Ramaswamy, CEO of Snowflake. “With Snowflake as the trusted data foundation, we’re transforming how enterprises harness the power of LLMs through Snowflake Cortex so they can cost-effectively address new AI use cases within the security and privacy boundaries of the Data Cloud.”

Arthur Mensch, CEO and co-founder of Mistral AI, adds: “With our models available in the Snowflake Data Cloud, we are able to further democratise AI so users can create more sophisticated AI apps that drive value at a global scale.”

Snowflake first announced Snowflake Cortex support for industry-leading LLMs at Snowday 2023, covering specialised tasks such as sentiment analysis, translation, and summarisation alongside foundation LLMs, starting with Meta AI’s Llama 2 model, for use cases including retrieval-augmented generation. The company is continuing to invest in its generative AI efforts by partnering with Mistral AI and expanding the suite of foundation LLMs in Snowflake Cortex, giving organisations a path to bring generative AI to every part of the business.

To deliver a serverless experience that makes AI accessible to a broad set of users, Snowflake Cortex eliminates lengthy GPU procurement cycles and complex infrastructure management. To that end, Snowflake has partnered with NVIDIA to deliver a full-stack accelerated computing platform that leverages NVIDIA Triton Inference Server, among other tools.

With Snowflake Cortex LLM functions now in public preview, Snowflake users can apply AI to their enterprise data across a wide range of use cases. Users with SQL skills can call smaller LLMs to cost-effectively address specific tasks such as sentiment analysis, translation, and summarisation in seconds. For more complex use cases, Python developers can go from concept to full-stack AI apps such as chatbots in minutes by combining foundation LLMs, including Mistral AI’s LLMs in Snowflake Cortex, with chat elements, soon to be in public preview, within Streamlit in Snowflake.
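
For the Python route described above, the hedged sketch below outlines a simple Streamlit in Snowflake chat app that forwards user messages to a Mistral model via Snowflake Cortex. st.chat_input and st.chat_message are standard Streamlit chat elements; their use inside Streamlit in Snowflake here assumes the public preview noted above, and the model identifier and query pattern are assumptions for illustration.

```python
# Hedged sketch of a Streamlit in Snowflake chat app that sends user
# messages to a Mistral model via Snowflake Cortex. Model identifier
# and prompt handling are illustrative assumptions.
import streamlit as st
from snowflake.snowpark.context import get_active_session

session = get_active_session()  # session provided by Streamlit in Snowflake

st.title("Enterprise data chatbot (illustrative)")

if "history" not in st.session_state:
    st.session_state.history = []

# Replay the conversation so far.
for role, text in st.session_state.history:
    with st.chat_message(role):
        st.write(text)

prompt = st.chat_input("Ask a question about your data")
if prompt:
    st.session_state.history.append(("user", prompt))
    with st.chat_message("user"):
        st.write(prompt)

    # Naive quoting for brevity; a production app would use proper
    # parameter binding. Task-specific functions such as
    # SNOWFLAKE.CORTEX.SENTIMENT, TRANSLATE and SUMMARIZE follow the
    # same single-call pattern.
    safe_prompt = prompt.replace("'", "''")
    answer = session.sql(
        "SELECT SNOWFLAKE.CORTEX.COMPLETE("
        f"'mistral-large', '{safe_prompt}') AS answer"
    ).collect()[0]["ANSWER"]

    st.session_state.history.append(("assistant", answer))
    with st.chat_message("assistant"):
        st.write(answer)
```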

