Snowflake Partnership Offers Access to Mistral AI Large Language Models

Snowflake has entered into a global partnership with Mistral AI, a provider of AI Large Language Model (LLM) solutions. The multi-year partnership, which includes a parallel investment in Mistral’s Series A from Snowflake Ventures, will give Snowflake customers access to Mistral AI’s newest and most powerful LLM, Mistral Large, which includes reasoning capabilities, is proficient in code and mathematics, and is fluent in five languages – French, English, German, Spanish and Italian. It can also process hundreds of pages of documents in a single call.

In addition, Snowflake customers can gain access to Mixtral 8x7B, Mistral AI’s open source model, and Mistral 7B, Mistral AI’s first foundation model optimised for low latency with a low memory requirement and high throughput for its size. The models are available to customers in public preview as a part of Snowflake Cortex, Snowflake’s fully managed LLM and vector search service that enables organisations to accelerate analytics and quickly build AI apps securely with their enterprise data.
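As an illustration of what that access could look like in practice, the minimal sketch below calls the Snowflake Cortex COMPLETE function against Mistral Large from Python via the Snowflake connector. The connection parameters are placeholders and the 'mistral-large' model identifier is an assumption; the model names exposed in a given account and region may differ during the public preview.

```python
# Minimal sketch: calling a Mistral AI model through the Snowflake Cortex
# COMPLETE function from Python. Connection details are placeholders and
# 'mistral-large' is an assumed model identifier for the preview.
import snowflake.connector

conn = snowflake.connector.connect(
    account="<account_identifier>",   # placeholder credentials
    user="<user>",
    password="<password>",
    warehouse="<warehouse>",
)

try:
    cur = conn.cursor()
    # SNOWFLAKE.CORTEX.COMPLETE(model, prompt) returns the model's text response.
    cur.execute(
        "SELECT SNOWFLAKE.CORTEX.COMPLETE('mistral-large', "
        "'Summarise the key terms of a multi-year partnership agreement.')"
    )
    print(cur.fetchone()[0])
finally:
    conn.close()
```

Because the call is just a SQL function, the same pattern works from a worksheet or any tool that can issue SQL against the account, keeping the model inside the governance boundary of the Data Cloud.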

“By partnering with Mistral AI, Snowflake is putting one of the most powerful LLMs on the market directly in the hands of customers, empowering every user to build cutting-edge, AI-powered apps with simplicity and scale,” says Sridhar Ramaswamy, CEO of Snowflake. “With Snowflake as the trusted data foundation, we’re transforming how enterprises harness the power of LLMs through Snowflake Cortex so they can cost-effectively address new AI use cases within the security and privacy boundaries of the Data Cloud.”

Arthur Mensch, CEO and co-founder of Mistral AI, adds: “With our models available in the Snowflake Data Cloud, we are able to further democratise AI so users can create more sophisticated AI apps that drive value at a global scale.”

At Snowday 2023, Snowflake first announced Snowflake Cortex support for industry-leading LLMs for specialised tasks such as sentiment analysis, translation, and summarisation, alongside foundation LLMs for use cases including retrieval-augmented generation, starting with Meta AI’s Llama 2 model. The company is continuing to invest in its generative AI efforts by partnering with Mistral AI and advancing the suite of foundation LLMs in Snowflake Cortex, giving organisations a path to bring generative AI to every part of the business.

To deliver a serverless experience that makes AI accessible to a broad set of users, Snowflake Cortex eliminates lengthy procurement cycles and the complex management of GPU infrastructure. Snowflake has partnered with NVIDIA to deliver a full-stack accelerated computing platform that leverages NVIDIA Triton Inference Server among other tools.

With Snowflake Cortex LLM functions now in public preview, Snowflake users can apply AI to their enterprise data across a wide range of use cases. Users with SQL skills can call on smaller LLMs to cost-effectively address specific tasks such as sentiment analysis, translation, and summarisation in seconds. For more complex use cases, Python developers can go from concept to full-stack AI apps such as chatbots in minutes, combining foundation LLMs in Snowflake Cortex, including Mistral AI’s models, with Streamlit in Snowflake chat elements, which will be in public preview soon.
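For the chatbot path described above, a minimal Streamlit in Snowflake sketch might look like the following. It assumes an active Snowpark session inside the app and routes each prompt to a Cortex LLM via SQL; the 'mistral-large' identifier and the single-turn prompt handling are simplifying assumptions rather than a reference implementation.

```python
# Minimal sketch of a Streamlit in Snowflake chatbot backed by a Cortex LLM.
# Assumes the app runs inside Snowflake so an active Snowpark session exists;
# 'mistral-large' is an assumed model identifier for the preview.
import streamlit as st
from snowflake.snowpark.context import get_active_session

session = get_active_session()

st.title("Cortex chatbot (sketch)")

if "history" not in st.session_state:
    st.session_state.history = []

# Replay prior turns so the conversation persists across Streamlit reruns.
for role, text in st.session_state.history:
    st.chat_message(role).write(text)

prompt = st.chat_input("Ask a question about your data")
if prompt:
    st.chat_message("user").write(prompt)
    # Escape single quotes before interpolating into the SQL literal.
    safe_prompt = prompt.replace("'", "''")
    reply = session.sql(
        f"SELECT SNOWFLAKE.CORTEX.COMPLETE('mistral-large', '{safe_prompt}')"
    ).collect()[0][0]
    st.chat_message("assistant").write(reply)
    st.session_state.history += [("user", prompt), ("assistant", reply)]
```

A production app would also pass prior turns back to the model as context and bind parameters rather than interpolating strings, but the sketch shows how little code sits between Streamlit's chat elements and the Cortex functions.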
