A-Team Insight Blogs

Snowflake Cortex Simplifies Route to Deriving Value from Generative AI

Snowflake has unveiled Snowflake Cortex, an innovative managed service designed to simplify how organisations derive value from generative AI.

The service provides access to large language models (LLMs), task-specific AI models, and vector search functionality in the Snowflake Data Cloud. It includes serverless functions that help users accelerate analytics and build contextualised LLM-powered apps within minutes, as well as LLM-powered experiences that drive productivity across the Data Cloud.

“Snowflake is providing enterprises with the data foundation and cutting-edge AI building blocks they need to create powerful AI and machine learning apps while keeping their data safe and governed,” says Sridhar Ramaswamy, senior vice president of AI at Snowflake. “With Snowflake Cortex, businesses can tap into the power of large language models in seconds, build custom LLM-powered apps within minutes, and maintain flexibility and control over their data, while reimagining how users tap into generative AI to deliver business value.”

The serverless functions provide instant access to LLMs such as Meta AI’s Llama 2 model, task-specific models, and advanced vector search functionality. They are invoked through a function call in SQL or Python code, allowing users of all skill levels to quickly analyse data or build AI apps that run on the Snowflake Cortex infrastructure.
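
To make the function-call pattern concrete, the minimal sketch below shows how one of these serverless functions might be invoked from Python using the standard Snowflake connector. The function name SNOWFLAKE.CORTEX.COMPLETE, the ‘llama2-70b-chat’ model identifier and the connection placeholders are illustrative assumptions rather than details confirmed in this article; the exact names available will depend on the account and documentation.

# Minimal sketch: calling a Cortex LLM function from Python via the
# Snowflake connector. The function name (SNOWFLAKE.CORTEX.COMPLETE) and
# model identifier ('llama2-70b-chat') are assumptions for illustration;
# connection details are placeholders.
import snowflake.connector

conn = snowflake.connector.connect(
    account="<account_identifier>",
    user="<user>",
    password="<password>",
    warehouse="<warehouse>",
)

try:
    cur = conn.cursor()
    # The serverless function is called like any other SQL function, so no
    # model hosting or infrastructure management is needed on the user's side.
    cur.execute(
        "SELECT SNOWFLAKE.CORTEX.COMPLETE(%s, %s)",
        ("llama2-70b-chat",
         "Summarise the key risks mentioned in this earnings call transcript."),
    )
    print(cur.fetchone()[0])
finally:
    conn.close()

An equivalent call can be made directly in a Snowflake worksheet as plain SQL, which is what makes the functions accessible to analysts who do not write Python.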

Snowflake has also built three LLM-powered experiences using Snowflake Cortex to enhance user productivity. Snowflake Copilot is an LLM-powered assistant that brings generative AI to everyday Snowflake coding tasks through natural language, allowing users to ask questions of their data in plain text, write SQL queries against relevant data sets, refine queries and filter insights.

Universal Search is LLM-powered search functionality that enables users to quickly find, and start getting value from, the most relevant data and apps for their use cases. Document AI uses LLMs to extract content such as contractual terms from documents and to fine-tune results using a visual interface and natural language.

Snowflake showcased these and other new solutions at its virtual Snowday today.
