Snowflake is putting large language models to work on complex data questions. Today, the company announced that Cortex Analyst, its new agentic AI system for self-service analytics, is entering public preview.
First announced at the company’s Data Cloud Summit in June, Cortex Analyst is a fully managed service that gives businesses a conversational interface to their data. Users simply ask business questions in plain English, and the agentic AI system handles the rest, from converting the prompt into SQL and querying the data to running checks and returning the answer.
Snowflake’s head of AI, Baris Gultekin, tells VentureBeat that the offering uses multiple large language model (LLM) agents working in tandem to deliver insights with roughly 90% accuracy. He claims this is far better than existing LLM-powered text-to-SQL offerings, including Databricks’, and can accelerate analytics workflows by giving business users instant access to the insights they need to make critical decisions.
Simplifying analytics with Cortex Analyst
Even as enterprises double down on AI-powered generation and forecasting, data analytics continues to play a transformative role in business success. Organizations extract valuable insights from historical structured data, organized in tables, to make decisions across domains such as marketing and sales.
Today, however, the analytics ecosystem is largely driven by business intelligence (BI) dashboards that use charts, graphs and maps to visualize data. The approach works well but can be rigid: users struggle to drill deeper into specific metrics and depend on often-overwhelmed analysts for follow-up insights.
“When you have a dashboard and you see something wrong, you immediately follow with three different questions to understand what’s happening. When you ask these questions, an analyst will come in, do the analysis and deliver the answer within a week or so. But, then, you may have more follow-up questions, which may keep the analytics loop open and slow down the decision-making process,” Gultekin said.
To close this gap, many teams began exploring large language models, which have proven adept at unlocking insights from unstructured data (think long PDFs). The idea was to pass the raw schema of structured data through the models so they could power a text-to-SQL conversational experience, letting users instantly talk to their data and ask relevant business questions.
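As a rough illustration of that earlier approach (and not any vendor’s actual implementation), a naive text-to-SQL setup stuffs the raw schema into a single prompt and asks a general-purpose model to emit SQL. The sketch below assumes the OpenAI Python SDK and a made-up two-table schema with cryptic column names.

```python
# Illustrative sketch of the naive single-prompt text-to-SQL approach
# described above. The schema and model choice are placeholders.
from openai import OpenAI  # assumes the OpenAI Python SDK is installed

RAW_SCHEMA = """
CREATE TABLE orders (order_id INT, cust_id INT, rev_1 FLOAT, rev_2 FLOAT, order_dt DATE);
CREATE TABLE customers (cust_id INT, region VARCHAR, segment VARCHAR);
"""

def naive_text_to_sql(question: str) -> str:
    """Ask a general-purpose LLM to translate a business question into SQL,
    given nothing but the raw schema -- the model has to guess what columns
    like rev_1 and rev_2 actually mean."""
    client = OpenAI()
    response = client.chat.completions.create(
        model="gpt-4o",
        messages=[
            {"role": "system", "content": f"Write a SQL query for this schema:\n{RAW_SCHEMA}"},
            {"role": "user", "content": question},
        ],
    )
    return response.choices[0].message.content

print(naive_text_to_sql("What was revenue by region last quarter?"))
```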
However, as these LLM-powered offerings appeared, Snowflake noted one major problem: low accuracy. According to the company’s internal benchmarks, which are representative of real-world use cases, using state-of-the-art models like GPT-4o directly yielded about 51% accuracy on analytical questions, while dedicated text-to-SQL services, including Databricks’ Genie, reached about 79%.
“When you’re asking business questions, accuracy is the most important thing. Fifty-one percent accuracy is not acceptable. We were able to almost double that to about 90% by tapping a series of large language models working closely together (for Cortex Analyst),” Gultekin noted.
When integrated into an enterprise application, Cortex Analyst takes in business queries in natural language and passes them through LLM agents at different stages to produce accurate, hallucination-free answers grounded in the enterprise’s data in the Snowflake data cloud. These agents handle different tasks, from analyzing the intent of the question and determining whether it can be answered, to generating and running the SQL query and checking the correctness of the answer before it is returned to the user.
“We’ve built systems that understand if the question is something that can be answered or ambiguous and cannot be answered with accessible data. If the question is ambiguous, we ask the user to restate and provide suggestions. Only after we know the question can be answered by the large language model, we pass it ahead to a series of LLMs, agentic models that generate SQL, reason about whether that SQL is correct, fix the incorrect SQL and then run that SQL to deliver the answer,” Gultekin explains.
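Snowflake has not published the internals of that pipeline, but the flow Gultekin describes can be sketched in broad strokes. The Python outline below is purely illustrative: every stage is a trivial stand-in for what would, in a real system, be an LLM or warehouse call, and none of the names are Cortex Analyst’s.

```python
# Hypothetical sketch of a multi-stage text-to-SQL pipeline of the kind
# described above. Every stage is a stand-in for an LLM or warehouse call;
# these are NOT Cortex Analyst internals.
from dataclasses import dataclass

@dataclass
class AnalystResponse:
    answered: bool
    message: str
    sql: str | None = None

def classify_intent(question: str, semantic_model: dict) -> str:
    """Stand-in for an LLM that decides whether the question is answerable."""
    known_terms = set(semantic_model["measures"]) | set(semantic_model["dimensions"])
    return "clear" if any(t in question.lower() for t in known_terms) else "ambiguous"

def generate_sql(question: str, semantic_model: dict) -> str:
    """Stand-in for an LLM that drafts SQL grounded in the semantic model."""
    return f"SELECT region, SUM(revenue) FROM {semantic_model['table']} GROUP BY region"

def review_and_fix_sql(sql: str) -> str:
    """Stand-in for a second model that checks and repairs the draft SQL."""
    return sql if sql.strip().upper().startswith("SELECT") else "SELECT 1"

def run_query(sql: str) -> list[tuple]:
    """Stand-in for executing the query against the warehouse."""
    return [("EMEA", 1200.0), ("AMER", 3400.0)]

def answer_question(question: str, semantic_model: dict) -> AnalystResponse:
    # 1. Intent check: ask the user to restate ambiguous questions.
    if classify_intent(question, semantic_model) == "ambiguous":
        return AnalystResponse(False, "Please restate the question; try asking about "
                               + ", ".join(semantic_model["measures"]))
    # 2-3. Draft SQL, then have a reviewer model verify and repair it.
    sql = review_and_fix_sql(generate_sql(question, semantic_model))
    # 4. Run the query and return a grounded answer.
    rows = run_query(sql)
    return AnalystResponse(True, f"Result: {rows}", sql)

if __name__ == "__main__":
    model = {"table": "sales", "measures": ["revenue"], "dimensions": ["region"]}
    print(answer_question("What is revenue by region?", model))
```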
The AI head did not share specifics of the models powering Cortex Analyst, but Snowflake has confirmed it is using a combination of its own Arctic model and models from Mistral and Meta.
How exactly does it work?
To ensure the LLM agents behind Cortex Analyst understand the complete schema of a user’s data and provide accurate, context-aware responses, the company requires customers to provide semantic descriptions of their data assets during setup. This addresses a major shortcoming of raw schemas and enables the models to capture the intent of the question, including the user’s vocabulary and specific jargon.
“In real-world applications, you have tens of thousands of tables and hundreds of thousands of columns with strange names. For example, ‘Rev 1 and Rev 2’ could be iterations of what might mean revenue. Our customers can specify these metrics and their meaning in the semantic descriptions, enabling the system to use them when providing answers,” Gultekin added.
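Those semantic descriptions are supplied to Cortex Analyst as a semantic model file during setup. The snippet below is only a conceptual illustration, written as a Python dictionary with made-up table and column names, of the kind of mapping Gultekin describes: cryptic columns annotated with business meaning and synonyms. It is not Snowflake’s actual semantic model format.

```python
# Conceptual illustration (hypothetical names, not Snowflake's semantic
# model format) of the metadata that lets the system resolve jargon like
# "Rev 1" / "Rev 2" into well-defined measures.
SEMANTIC_DESCRIPTIONS = {
    "table": "FINANCE.SALES.ORDERS",
    "measures": {
        "rev_1": {
            "description": "Gross revenue before returns and discounts, in USD.",
            "synonyms": ["revenue", "gross revenue", "Rev 1"],
        },
        "rev_2": {
            "description": "Net revenue after returns and discounts, in USD.",
            "synonyms": ["net revenue", "Rev 2"],
        },
    },
    "dimensions": {
        "order_dt": {"description": "Date the order was placed.", "synonyms": ["order date"]},
        "region": {"description": "Sales region of the customer.", "synonyms": ["geo", "market"]},
    },
}
```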
For now, the company is providing access to Cortex Analyst as a REST API that can be integrated into any application, giving developers the flexibility to tailor how and where their business users tap the service and interact with the results. There is also the option of using Streamlit to build dedicated apps with Cortex Analyst as the central engine.
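As a rough sketch of that integration path, a backend can send the user’s question to the Cortex Analyst endpoint together with a pointer to a semantic model file staged in Snowflake. The account URL, token and stage path below are placeholders, and the endpoint and payload shape reflect Snowflake’s public-preview documentation at the time of writing, so treat the details as subject to change.

```python
# Minimal sketch of calling the Cortex Analyst REST API from a backend.
# Account URL, auth token and semantic model path are placeholders; the
# endpoint and payload follow Snowflake's public-preview docs at the time
# of writing and may change.
import requests

ACCOUNT_URL = "https://<account_identifier>.snowflakecomputing.com"  # placeholder
AUTH_TOKEN = "<oauth_or_keypair_jwt_token>"                           # placeholder

def ask_cortex_analyst(question: str) -> dict:
    resp = requests.post(
        f"{ACCOUNT_URL}/api/v2/cortex/analyst/message",
        headers={
            "Authorization": f"Bearer {AUTH_TOKEN}",
            "Content-Type": "application/json",
        },
        json={
            "messages": [
                {"role": "user", "content": [{"type": "text", "text": question}]}
            ],
            # Semantic model YAML uploaded to a Snowflake stage (placeholder path)
            "semantic_model_file": "@MY_DB.MY_SCHEMA.MY_STAGE/revenue_model.yaml",
        },
        timeout=60,
    )
    resp.raise_for_status()
    return resp.json()  # response includes the model's interpretation and generated SQL

print(ask_cortex_analyst("What was net revenue by region last quarter?"))
```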
In the private preview, some 40-50 enterprises, including pharmaceutical giant Bayer, deployed Cortex Analyst to talk to their data and accelerate analytics workflows. The public preview is expected to grow that number, especially as enterprises look to adopt LLMs without breaking the bank. The service gives companies the power of LLMs for analytics without the implementation hassle and cost overhead of building such a system themselves.
Snowflake also confirmed the service will gain more features in the coming days, including support for multi-turn conversations for a more interactive experience, as well as for more complex tables and schemas.