Raw models hallucinate, yet the AI industry won't talk about it: Snowflake CEO Ramaswamy



As AI reshapes industries, Snowflake CEO Sridhar Ramaswamy outlines the path forward in a changing SaaS and data ecosystem.
Credits: Padmini B

The world of data and artificial intelligence is at a turning point, with transformative disruptions redefining the way businesses operate and innovate. Snowflake CEO Sridhar Ramaswamy offers a compelling perspective on these shifts, emphasising the interplay of AI's revolutionary potential, the rise of interoperable data formats, and the evolving SaaS landscape. In an interview with Fortune India, Ramaswamy highlights the broader implications of these changes, from the economic factors influencing SaaS valuations to the role of AI in creating unprecedented efficiencies for businesses. Here are some edited excerpts from the interaction.

How do you see the transition with AI playing out? There’s still a lot of doubt about whether it will become a significant revenue driver. What’s the value proposition? Post-pandemic, SaaS multiples rose dramatically but have since come down. So, in that context, is AI an enhancer or a leveller?  


You’ve packed a tumultuous decade of questions into one, so let’s unpack it. First, let’s address SaaS multiples. Many SaaS multiples were high because we lived in what’s known as a zero-interest-rate policy (ZIRP) environment. SaaS companies were seen as reliable growth leaders with subscription models that were less likely to face sudden disruptions. That stability drove high valuations. 

However, today’s environment is very different. A 5% interest rate is vastly different from 0%, leading to multiple compression. While SaaS multiples are somewhat recovering, there are lingering questions about their sustainability in this new economic context. 

Now, regarding data companies and AI: I believe they’re facing two significant disruptions, not just one. The first is the rise of truly interoperable data formats. It might sound arcane, but it’s a game-changer. Think back to the VHS-Betamax wars. Today, those format wars in data storage have largely been resolved, with the industry coalescing around a format like Iceberg. The details aren’t as important as the broader point — there’s now industry-wide agreement on how data should be stored. This is akin to a common language for data, and it’s driving substantial changes in the data ecosystem. 

At Snowflake, we used to operate with a proprietary data format. Now, many of our customers demand interoperable formats, which represents both an opportunity and a disruption. While we can no longer rely on exclusive formats as a competitive edge, this shift allows us to tap into the vast amounts of data in cloud storage that aren’t already within Snowflake. It’s an equaliser, but one we view as a significant opportunity to lean into and evolve. 

The second disruption is AI itself. We’ve reached a point where AI has already demonstrated a clear "before and after" impact, much like the internet or mobile revolutions. There will always be cycles of hype and scepticism, but the simplest examples show how transformative AI already is. 

For instance, we work with some of the largest insurance companies, some even from India. Five or six years ago, the idea of digitising enormous volumes of claims, invoices, or PDFs into structured data was painfully daunting — bespoke and inefficient. Today, AI makes that possible in a way that’s effective, even if not perfect. AI can index large datasets and enable queries with unprecedented ease. 

Let’s consider a simpler example: sentiment detection. If you, as Fortune, receive extensive customer feedback, AI can now quickly flag if there’s a sudden surge of dissatisfaction. Tasks like these, which once required custom systems, are now solvable with existing AI tools. The improvement in AI capabilities is night and day — compare ChatGPT’s ability to transcribe and translate spoken language, in English or any other language, with tools like Siri. 

I believe it will take five to seven years for AI to fully integrate into existing software systems, but the shift is undeniable. One major change is how AI converts fluid language into structured data. Imagine no longer having to type out search queries, book tickets, or estimate travel times manually — AI can automate all of that. 

But can AI do everything?  

Absolutely not. We don’t know the limits. And is it perfect? It’s not. Raw models are very prone to hallucination, and yet the AI industry won’t talk about this. They won’t tell you; they won’t even help you figure out when it’s hallucinating. This is part of the frustration: you use ChatGPT, and while many answers are perfect, quite a few are completely off. But it becomes your problem. 

These are precisely the issues we’re trying to avoid with Snowflake. When we build products, we bring an engineer’s mindset. How do you demonstrate that the product is doing the right thing? What proof can you provide? How do you design systems that improve over time? 

I believe that as people learn how to properly engineer AI, it will become more and more broadly applicable. But what AI can already do today is nothing short of remarkable. 

Like you said, generative AI hallucinates — you know that going in. But in data, if that happens, it’s a big, big problem. 

That’s why it’s critical to design robust software systems. 

So, it’s a cost, right? And when you offer that to a customer, the question is whether they’re willing to pay top dollar on top of what you’re already charging them. 

I see it differently — it’s another capability, not something that requires charging top dollar. Snowflake operates on a consumption model. We don’t sell individual licenses. Unlike companies with subscription-based products that may charge 30% more for AI features, forcing customers to make that choice, we take a different approach. 

What I tell our customers is this: we’ve developed great tools and engineered them to be useful. Please only pursue projects that create value for you. If something is useful but isn’t delivering value, come talk to us — we don’t want you to deploy it. This way, the conversation becomes much more rational. 

For example, I tell customers that developing a chatbot on Snowflake, with a reasonable amount of data, might cost ₹2,000 to ₹10,000. It’s not much to create a prototype. Then, based on the results, you can decide whether to scale it for larger deployments. We aim to make the process of creating and deploying projects as rational and value-driven as possible. 

Do you foresee more AI-related acquisitions? 

We are always open to acquisitions. Historically, we haven’t pursued large revenue-driven acquisitions but… 

…for capability? 

Yes, exactly. Revenue acquisitions are tricky for several structural reasons. First, if a company is already generating significant revenue, you don’t want it to lower your overall growth rate. Finding companies that are both generating meaningful revenue — say, over $100 million — and growing at over 30% is challenging. That’s a high hurdle to clear. 

Second, there’s the issue of justifying valuation multiples. Venture capitalists often value startups at 40–50 times forward revenue. Meanwhile, Snowflake is typically valued at 11–12 times forward revenue. Bridging that gap is difficult unless there's a clear “one plus one equals four” story. These structural challenges make large acquisitions complicated. 

That said, we remain open-minded. I believe open data formats and AI will drive consolidation in what is currently a fragmented data industry. Right now, there are specialists in every layer: ingestion tools like Redpanda and Confluent, data engineering platforms like Databricks, and players like Cloudera and Trino in other spaces. Beyond that, there are specialists in analytics, catalogues, and semantic layers. 

This fragmentation creates inefficiencies and customer challenges, which is why I see consolidation as inevitable. It will likely open up some very interesting M&A opportunities soon. 

When do you expect the consolidation to eventually play out?  

I think it'll happen in the next three to four years. Having a best-of-breed in every layer is problematic. I've talked to many of our customers. For example, there's a bank that's spending a billion-plus on data every year, and they tell me $700 million of that goes mostly to people stitching together various tools to make things work; only $300 million goes into software licences. Their challenge was: listen, why don't you make this easier, so we're not spending so much time stitching things together? So, I do think there's a consolidation coming. Also, even the changes in administration in the US make the environment very different before and after.
