As companies move from experimenting with AI to using it at scale, 2026 will be about reliable workflows, AI literacy as a baseline job requirement, and human judgment becoming more valuable than pure technical skill

In 2025, enterprises focussed on driving ROI from their AI investments, and agentic AI played a pivotal role in accelerating innovation. The year ahead will see enterprises embed AI across their workstreams and empower their employees with AI tools that assist and augment their work.
The ‘Wild West’ of agents ends; the era of fixed workflows begins
While 2025 was defined by excitement for agentic AI, 2026 will be defined by control and reliability. We are moving towards ‘fixed workflows,’ where AI is given a strict, governed path for critical operations. This is important because the real world has different kinds of consequences for mistakes. My team helped me assemble emails for Davos, and in one, I got a company name wrong. Super embarrassing, but I could apologise. But, if you’re using an agent for MRI screening for cancer, the consequence is vastly worse. The metric for success will shift from the novelty of an agent’s autonomy to the reliability of its output in high-stakes environments, with human-in-the-loop approvals and fail-safes built in.
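To make the idea of a governed, fixed workflow concrete, here is a minimal illustrative sketch of routing every agent-proposed action through an approval gate before anything irreversible happens. It is not any particular product’s API; the class and function names are hypothetical.

```python
# Illustrative sketch of a "fixed workflow" with a human-in-the-loop gate.
# All names here are hypothetical; this is not any specific vendor's API.
from dataclasses import dataclass
from enum import Enum, auto


class Risk(Enum):
    LOW = auto()   # e.g. drafting an internal summary
    HIGH = auto()  # e.g. sending to a customer, clinical use


@dataclass
class ProposedAction:
    description: str
    risk: Risk
    payload: dict


def human_approves(action: ProposedAction) -> bool:
    """Fail-safe: a person must explicitly confirm high-stakes actions."""
    answer = input(f"Approve '{action.description}'? [y/N] ")
    return answer.strip().lower() == "y"


def run_fixed_workflow(actions: list[ProposedAction]) -> None:
    """Execute agent-proposed actions along a strict, predefined path.

    Low-risk steps run automatically; high-risk steps stop and wait for
    a human decision instead of letting the agent act autonomously.
    """
    for action in actions:
        if action.risk is Risk.HIGH and not human_approves(action):
            print(f"Blocked: {action.description}")
            continue  # fail safe: skip rather than guess
        print(f"Executing: {action.description}")
        # ... the actual side effect (send email, file ticket, etc.) goes here
```

The point of the sketch is the shape of the control flow: the agent proposes, the workflow decides, and the high-stakes branch cannot proceed without a human saying yes.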
The death of the ‘query writer’ and the rise of the ‘semantic architect’
The traditional role of the data analyst, spending hours writing mundane SQL queries, is becoming obsolete. By 2026, the highest-value work for analysts will shift entirely to defining data semantics: setting the definitions and context that allow AI agents to serve the rest of the organization 24/7. I saw this firsthand at TS Imagine, an asset manager, where the CIO told us that instead of three people working 9-to-5, he now has an intelligence system available around the clock. His analysts can focus on higher-value, open-ended analysis they would never have had time to do before. This evolution turns the analyst from a bottleneck into an enabler, building the semantic infrastructure that lets non-technical employees answer complex questions without learning to code.
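As a rough illustration of what ‘defining data semantics’ can mean in practice, the sketch below shows a tiny, hypothetical semantic layer: the analyst encodes business definitions once, and an agent or a non-technical user asks questions against those names instead of writing SQL. The table names, metric definitions, and the generate_sql helper are all invented for illustration, not drawn from any specific product.

```python
# Hypothetical example of a semantic layer an analyst might define so that
# AI agents and non-technical users never write raw SQL themselves.
SEMANTIC_MODEL = {
    "tables": {
        "orders": "analytics.fct_orders",  # physical table (example name)
    },
    "metrics": {
        # One agreed-upon definition of "revenue" for the whole company.
        "revenue": "SUM(orders.net_amount)",
        "active_customers": "COUNT(DISTINCT orders.customer_id)",
    },
    "dimensions": {
        "month": "DATE_TRUNC('month', orders.order_date)",
        "region": "orders.region",
    },
}


def generate_sql(metric: str, by: str) -> str:
    """Translate a business question ('revenue by region') into SQL
    using the analyst-defined semantics above."""
    model = SEMANTIC_MODEL
    return (
        f"SELECT {model['dimensions'][by]} AS {by}, "
        f"{model['metrics'][metric]} AS {metric} "
        f"FROM {model['tables']['orders']} AS orders "
        f"GROUP BY 1"
    )


print(generate_sql("revenue", by="region"))
# SELECT orders.region AS region, SUM(orders.net_amount) AS revenue
# FROM analytics.fct_orders AS orders GROUP BY 1
```

The analyst’s value sits in the SEMANTIC_MODEL definitions, which encode business meaning once and are reused by every downstream question, rather than in writing each query by hand.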
AI literacy will become a binary job requirement, not a nice-to-have
By 2026, using AI tools will be as fundamental as using a computer. If you are a solutions engineer and cannot use a coding agent to deploy a demo faster, I would question your role. Increasingly, knowing how to use AI tools is a job requirement, not a matter of subtle feedback. The difference between somebody who rapidly adopts these tools and someone who doesn’t want to deal with the change becomes very stark, and I think those are the kinds of gaps that will be hard to navigate as they ripple through the workplace.
Meeting prep will compress from hours to 90 seconds
Pretty much before every customer meeting, you need information: What’s the history of the account? What recent use cases are we working on together? What recent meetings did we have? Are there any outstanding support cases I should know about? All of this used to take hours of human effort to write up as a two-page brief, and I’d have to read through and remember 20 of them before going to a conference. Now, with an AI tool, it takes literally a minute and a half to pull all of this together, and I can look it up in real time.
‘Taste’ will replace technical proficiency as the most valuable skill
We look for software engineers and solutions engineers who are comfortable having AI tools assist them in making progress while simultaneously demonstrating taste. The thing that AI coding agents, for example, don’t yet give you is taste: a sense of how you should structure your code, or how a particular architecture fits together. It is that combination of wisdom and proficiency that we most look for. This is the shift happening across the industry: technical capability will be less about syntax and more about high-level intellectual intent. Can you make good architectural decisions? Do you understand when code is elegant versus merely functional? AI can write the code, but it can’t yet tell you whether you’re building the right thing in the right way. That judgment, that taste, is what will separate great engineers from adequate ones in 2026.
(The author is Chief Executive Officer, Snowflake. Views are personal.)