Why India’s AI infrastructure strategy must look beyond data centers



Artificial Intelligence (AI) has been reshaping how we live, work, and solve some of India’s most pressing issues – from boosting agricultural productivity and improving healthcare diagnostics to enabling financial inclusion in rural areas. According to NASSCOM, India’s AI market is projected to grow at an annualized rate of 25-35 percent from 2024 to 2027, reaching around US$17 billion by 2027. The Indian government has also approved approximately US$1.25 billion in 2024 to invest in AI projects, including the development of compute infrastructure and large language models.

Amidst all the growth projections and investments, the center of attention has been the expansion of data centers. Notably, investment commitments in the data center industry in India are expected to exceed US$100 billion by 2027, according to CBRE. Since the ChatGPT boom more than two years ago, new data center developments in India seem to be announced almost every few months.

While there is no doubt that data centers form the backbone of India’s AI infrastructure ecosystem, it is important to note that they represent only part of the nation’s broader AI infrastructure strategy. If India’s aim is for AI to be a catalyst for inclusive and sustainable growth, the country must pursue a holistic AI infrastructure strategy that encompasses not only data centers but also Personal Computers (PCs) and edge devices as gateways to distribute AI compute.


The whole is greater than the sum of its parts – why data centers are not everything

Consider a farmer in a remote village in India. An AI farming application (app) on a smartphone promises quick access to advice on crop management and pest control, tailored to specific locations and weather conditions. In theory, when working properly, this app should help produce better yields and reduce costs through data-driven recommendations.

However, the effectiveness of most of these apps is limited in areas with poor or unreliable internet connectivity, which is common in many rural regions of India. Users may experience delays or be unable to access real-time advice if they cannot connect to the cloud-based systems that the apps are built on.

And even when there is connectivity, data must travel from where it is generated – say, a village in Himachal Pradesh – to where the data center is located for processing, often in bigger cities such as Hyderabad. That round trip means slower response times, more energy consumption, and increased cybersecurity risks when running the app.

The solution to supporting India’s AI ambition and infrastructure build-out is not simply building more data centers. More data centers can mean soaring electricity consumption and serious environmental impact, given India’s reliance on fossil fuels. A recent Deloitte report estimates that India will require an additional 40 to 45 TWh of electricity by 2030 to support AI-driven data centers, reflecting the growing power demands of the country’s expanding digital infrastructure.

In short, consider just these three things: first, people in rural areas who may not have reliable connectivity; second, the limits on real-time responsiveness when data must travel to a data center and back even when connectivity is present; third, the environmental impact and cost of running AI applications predominantly via data centers. To make AI truly inclusive and sustainable, India needs a distributed compute approach – a balance between data centers, AI-capable PCs, and edge devices that can run AI applications locally.

Why AI PCs and edge devices must be part of India’s AI infrastructure strategy

Running AI apps locally, be it on AI PCs or edge devices (e.g., Internet of Things devices in factories, vehicles, farms, etc.), can be faster, consume less energy, and be better suited than data centers for environments where connectivity is poor or power supply is constrained.

Consider again the AI farming app mentioned earlier, but this time the app is powered locally on a smartphone or tablet with built-in AI chips, and can be used without internet connectivity.

Without the need for high-speed internet or a centralized data center to power the app, the AI farming app can be used anywhere and anytime relatively securely, and it can also be more affordable to run. For the farmer, it means the device processes only the data needed, consuming less energy by doing less compute, and reducing security concerns because the data never leaves the device. For the business that operates the app, less reliance on the data center can mean lower running costs, plus the added benefit of a better user experience.

While AI PCs and edge devices offer many similar benefits, they are used differently. AI PCs are powerful, general-purpose devices best used for local, high-performance AI tasks and development. Edge devices, such as those in smart factories or self-driving vehicles, are more specialized, distributed, and optimized for real-time, low-power AI inference at the data source, and often must operate in space-constrained and challenging environments (e.g., high or low temperatures, dust).

In a distributed compute approach for a nation, data centers serve as centralized hubs for training large AI models, storing vast datasets, and running high-complexity analytics. They continue to serve as the backbone of national and enterprise-level AI workloads. AI PCs equipped with Neural Processing Units (NPUs) are used by professionals, researchers, and advanced users for on-device inference, development, and some training. Edge devices in the periphery (e.g., smartphones, devices in factories, farms, etc.) handle real-time inference and automation.

Becoming a leader in AI

Many countries around the world are now racing to become leaders in AI, and India has a real chance to become one, as it has the talent, strong government support, and a conducive business environment.

As we continue this critical path of expanding the nation’s AI infrastructure, it is pivotal that we ensure the foundation that we build is accessible and sustainable for all. A distributed compute approach to powering AI for the nation is what we need, not a one-sided reliance on data centers.

Now that more and more people understand AI, and how our lives will increasingly be powered by it, it is time to shift the conversation beyond data centers.
