Baskar Ceri, Managing Director, NI India, Emerson, talks about AI trade-offs.
At COP30, global leaders reiterated a central theme: climate goals cannot be met unless emerging technologies, AI included, operate within sustainable energy boundaries. As discussions increasingly highlighted the carbon impact of digital infrastructure, AI surfaced as both a powerful enabler and a growing energy consumer.
Artificial Intelligence is widely regarded as a catalyst for efficiency, accelerating decision-making, automating workflows, improving productivity, and opening new frontiers for innovation. Yet behind this momentum lies a rapidly growing concern: energy consumption. With AI models expanding in scale and adoption rising across public infrastructure and enterprises, the question raised by climate negotiators and technology policymakers alike becomes even more urgent: do the performance gains justify the environmental cost?
The message from COP30 is clear: a sustainable AI future requires transparency, measurement, and accountability. It demands a rigorous, engineering-led testing approach – an area where India’s electronics and test & measurement ecosystem is rapidly strengthening. It requires visibility into energy, performance, and accuracy trade-offs at every layer of the AI stack, ensuring that AI innovation aligns with global climate commitments rather than undermining them.
The Sustainability Challenge Behind AI’s Rise
India is emerging as one of the fastest-growing AI markets in the world, but this growth is accompanied by substantial pressure on energy and infrastructure.
AI-led digital expansion is expected to drive an additional 500 MW of data-centre capacity in India over the next four years, according to a study by Avendus Capital. Globally, the situation is no different. A report by McKinsey estimates that data-centre capacity demand will grow by 19–22% per year through 2030, reaching 171–219 GW, up from roughly 60 GW today. To close this gap, the world will need to build twice the total data-centre capacity created since 2000—within just a quarter of the time.
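Those projections are easy to sanity-check by compounding the reported growth rates. The sketch below assumes a ~60 GW baseline in 2024; the baseline year and starting figure are illustrative assumptions, which is why the upper bound lands a little under the reported 219 GW.

```python
# Back-of-the-envelope check of the reported McKinsey projection:
# compound an assumed ~60 GW 2024 baseline at 19-22% per year to 2030.
# (Baseline year and starting capacity are illustrative assumptions.)

BASELINE_GW = 60
YEARS = 2030 - 2024  # six years of compounding

for cagr in (0.19, 0.22):
    projected = BASELINE_GW * (1 + cagr) ** YEARS
    print(f"{cagr:.0%} CAGR -> ~{projected:.0f} GW by 2030")

# 19% CAGR -> ~170 GW by 2030
# 22% CAGR -> ~198 GW by 2030
# Broadly consistent with the reported 171-219 GW range; the upper
# bound implies a slightly earlier baseline or longer window.
```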
In India, the implications are already visible. The IEEFA reports that data centres currently consume 0.5% of the country’s total electricity, a share that could rise to 3% by 2030. Meanwhile, Mercom India notes that as of June 2023, data centres across India consumed 139 billion kWh annually, growing at 4.4% per year.
Despite holding nearly 20% of the world’s data, India has only 3% of global data-centre capacity, according to Deloitte. Yet its AI market is projected to reach USD 20–22 billion by 2027, at a rapid 30% CAGR, highlighting the urgent need for sustainable scaling.
Why Sustainability Must Be Built Into AI From Day One
A persistent misconception is that the energy cost of AI is dominated by training large models. In practical deployments, however, it is inference, the process of generating outputs, that consumes the majority of energy. Training happens once; inference happens millions of times, across devices, users, and queries.
This makes model choice extremely important. Research shows that smaller or task-specific models can often match the performance of large general-purpose models while consuming significantly less energy. Selecting the “right-sized” model is therefore not only an engineering decision but an environmental one. Every millisecond saved during inference compounds into large-scale energy savings as applications scale.
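To make that compounding concrete, here is a minimal sketch of how per-inference energy differences scale across a high-traffic deployment. The per-query energy figures and query volume are purely illustrative assumptions, not measurements of any real model.

```python
# Illustrative only: how per-inference energy differences compound at scale.
# The per-query energy figures below are assumptions, not measurements.

WH_PER_QUERY_LARGE = 3.0   # hypothetical large general-purpose model
WH_PER_QUERY_SMALL = 0.3   # hypothetical right-sized task-specific model
QUERIES_PER_DAY = 10_000_000

def annual_mwh(wh_per_query: float, queries_per_day: int) -> float:
    """Annual energy in MWh for a given per-query cost and daily volume."""
    return wh_per_query * queries_per_day * 365 / 1_000_000

large = annual_mwh(WH_PER_QUERY_LARGE, QUERIES_PER_DAY)
small = annual_mwh(WH_PER_QUERY_SMALL, QUERIES_PER_DAY)
print(f"Large model: ~{large:,.0f} MWh/year")   # ~10,950 MWh/year
print(f"Small model: ~{small:,.0f} MWh/year")   # ~1,095 MWh/year
print(f"Saving:      ~{large - small:,.0f} MWh/year")
```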
Testing and Validation: The Cornerstones of Sustainable AI
Building sustainable AI requires visibility into how systems perform under real-world constraints. Testing helps uncover the trade-offs hidden across the full technology stack—power draw under load, thermal behavior, latency, and hardware interaction.
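As a sketch of what such testing could look like in code, the harness below pairs latency measurement with power readings to estimate energy per inference. The run_inference and read_power_watts callables are hypothetical stand-ins for the system under test and the lab's power instrumentation; a real setup would also capture thermal data and load profiles.

```python
import time
import statistics

# Minimal inference-benchmark sketch: pair latency with power draw to
# estimate energy per inference. run_inference() and read_power_watts()
# are hypothetical stand-ins for the model under test and the lab's
# power instrumentation.

def profile(run_inference, read_power_watts, n_runs: int = 100) -> dict:
    latencies, powers = [], []
    for _ in range(n_runs):
        p_before = read_power_watts()
        start = time.perf_counter()
        run_inference()
        latencies.append(time.perf_counter() - start)
        p_after = read_power_watts()
        powers.append((p_before + p_after) / 2)  # crude average draw

    mean_latency = statistics.mean(latencies)
    mean_power = statistics.mean(powers)
    # Energy per inference (joules) ~ average power x time under load.
    return {
        "p50_latency_s": statistics.median(latencies),
        "mean_power_w": mean_power,
        "joules_per_inference": mean_power * mean_latency,
    }
```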
Advanced optimisation techniques show promise but must be rigorously validated. For example, inference-time model switching allows systems to transition to lighter models when full accuracy is not essential, reducing power consumption without compromising user experience. Similarly, federated learning, which keeps data on local devices rather than transmitting it centrally, can significantly reduce communication-related energy.
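A minimal sketch of inference-time model switching might look like the following, assuming a hypothetical needs_full_accuracy policy and two interchangeable models; real deployments would route on much richer signals (confidence scores, task type, latency budgets).

```python
# Sketch of inference-time model switching: route a request to a lighter
# model unless it demands full accuracy. The routing heuristic and the
# two models are illustrative assumptions, not a production policy.

def needs_full_accuracy(request: dict) -> bool:
    """Hypothetical policy: flagged or very long requests get the large model."""
    return request.get("critical", False) or len(request["prompt"]) > 2000

def route(request: dict, small_model, large_model) -> str:
    model = large_model if needs_full_accuracy(request) else small_model
    return model(request["prompt"])
```

The design point is that the default path is the cheap one: the large model is the exception, invoked only when the request demonstrably needs it, so validation must confirm the heuristic does not quietly degrade accuracy.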
For Indian enterprises and policymakers, this makes independent test and measurement capabilities essential: reliable data, not assumptions, must guide scaling decisions, infrastructure investments, and model selection.
Hardware Matters—Not Just Software
As India scales its AI infrastructure, hardware limitations play an equally decisive role in sustainability.
Cooling is one of the biggest contributors to data-centre energy consumption. According to Mercom India, nearly 40% of a data centre’s total electricity is used for cooling alone. To counter this, operators are adopting advanced methods such as liquid immersion cooling and better airflow engineering. Several next-generation facilities in India now report PUE (Power Usage Effectiveness) scores as low as 1.4, as noted by Eco-Business—a significant improvement over older setups.
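PUE is simply total facility energy divided by the energy delivered to IT equipment, so reported scores are straightforward to sanity-check. The meter readings below are made up for illustration:

```python
# PUE = total facility energy / IT equipment energy.
# A PUE of 1.4 means 0.4 kWh of overhead (cooling, power distribution,
# lighting) for every 1.0 kWh delivered to IT gear. The figures below
# are illustrative, not from any real facility.

it_energy_kwh = 1_000_000        # servers, storage, networking
facility_energy_kwh = 1_400_000  # total metered energy for the site

pue = facility_energy_kwh / it_energy_kwh
overhead_share = 1 - 1 / pue     # fraction of total draw that is overhead
print(f"PUE: {pue:.2f}, overhead: {overhead_share:.0%} of total draw")
# PUE: 1.40, overhead: 29% of total draw
```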
At the chip level, cloud platforms are deploying specialised accelerators designed specifically for AI inference. For instance, AWS Inferentia chips deliver substantially higher performance per watt compared to general-purpose CPUs or GPUs, lowering both operational costs and carbon impact.
A Call to Action: Building a Responsible AI Future
India has a unique opportunity to lead on responsible, sustainable AI—but only if sustainability becomes a core design principle, not an afterthought.
It starts with measurement. Every inference call, watt consumed, cooling cycle, and model change must be tracked to make trade-offs visible. Next comes policy: mandatory reporting of data-centre energy use, PUE, and carbon emissions can set clear benchmarks and push greener standards across the industry. This must be supported by investment in energy-efficient chips, renewable-powered infrastructure, and smarter cooling and workload optimisation. Developers, too, must prioritise intelligent model design—choosing smaller, task-optimised models where possible and validating performance in real-world conditions.
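As one sketch of what that tracking could mean in practice, the record below captures the kind of per-inference telemetry an operator might log; the fields and names are illustrative assumptions, not an established schema.

```python
from dataclasses import dataclass, asdict
import json

# Illustrative per-inference telemetry record: the fields are assumptions
# about what an operator might track, not a standard reporting schema.

@dataclass
class InferenceRecord:
    timestamp: float
    model_name: str       # which model served the call (captures switches)
    latency_s: float
    energy_j: float       # measured or estimated energy for this call
    pue_at_time: float    # facility PUE when the call was served

def log_inference(record: InferenceRecord, sink) -> None:
    """Append one record as a JSON line to any file-like sink."""
    sink.write(json.dumps(asdict(record)) + "\n")
```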
Ultimately, sustainable AI depends on one principle: you cannot optimise what you cannot measure and test. The future of AI won't be defined by how large models grow, but by how well we balance performance with sustainability. A greener AI ecosystem isn't just possible—it's essential. And it starts with a simple rule: Don't assume. Test.