While we embrace the bright sparks AI has brought into our lives, we also need to consider the environmental impact of the high-powered machines that constantly capture and process information to make algorithms intelligent.
The excitement around Artificial Intelligence (AI) is palpable as intelligent systems take over an increasing number of business and personal activities. From facial recognition, recommendation engines, and customer care to managing critical infrastructure, intelligent refrigerators, and entertainment systems, AI has permeated every aspect of our lives.
A research paper published in the Journal of Parallel and Distributed Computing analyzed different approaches to energy consumption, in Machine Learning (ML) in particular, and found that sophisticated algorithms that deliver higher levels of accuracy consume more energy.
AI, honed by ML and fed by Big Data, requires massive storage and computational power. Given the volume and velocity of Big Data, huge storage and powerful, scalable IT systems are needed, making it an ideal use case for Cloud computing, which allows customers to take advantage of distributed machines and provision resources on demand.
Cloud service providers are continuously expanding data centers to meet the demand for data-intensive applications such as AI, social media, streaming video, and financial analysis. Not surprisingly, data centers consume massive amounts of electricity, accounting for 1% of the world’s electricity in 2010.
According to a study published in Science, the amount of computing done in data centers increased six-fold between 2010 and 2018, yet the amount of energy consumed by the world’s data centers grew only 6% during that period, thanks to improvements in energy efficiency. This is reflected in the ongoing efforts by Cloud service providers such as Amazon Web Services (AWS), Microsoft Azure, and Google Cloud to build energy-efficient data centers.
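Those two figures imply a substantial gain in energy efficiency per unit of computing. A back-of-envelope sketch of the arithmetic, using only the numbers cited above:

```python
# Between 2010 and 2018 (per the Science study cited above):
compute_growth = 6.0   # computing done in data centers grew ~6x
energy_growth = 1.06   # energy consumption grew only ~6%

# Energy used per unit of computing therefore fell by roughly this factor:
efficiency_gain = compute_growth / energy_growth
print(round(efficiency_gain, 1))  # ~5.7x less energy per unit of compute
```

In other words, data centers in 2018 delivered each unit of computing for roughly a sixth of the energy it took in 2010.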
The report, however, warned that there is no guarantee the efficiency drive will continue in the face of data-hungry new technologies such as AI and 5G.
So how are Cloud services faring in the quest for green goals? A quick look at the big three:
Microsoft: Microsoft’s Director of Energy Research, Sean James, explains in a company blog that companies looking to deploy Cloud technology in an environmentally friendly way are better off collaborating with cloud providers, as data centers are up to 93% more energy efficient and up to 98% more carbon efficient than traditional on-premises operations.
Such efficiencies are possible due to policies that place sustainability at the forefront and continuous research into energy-efficient technologies. The blog states that Microsoft is on track to achieve its ambition of powering its data centers with 100% renewable energy: by the end of this year, the company will meet its 2020 target of sourcing 60% of its energy from renewables, rising to 70% by 2023.
AWS: AWS has a goal of meeting 100% of the energy needs of its global infrastructure with renewable energy by 2030, backed by investments in renewable energy projects across the world. AWS has 86 renewable energy projects, solar and wind farms across Europe, Australia, and the US, with a combined capacity of 2,300 MW, connected to the grids powering AWS data centers. In a press release announcing new projects, Kara Hurst, Vice President of Sustainability, Amazon, said, “These new renewable energy projects are part of our roadmap to 80% renewable energy by 2024 and 100% renewable energy by 2030.”
Google: Google has designed highly efficient Tensor Processing Units, the AI chips behind Google’s advances in ML, and outfitted all its data centers with high-performance servers. Starting in 2014, Google began using ML to automatically optimize data center cooling and deployed smart temperature, lighting, and cooling controls to further reduce the energy used at its data centers.
By controlling data center cooling with an AI-powered recommendation system, Google has achieved consistent energy savings of around 30% on average. The average annual power usage effectiveness (PUE) for the company’s global fleet of data centers in 2019 was 1.10, compared with the industry average of 1.67, meaning Google data centers use about six times less overhead energy for every unit of IT equipment energy.
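The “six times less overhead energy” claim follows directly from the definition of PUE, which is total facility energy divided by IT equipment energy. A minimal sketch of that arithmetic, using the figures quoted above:

```python
# PUE = total facility energy / IT equipment energy,
# so overhead energy per unit of IT energy is PUE - 1.
google_pue = 1.10    # Google's 2019 fleet-wide average (cited above)
industry_pue = 1.67  # industry average (cited above)

google_overhead = google_pue - 1      # ~0.10 units of overhead per unit of IT energy
industry_overhead = industry_pue - 1  # ~0.67 units of overhead per unit of IT energy

ratio = industry_overhead / google_overhead
print(round(ratio, 1))  # ~6.7, i.e. roughly six times less overhead energy
```

A PUE of exactly 1.0 would mean every watt entering the facility reaches the IT equipment, with nothing spent on cooling, lighting, or power distribution.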