
Green AI: how AI poses both problems and solutions regarding climate change

AI and ML workloads carry a significant carbon footprint and are contributing to climate change. Developers can help reverse the trend with best practices and tools that measure carbon efficiency.

The growth of computationally intensive technologies such as machine learning incurs a high carbon footprint and is contributing to climate change. Alongside that rapid growth is an expanding portfolio of green AI tools and techniques to help offset carbon usage and provide a more sustainable path forward.

The cost to the environment is high, according to research published last month by Microsoft and the Allen Institute for AI, with co-authors from Hebrew University, Carnegie Mellon University and Hugging Face, an AI community. The study extrapolated data to show that a single training run of one 6-billion-parameter transformer model -- a large language model -- produces CO2 equivalent to burning all the coal in a large railroad car, according to Will Buchanan, product manager for Azure Machine Learning at Microsoft, Green Software Foundation member and co-author of the study.

In the past, code had to be optimized for embedded systems constrained by limited resources, such as those in phones, refrigerators or satellites, said Abhijit Sunil, analyst at Forrester Research. However, emerging technologies such as AI and ML aren't subject to those limitations, he said.

"When we have seemingly unlimited resources, what took precedence was to make as much code as possible," Sunil said.

Is AI the right tool for the job?

Green AI, or the process of making AI development more sustainable, is emerging as a possible solution to the problem of power-hungry algorithms. "It is all about reducing the hidden costs of the development of the technology itself," Buchanan said.

A starting point for any developer is to ask if AI is the right tool for the job and to be clear on why machine learning is being deployed in the first place, said Abhishek Gupta, founder and principal researcher at the Montreal AI Ethics Institute and chair of the Green Software Foundation's standards working group.

"You don't always need machine learning to solve a problem," Gupta said.

Developers should also consider conducting a cost-benefit analysis when deploying ML, Gupta said. For example, if the use of ML increases a platform's satisfaction rate from 95% to 96%, that might not be worth the additional cost to the environment, he said.
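
One way to make that trade-off concrete is a quick back-of-the-envelope check before committing to an ML deployment. The sketch below is a hypothetical calculation with made-up numbers, not a method from the article: it simply weighs the expected metric gain against the estimated extra emissions and a team-defined budget.

```python
# Hypothetical cost-benefit check: is a small metric gain worth the extra carbon?
# All numbers below are illustrative assumptions, not figures from the article.

def is_ml_worth_it(metric_gain_points: float,
                   extra_kwh: float,
                   grid_intensity_gco2_per_kwh: float,
                   max_gco2_per_point: float) -> bool:
    """Return True if estimated emissions per point of improvement stay under budget."""
    extra_emissions_g = extra_kwh * grid_intensity_gco2_per_kwh
    return extra_emissions_g / metric_gain_points <= max_gco2_per_point

# Example: +1 point of satisfaction for 500 kWh of extra training on a 400 gCO2/kWh grid.
print(is_ml_worth_it(metric_gain_points=1.0,
                     extra_kwh=500,
                     grid_intensity_gco2_per_kwh=400,
                     max_gco2_per_point=100_000))  # False: 200 kg CO2 per point exceeds budget
```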

Choose a carbon-friendly region

Once a developer has decided to use AI, choosing to deploy the model in a carbon-friendly region can have the largest effect on operational emissions, reducing the Software Carbon Intensity rate by about 75%, Buchanan said.

"It's the most impactful lever that any developer today can use," Buchanan said.

Gupta offered an example: Instead of running a job in the Midwestern U.S., where electricity comes primarily from fossil fuels, developers can run it in Quebec, which gets more than 90% of its electricity from hydropower.
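
As a rough illustration of why region choice dominates operational emissions, the sketch below compares the same hypothetical training job in two regions, using only the operational part of the picture (energy used times grid carbon intensity). The intensity figures and the job's energy use are assumptions for illustration, not measurements from the study or from any provider.

```python
# Illustrative comparison of one training job run in two regions.
# Grid-intensity figures and the job's energy use are assumed values, not measurements.

def operational_emissions_kg(energy_kwh: float, grid_gco2_per_kwh: float) -> float:
    """Estimate operational CO2 for a job: energy used times grid carbon intensity."""
    return energy_kwh * grid_gco2_per_kwh / 1000.0

JOB_ENERGY_KWH = 1_000                     # assumed energy for one training run
REGIONS = {
    "us-midwest (fossil-heavy)": 550,      # assumed gCO2 per kWh
    "quebec (mostly hydro)": 30,           # assumed gCO2 per kWh
}

for name, intensity in REGIONS.items():
    print(f"{name}: {operational_emissions_kg(JOB_ENERGY_KWH, intensity):.0f} kg CO2")
# us-midwest: 550 kg CO2 vs. quebec: 30 kg CO2 -- region choice dominates
```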

Companies will also have to consider other factors beyond energy type when deciding where an ML job should run. In April 2021, Google Cloud introduced its green region picker, which helps companies evaluate costs, latency and carbon footprint when choosing where to operate. But tools like these aren't readily available from all cloud providers, Buchanan said.

To address the issue, the Green Software Foundation is working on a new tool called Carbon Aware SDK, which will recommend the best region to spin up resources, he said. An alpha version should be available within the next couple of months.

Other ways to be green

If the only available compute sits in a region with carbon-intensive electricity, developers could use a federated learning-style deployment, in which training happens in a distributed fashion across all the devices within an electricity regime, Gupta said. But federated learning might not work for all workloads, such as those that must adhere to legal privacy requirements.
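
A minimal sketch of that federated-learning pattern is shown below: each device runs a few local training steps on its own data, and only the model weights travel back to a coordinator for averaging. The linear model, synthetic data and hyperparameters are assumptions for illustration; client selection, secure aggregation and other production concerns are omitted.

```python
# Minimal federated-averaging (FedAvg-style) sketch with a simple linear model.
# Synthetic data and hyperparameters are illustrative assumptions, not a production setup.
import numpy as np

rng = np.random.default_rng(0)
true_w = np.array([1.0, -2.0, 0.5])

# Each "device" keeps its own local dataset; raw data never leaves the device.
devices = []
for _ in range(4):
    X = rng.normal(size=(50, 3))
    y = X @ true_w + rng.normal(scale=0.1, size=50)
    devices.append((X, y))

def local_update(w, X, y, lr=0.05, steps=10):
    """Run a few gradient-descent steps on one device's local data."""
    for _ in range(steps):
        grad = 2 * X.T @ (X @ w - y) / len(y)
        w = w - lr * grad
    return w

w_global = np.zeros(3)
for _ in range(20):
    # Devices train locally; the coordinator only averages the returned weights.
    local_weights = [local_update(w_global.copy(), X, y) for X, y in devices]
    w_global = np.mean(local_weights, axis=0)

print("Recovered weights:", np.round(w_global, 2))  # close to [1.0, -2.0, 0.5]
```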

Another option is for developers to use tinyML, which shrinks machine learning models through quantization, knowledge distillation and other approaches, Gupta said. The goal is to minimize the models so that they can be deployed in a more resource-efficient way, such as on edge devices, he said. But as these models deliver limited intelligence, they might not be suited for complex use cases.
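
As one example of the model-shrinking techniques grouped under tinyML, the sketch below applies PyTorch's post-training dynamic quantization to a toy network, converting linear-layer weights to 8-bit integers. The model itself is an assumption for illustration; actual size and energy savings depend on the architecture and the target hardware.

```python
# Post-training dynamic quantization with PyTorch on a toy model.
# The model is illustrative; real savings depend on architecture and hardware.
import torch
import torch.nn as nn

model = nn.Sequential(
    nn.Linear(128, 64),
    nn.ReLU(),
    nn.Linear(64, 10),
)
model.eval()

# Convert Linear layers to int8 weights; activations are quantized dynamically at runtime.
quantized = torch.quantization.quantize_dynamic(model, {nn.Linear}, dtype=torch.qint8)

x = torch.randn(1, 128)
print(quantized(x).shape)  # torch.Size([1, 10]) -- same interface, smaller weights
```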

Sparse and shallow trees -- tree-based models partitioned into a small number of regions and built on sparse features -- can also provide the same results at a lower cost, Buchanan said. Developers can define them with a handful of parameters when choosing a model architecture, he said.
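
A hedged sketch of the shallow-tree idea follows, using scikit-learn's gradient boosting with a capped tree depth and a small number of estimators. The synthetic dataset and parameter values are illustrative assumptions; the point is that small, constrained models are cheap to train and run.

```python
# Shallow, constrained tree ensemble: low depth and few estimators keep compute modest.
# The synthetic dataset and parameter values are illustrative assumptions.
from sklearn.datasets import make_classification
from sklearn.ensemble import GradientBoostingClassifier
from sklearn.model_selection import train_test_split

X, y = make_classification(n_samples=2000, n_features=20, random_state=0)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

model = GradientBoostingClassifier(n_estimators=50, max_depth=2, random_state=0)
model.fit(X_train, y_train)
print(f"Held-out accuracy: {model.score(X_test, y_test):.3f}")
```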

"There's an industrywide trend to think that bigger is always better, but our research is showing that you can push back on that and say specifically that you need to choose the right tool for the job," Buchanan said.

Consumption metrics could be the solution

The Green Software Foundation and other initiatives are making progress toward measuring and mitigating software's carbon footprint, Buchanan said.

For example, Microsoft made energy consumption metrics available last year within Azure Machine Learning, making it possible for developers to pinpoint their most energy-consuming jobs. The metrics are focused on power-hungry GPUs, which are faster than CPUs but can consume more than 10 times the energy. Often used for running AI models, GPUs are typically the biggest culprit when it comes to power consumption, Buchanan said.
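
For developers working outside Azure, a rough way to observe the same effect is to sample GPU power draw directly through NVIDIA's NVML bindings, as in the sketch below. This is independent tooling, not the Azure Machine Learning metrics described above, and it assumes an NVIDIA GPU and the nvidia-ml-py (pynvml) package are available.

```python
# Sample GPU power draw during a running job via NVIDIA's NVML bindings.
# Assumes an NVIDIA GPU and the nvidia-ml-py (pynvml) package are installed.
import time
import pynvml

pynvml.nvmlInit()
handle = pynvml.nvmlDeviceGetHandleByIndex(0)  # first GPU

samples_w = []
for _ in range(10):
    # nvmlDeviceGetPowerUsage reports milliwatts; convert to watts.
    samples_w.append(pynvml.nvmlDeviceGetPowerUsage(handle) / 1000.0)
    time.sleep(1)

pynvml.nvmlShutdown()
print(f"Average draw over 10 s: {sum(samples_w) / len(samples_w):.1f} W")
```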

However, what's still needed is more interoperable tooling, Buchanan said, referring to the piecemeal green AI tools that are currently available. "The Green Software Foundation is doing one piece," he said, "but I think cloud providers need to make concerted investments to become more energy efficient."

Ultimately, the goal is to trigger behavior change so that green AI practices become the norm, according to Gupta. "We're not just doing this for accounting purposes," he said.

Author: Stephanie Glen

Source: TechTarget