
These simple changes can make AI research much more energy efficient


Since the first paper studying AI's impact on the environment was published three years ago, a movement has grown among researchers to self-report the energy consumed and the emissions generated by their work. Having accurate numbers is an important step toward making changes, but actually gathering those numbers can be a challenge.

“You can’t improve what you can’t measure,” says Jesse Dodge, a research scientist at the Allen Institute for AI in Seattle. “The first step for us, if we want to make progress on reducing emissions, is we have to get a good measurement.”

To that end, the Allen Institute recently collaborated with Microsoft, the AI company Hugging Face, and three universities to create a tool that measures the electricity usage of any machine-learning program that runs on Azure, Microsoft's cloud service. With it, Azure users building new models can view the total electricity consumed by graphics processing units (GPUs)—computer chips specialized for running calculations in parallel—during every phase of their project, from selecting a model to training it and putting it to use. That makes Microsoft the first major cloud provider to give users access to information about the energy impact of their machine-learning programs.

While tools already exist that measure energy use and emissions from machine-learning algorithms running on local servers, those tools don't work when researchers use cloud services provided by companies like Microsoft, Amazon, and Google. Those services don't give users direct visibility into the GPU, CPU, and memory resources their activities consume—and the existing tools, like Carbontracker, experiment-impact-tracker, EnergyVis, and CodeCarbon, need those values in order to provide accurate estimates.

The new Azure tool, which debuted in October, currently reports energy use, not emissions. So Dodge and other researchers figured out how to map energy use to emissions, and they presented a companion paper on that work at FAccT, a major computer science conference, in late June. The researchers used a service called WattTime to estimate emissions based on the zip codes of cloud servers running 11 machine-learning models.
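At its simplest, the mapping the researchers describe multiplies measured energy use by the carbon intensity of the grid serving the data center's location. A minimal sketch of that arithmetic (the region names and intensity figures below are illustrative placeholders, not real WattTime data or its API):

```python
# Illustrative grid carbon intensities in kg CO2 per kWh.
# Real values vary by region and by time of day; a production
# estimate would query a service such as WattTime instead.
REGION_INTENSITY = {
    "hydro-heavy-region": 0.25,
    "fossil-heavy-region": 0.45,
}

def estimate_emissions(energy_kwh: float, region: str) -> float:
    """Return estimated emissions in kg CO2 for a given energy use."""
    return energy_kwh * REGION_INTENSITY[region]

# The same 100 kWh training job emits far less on a cleaner grid:
print(estimate_emissions(100, "hydro-heavy-region"))   # 25.0
print(estimate_emissions(100, "fossil-heavy-region"))  # 45.0
```

This is why the server's location matters so much: identical workloads can differ in emissions by a factor of two or more depending on the grid behind them.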

They found that emissions can be significantly reduced if researchers use servers in specific geographic locations and at certain times of day. Emissions from training small machine-learning models can be reduced by up to 80% if the training starts at times when more renewable electricity is available on the grid, while emissions from large models can be reduced by over 20% if the training work is paused when renewable electricity is scarce and restarted when it's more plentiful.
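The pause-and-resume strategy described above can be sketched as a control loop that checks grid carbon intensity at each decision point and only runs a training step when the grid is clean enough. This is a hypothetical illustration, not the paper's implementation; for clarity it consumes a sequence of intensity samples rather than polling a live feed:

```python
def train_with_pausing(num_steps, intensity_readings, threshold=0.4):
    """Run `num_steps` training steps, pausing whenever grid carbon
    intensity (kg CO2/kWh) exceeds `threshold`.

    `intensity_readings` is an iterable of intensity samples, one
    consumed per decision point -- a stand-in for periodically
    polling a live carbon-intensity service.
    """
    completed = 0
    log = []
    for reading in intensity_readings:
        if completed >= num_steps:
            break
        if reading > threshold:
            log.append(("paused", reading))   # wait for a cleaner grid
        else:
            completed += 1                    # run one training step
            log.append(("step", reading))
    return completed, log

# Three steps get done, but the two dirty-grid intervals are skipped:
done, log = train_with_pausing(3, [0.3, 0.5, 0.2, 0.6, 0.1])
print(done)                       # 3
print([event for event, _ in log])
```

A real scheduler would also cap total wall-clock delay, since pausing indefinitely trades emissions against time-to-result.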
