We’re getting a better idea of AI’s true carbon footprint 

To test its new approach, Hugging Face estimated the overall emissions for its own large language model, BLOOM, which was released earlier this year. It was a process that involved adding up many different numbers: the amount of energy used to train the model on a supercomputer, the energy needed to manufacture the supercomputer's hardware and maintain its computing infrastructure, and the energy used to run BLOOM once it had been deployed. The researchers calculated that last part using a software tool called CodeCarbon, which tracked the carbon emissions BLOOM was producing in real time over an 18-day period.

Hugging Face estimates that BLOOM's training resulted in 25 tons of carbon emissions. However, the researchers found that figure doubled once they took into account the emissions caused by manufacturing the computer equipment used for training, the broader computing infrastructure, and the energy needed to actually run BLOOM once it was trained.

While that may seem like a lot for one model—50 tons of carbon emissions equates to about 60 flights between London and New York—it is significantly less than the emissions associated with other LLMs of the same size. That is because BLOOM was trained on a French supercomputer that runs mostly on nuclear power, which produces no direct carbon emissions. Models trained in China, Australia, or parts of the United States, where energy grids depend heavily on fossil fuels, are likely to be more polluting.
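The flight comparison above is easy to sanity-check. In the sketch below, the ~0.83 tons of CO2 per one-way London–New York passenger flight is an illustrative assumption, not a figure from the Hugging Face paper:

```python
# Back-of-the-envelope check of the flight comparison.
# CO2_PER_FLIGHT_T is an assumed per-passenger figure for one
# one-way London-New York flight, chosen for illustration only.
TOTAL_EMISSIONS_T = 50.0   # BLOOM's full lifecycle estimate, metric tons
CO2_PER_FLIGHT_T = 0.83    # assumed tons CO2 per passenger, one way

flights_equivalent = TOTAL_EMISSIONS_T / CO2_PER_FLIGHT_T
print(f"~{flights_equivalent:.0f} London-New York flights")  # ~60 flights
```

With that assumed per-flight figure, 50 tons works out to roughly the "about 60 flights" cited in the paper's comparison.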

After BLOOM was launched, Hugging Face estimated that running the model emitted about 19 kilograms of carbon dioxide per day, roughly the emissions produced by driving about 54 miles in an average new car.
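The per-day driving figure can be reconstructed the same way. The ~0.35 kg of CO2 per mile used below is an assumed value for an average new car, picked for illustration rather than taken from the paper:

```python
# Reverse-engineering the driving comparison: 19 kg CO2 per day of
# BLOOM inference vs. an average new car. CO2_PER_MILE_KG is an
# assumed illustrative figure, not a number from the paper.
DAILY_EMISSIONS_KG = 19.0  # BLOOM inference emissions, per day
CO2_PER_MILE_KG = 0.35     # assumed kg CO2 per mile, average new car

miles_equivalent = DAILY_EMISSIONS_KG / CO2_PER_MILE_KG
print(f"~{miles_equivalent:.0f} miles driven per day")  # ~54 miles
```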

By comparison, OpenAI’s GPT-3 and Meta’s OPT are estimated to have emitted more than 500 and 75 tons of carbon dioxide, respectively, during training. GPT-3’s massive emissions can be partly explained by the fact that it was trained on older, less efficient hardware. But these numbers are hard to pin down: there is no standardized way to measure carbon emissions, and the figures are based on external estimates or, in Meta’s case, the limited data the company has published.

“Our goal was to go beyond just the carbon footprint of the electricity consumed during training and to account for a larger part of the life cycle, in order to help the AI community get a better idea of their impact on the environment and how we could begin to reduce it,” says Sasha Luccioni, a researcher at Hugging Face and lead author of the paper.

The Hugging Face paper sets a new standard for organizations developing AI models, said Emma Strubell, an assistant professor in the school of computer science at Carnegie Mellon University, who wrote a seminal paper on AI’s climate impact in 2019. She was not involved in the new study.

The paper “represents the most thorough, honest, and knowledgeable analysis of the carbon footprint of a large ML model to date as far as I’m aware, going into much more detail … than any other paper [or] report that I know of,” Strubell said.

