
Guidelines for Reducing the Carbon Footprint of Generative Artificial Intelligence


The environmental impact of artificial intelligence, particularly generative AI like GPT-3, is becoming a significant concern as more companies adopt this technology. While the potential benefits are immense, the energy consumption and corresponding greenhouse gas emissions of these systems, especially during the training phase, pose a threat to our environment.

Data centers, which house the equipment necessary to run AI models, are a major contributor to this problem. These centers require substantial amounts of energy and water to operate optimally, leading to significant carbon emissions.

Understanding the carbon footprint of these models is crucial. Research suggests it is driven by three factors: the energy used to train the model, the energy used to run queries against it, and the energy consumed in manufacturing the computing hardware. Models with more parameters and more training data require more energy and therefore have a larger carbon footprint.
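The training-energy factor lends itself to a back-of-envelope calculation: multiply hardware power draw by run time and a data-center overhead factor (PUE) to get energy, then by the grid's carbon intensity to get emissions. The sketch below uses entirely made-up figures for a hypothetical training run; none of the numbers describe any real model.

```python
# Rough estimate of training emissions. All inputs are illustrative
# assumptions, not measurements of any real system.

def training_emissions_kg(gpu_count, gpu_power_kw, hours, pue, grid_kg_per_kwh):
    """Estimate CO2-equivalent emissions for a training run.

    energy (kWh)       = GPUs * per-GPU draw (kW) * hours * PUE
    emissions (kg CO2e) = energy * grid carbon intensity (kg CO2e per kWh)
    """
    energy_kwh = gpu_count * gpu_power_kw * hours * pue
    return energy_kwh * grid_kg_per_kwh

# Hypothetical run: 512 GPUs drawing 0.4 kW each for two weeks (336 h),
# a data-center PUE of 1.2, and a grid emitting 0.4 kg CO2e per kWh.
estimate = training_emissions_kg(512, 0.4, 336, 1.2, 0.4)
print(f"{estimate:,.0f} kg CO2e")
```

The same formula applies to the query-serving factor by substituting the inference fleet's power draw and uptime; the hardware-manufacturing factor is a fixed embodied cost that this simple energy model does not capture.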

Given the energy-intensive nature of generative AI, change is necessary. Both vendors and users must make efforts to reduce the carbon footprint of these systems.

Firstly, creating new models from scratch should be avoided whenever possible; existing large language models can often meet a company's needs. Fine-tuning an existing model on a company's proprietary data typically delivers better value than training a new model from the ground up.

Secondly, energy-conservative approaches such as TinyML can be used. These allow data to be processed locally on low-powered devices, reducing reliance on energy-intensive servers.
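One of the core techniques behind TinyML is post-training quantization: storing weights as 8-bit integers rather than 32-bit floats, cutting memory and energy per inference roughly fourfold. The sketch below shows the idea in its simplest form, with a single shared scale factor and hypothetical weight values; production toolchains use more sophisticated per-channel schemes.

```python
# Minimal sketch of int8 post-training quantization, the storage trick
# behind much of TinyML. Weight values here are arbitrary examples.

def quantize_int8(weights):
    """Map a list of float weights to int8 values with a shared scale."""
    max_abs = max(abs(w) for w in weights)
    scale = max_abs / 127 if max_abs else 1.0
    return [round(w / scale) for w in weights], scale

def dequantize(q, scale):
    """Recover approximate float weights from quantized values."""
    return [v * scale for v in q]

weights = [0.82, -1.27, 0.05, 0.63]
q, scale = quantize_int8(weights)
approx = dequantize(q, scale)
# Each recovered weight is within one quantization step of the original.
assert all(abs(a - w) <= scale for a, w in zip(approx, weights))
```

The accuracy cost of this compression is often small enough that the quantized model remains usable on a microcontroller-class device, which is what makes local, low-power inference practical.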

Thirdly, large models should only be used when they are absolutely necessary. The potential value they provide should justify their energy demands.

Lastly, data centers should be located in regions with environmentally friendly energy sources. Cloud providers that power their data centers with carbon-neutral or zero-carbon energy should be preferred.
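In practice, choosing a low-carbon location can be automated: given the grid carbon intensity of each candidate region, schedule the job where intensity is lowest. The sketch below uses invented region names and intensity figures; a real deployment would pull live intensity data from the cloud provider or a grid-data service.

```python
# Sketch of carbon-aware placement. Region names and intensities
# (g CO2e per kWh) are hypothetical illustrations, not real data.

REGION_INTENSITY_G_PER_KWH = {
    "region-coal-heavy": 820,
    "region-mixed-grid": 390,
    "region-hydro-powered": 24,
}

def greenest_region(intensities):
    """Return the region whose grid emits the least CO2e per kWh."""
    return min(intensities, key=intensities.get)

print(greenest_region(REGION_INTENSITY_G_PER_KWH))  # prints "region-hydro-powered"
```

Because training jobs are rarely latency-sensitive, they are especially good candidates for this kind of placement, and the same logic extends to shifting jobs in time toward hours when the local grid is cleanest.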

In addition to these strategies, reusing models and resources, evaluating energy sources, and including AI activity in carbon monitoring can also help reduce the carbon footprint of AI research units.

As we continue to embrace generative AI, it is essential to do so responsibly. The preservation of our planet should be a priority, and the environmental impact of our technological choices must be carefully considered.

"How to Make Generative AI Greener," Ajay Kumar and Tom Davenport, Harvard Business Review, July 20, 2023.

Machine learning, a key aspect of artificial intelligence, contributes to climate change through the energy its data centers consume, particularly during the training phase of generative AI models like GPT-3. To reduce this impact, organizations can avoid creating new models, use energy-conservative approaches like TinyML, and reserve large models for cases where they are truly necessary. In addition, locating data centers in regions with environmentally friendly energy sources, reusing models and resources, evaluating energy sources, and including AI activity in carbon monitoring are effective tactics for greener AI.
