AI Impact On Climate Change
✅ Research Focus
Investigating ChatGPT’s energy consumption and environmental impact.
Exploring sustainable AI alternatives for lower carbon emissions.
LIFECYCLE OF AI ENERGY CONSUMPTION
1. Training Phase
AI models require extensive computational power during training.
Massive datasets and complex neural networks increase electricity consumption.
Large models like GPT-4 require weeks of processing on thousands of GPUs.
🔴 Impact: Can generate hundreds of metric tons of CO₂ per training cycle.
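The training-phase impact above can be sketched as a back-of-the-envelope estimate. All inputs (GPU count, per-GPU power, runtime, data-center PUE, grid carbon intensity) are illustrative assumptions, not measured figures for any real model:

```python
# Back-of-the-envelope estimate of training emissions.
# All inputs are illustrative assumptions, not measured values.

def training_co2_tonnes(num_gpus: int, gpu_power_kw: float,
                        hours: float, pue: float,
                        grid_kg_co2_per_kwh: float) -> float:
    """Estimate metric tons of CO2 for one training run."""
    energy_kwh = num_gpus * gpu_power_kw * hours * pue  # facility energy
    return energy_kwh * grid_kg_co2_per_kwh / 1000.0    # kg -> metric tons

# Example: 10,000 GPUs at 0.4 kW each for 3 weeks (504 h),
# data-center PUE of 1.2, grid intensity 0.4 kg CO2/kWh.
print(round(training_co2_tonnes(10_000, 0.4, 504, 1.2, 0.4)))
```

Even with these rough assumptions, the result lands in the hundreds of metric tons, consistent with the claim above.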
2. Inference Phase
Once trained, AI models like ChatGPT process billions of queries daily.
Each interaction requires real-time computing, consuming energy continuously.
🔴 Impact: AI usage adds emissions every day, scaling directly with user demand.
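Because inference emissions scale linearly with query volume, they can be sketched the same way. Query count, per-query energy, and grid intensity below are all assumed for illustration:

```python
# Rough daily-emissions estimate for a large chat service.
# Queries/day and energy/query are illustrative assumptions.

def daily_inference_co2_tonnes(queries_per_day: float,
                               wh_per_query: float,
                               grid_kg_co2_per_kwh: float) -> float:
    """Metric tons of CO2 per day from serving queries."""
    kwh = queries_per_day * wh_per_query / 1000.0       # Wh -> kWh
    return kwh * grid_kg_co2_per_kwh / 1000.0           # kg -> metric tons

# Example: 1 billion queries/day at an assumed 0.3 Wh each,
# grid intensity 0.4 kg CO2/kWh -> on the order of 100 t CO2/day.
print(daily_inference_co2_tonnes(1e9, 0.3, 0.4))
```

Unlike training, this cost recurs every day, which is why per-query efficiency matters at scale.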
3. Data Storage & Cloud Dependence
AI relies on global data centers to store and process information.
These data centers operate 24/7, consuming vast amounts of power and cooling resources.
🔴 Impact: Data centers account for roughly 1% of global electricity use, a share that is still growing.
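The overhead of cooling and facility infrastructure is commonly captured by Power Usage Effectiveness (PUE): total facility energy divided by IT-equipment energy. A small sketch with illustrative numbers:

```python
# Data-center overhead sketch using PUE (Power Usage Effectiveness).
# PUE = facility energy / IT-equipment energy; values are illustrative.

def facility_kwh(it_kwh: float, pue: float) -> float:
    """Total facility energy implied by an IT load and a PUE value."""
    return it_kwh * pue

# A 1 MW IT load running 24/7 for a year, at an assumed PUE of 1.5:
it_load_kwh = 1_000 * 24 * 365          # 8,760,000 kWh of IT energy
print(facility_kwh(it_load_kwh, 1.5))   # cooling etc. add 50% on top
```

A PUE of 1.0 would mean every watt goes to computation; real facilities sit above that, which is why cooling efficiency is part of the AI footprint.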
AI IMAGE GENERATION VS. MANUAL DESIGN

| Metric | AI Image Generation (cloud-based) | Manual Design (local) |
|---|---|---|
| Energy consumption | High (GPU-intensive model inference) | Moderate (depends on project complexity) |
| Energy per image | ~0.2-1.0 kWh | ~0.02-0.05 kWh |
| CO₂ emissions per image | ~100-500 g CO₂ (estimated 2-5x higher than manual design) | ~10-50 g CO₂ (lower, but rises with high-resolution rendering) |
| Water consumption (cooling) | High; ~2-5 liters (data-center cooling) | Minimal; mostly air-cooled local machines |
| Computational requirements | Cloud-based AI models and data centers | Local machines or smaller cloud-based servers |
| Hardware power usage | ~2,000-3,000 W (AI server) | ~100-300 W (PC) |
| Hardware | Large-scale GPUs (e.g., NVIDIA A100, H100) | Consumer GPUs (e.g., RTX 4090, Apple M1 Ultra) |
| Sustainability measures | Some providers use renewable energy | Depends on user’s hardware and energy source |
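The per-image energy figures above can be converted into CO₂ estimates. Assuming the higher energy range belongs to cloud-based AI generation, and an assumed grid intensity of 0.5 kg CO₂/kWh (which roughly reproduces the gram-level ranges quoted above):

```python
# Convert per-image energy (kWh) into grams of CO2.
# Grid intensity of 0.5 kg CO2/kWh is an assumption for illustration.

GRID_KG_CO2_PER_KWH = 0.5

def image_co2_grams(kwh_per_image: float) -> float:
    """Grams of CO2 implied by one image's energy use."""
    return kwh_per_image * GRID_KG_CO2_PER_KWH * 1000.0  # kg -> g

for label, low, high in [("manual design", 0.02, 0.05),
                         ("AI generation", 0.2, 1.0)]:
    print(f"{label}: {image_co2_grams(low):.0f}-"
          f"{image_co2_grams(high):.0f} g CO2/image")
```

The same conversion applies to any of the energy figures; changing the grid intensity rescales every estimate proportionally.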
SUSTAINABLE AI AND FUTURE DIRECTIONS
Energy-Efficient AI Models
Smaller, task-specific AI models
Model pruning & quantization to reduce computation
Low-power AI chips (e.g., ARM-based, neuromorphic computing)
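Pruning and quantization, mentioned above, can be illustrated with a minimal, framework-free sketch: magnitude pruning zeroes the smallest-magnitude weights, and int8 quantization maps floats onto the integer range [-127, 127]. The weight values are illustrative, not from a real model:

```python
# Framework-free sketch of two compression techniques.
# Weight values are illustrative, not from a real model.

def magnitude_prune(weights, sparsity):
    """Zero out the smallest-magnitude fraction of weights."""
    k = int(len(weights) * sparsity)          # how many weights to drop
    cutoff = sorted(abs(w) for w in weights)[k - 1] if k else None
    pruned, dropped = [], 0
    for w in weights:
        if dropped < k and abs(w) <= cutoff:
            pruned.append(0.0)                # pruned weight costs nothing
            dropped += 1
        else:
            pruned.append(w)
    return pruned

def quantize_int8(weights):
    """Map float weights onto the int8 range [-127, 127]."""
    scale = max(abs(w) for w in weights) / 127.0
    return [round(w / scale) for w in weights], scale

w = [0.9, -0.05, 0.4, 0.01, -0.7, 0.003]
print(magnitude_prune(w, 0.5))   # half the weights become zero
q, scale = quantize_int8(w)
print(q)                         # small integers instead of 32-bit floats
```

Sparse weights skip multiplications and int8 arithmetic is far cheaper than float32, which is how these techniques cut computation and, with it, energy use.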
Policy Recommendations
Regulations on AI energy use and efficiency
Carbon footprint transparency in AI development
Incentives for green AI research
ALTERNATIVE AI SOLUTIONS FOR SUSTAINABILITY
Green AI Innovations
AI models designed for minimal energy consumption
Open-source lightweight AI frameworks (e.g., TinyML)
REFERENCES
Bolón-Canedo, V., Morán-Fernández, L., Cancela, B., & Alonso-Betanzos, A. (2024). A review of green artificial intelligence:
Towards a more sustainable future. Neurocomputing, 570, 126104. https://doi.org/10.1016/j.neucom.2024.126104
Verdecchia, R., Sallou, J., & Cruz, L. (2023). A systematic review of Green AI. arXiv preprint arXiv:2301.11047.
https://arxiv.org/abs/2301.11047
van Wynsberghe, A. (2021). Sustainable AI: AI for sustainability and the sustainability of AI. AI and Ethics, 1(3), 213–218.
https://doi.org/10.1007/s43681-021-00043-6
Yokoyama, A. M., Ferro, M., de Paula, F. B., Vieira, V. G., & Schulze, B. (2023). Investigating hardware and software aspects in the
energy consumption of machine learning: A Green AI‐centric analysis. Concurrency and Computation: Practice and Experience,
35(19), e7825. https://doi.org/10.1002/cpe.7825
Strubell, E., Ganesh, A., & McCallum, A. (2019). Energy and policy considerations for deep learning in NLP. Proceedings
of the 57th Annual Meeting of the Association for Computational Linguistics, 3645–3650. https://doi.org/10.18653/v1/P19-1355
Patterson, D., Gonzalez, J., Le, Q., Liang, C., Munguia, L. M., Rothchild, D., & Dean, J. (2021). Carbon emissions and large neural
network training. arXiv preprint arXiv:2104.10350. https://arxiv.org/abs/2104.10350