Sources
1 Patterson, D., Gonzalez, J., Le, Q., Liang, C., Munguia, L.-M., Rothchild, D., So, D., Texier, M., & Dean, J. (2021). Carbon emissions and large neural network training. arXiv.
2 Mehta, S. (2024, July 4). How much energy do LLMs consume? Unveiling the power behind AI. Association of Data Scientists.
3 de Vries, A. (2023). The growing energy footprint of artificial intelligence. Joule, 7(10), 2191–2194. doi:10.1016/j.joule.2023.09.004
4 de Vries, A. (2023). The growing energy footprint of artificial intelligence. Joule, 7(10), 2191–2194. doi:10.1016/j.joule.2023.09.004
5 Strubell, E., Ganesh, A., & McCallum, A. (2019). Energy and policy considerations for deep learning in NLP. Proceedings of the 57th Annual Meeting of the Association for Computational Linguistics. doi:10.18653/v1/p19-1355
6 Strubell, E., Ganesh, A., & McCallum, A. (2019). Energy and policy considerations for deep learning in NLP. Proceedings of the 57th Annual Meeting of the Association for Computational Linguistics. doi:10.18653/v1/p19-1355
7 CottGroup. (2024). Smaller and more efficient artificial intelligence models. CottGroup.