I recently came across this article on ChatGPT's electricity consumption, and I was shocked by the findings. According to the author, Kasper Groes Albin Ludvigsen, ChatGPT may have consumed up to 23,364,000 kWh in January 2023 alone: roughly the annual electricity consumption of 2,000 households!
This raises serious questions about the environmental sustainability of NLP and deep learning in general. Do we have the resources to keep pace with current demand?
"[...] at the low end of the estimated range for ChatGPT’s electricity consumption in January 2023, training GPT-3 and running ChatGPT for one month took roughly the same amount of energy. At the high end of the range, running ChatGPT took 18 times the energy it took to train."
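A quick back-of-the-envelope check of the "18 times" figure in the quote. This sketch assumes GPT-3's training consumed roughly 1,287 MWh, a commonly cited external estimate; the article's own training figure may differ slightly.

```python
# Sanity-checking the quoted claims with simple arithmetic.
# Assumption: GPT-3 training energy ~1,287 MWh (external estimate,
# not a figure from this post).
chatgpt_jan_high_kwh = 23_364_000   # high-end January 2023 estimate from the article
gpt3_training_kwh = 1_287_000       # assumed training energy (~1,287 MWh)
households = 2_000                  # households mentioned in the post

ratio = chatgpt_jan_high_kwh / gpt3_training_kwh
per_household_kwh = chatgpt_jan_high_kwh / households

print(f"high-end inference / training ≈ {ratio:.0f}x")        # roughly 18x
print(f"implied use per household ≈ {per_household_kwh:,.0f} kWh/year")
```

The ~18x ratio lines up with the quote, and the implied ~11,700 kWh per household per year is in the right ballpark for an average household's annual electricity use, so the comparison is internally consistent.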
I believe one solution is to invest in tailor-made processors optimized for NLP and deep learning. Such processors can cut the energy consumption and cost of training and running these models while maintaining, or even improving, their performance and accuracy. By doing so, we can democratize AI tools, preserve the planet, and step together into this new era of technology.
You can read more at:
#ai #cloud #technology #sustainability #energy #nlp