From No Tech Magazine, March 18:
The energy consumption of a single training run of the latest (as of 2020) deep neural networks for natural language processing exceeds 1,000 megawatt-hours (more than a month of computation on today's most powerful clusters). This corresponds to an electricity bill of more than 100,000 euros (figures in the millions of euros are sometimes cited) and 500 tons of CO2 emissions: the carbon footprint of 500 transatlantic round trips between Paris and New York. By comparison, the human brain consumes about 12 kWh per month, roughly a hundred thousand times less, for tasks far more complex than natural language translation.
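As a sanity check, these figures are mutually consistent. Here is a minimal back-of-envelope sketch; the electricity price, grid carbon intensity, and per-flight emissions are illustrative assumptions of mine, not values given in the article:

```python
# Back-of-envelope check of the quoted figures. The unit prices
# (~0.10 EUR/kWh electricity, ~0.5 kg CO2/kWh grid intensity,
# ~1 t CO2 per Paris-New York round trip) are assumptions.

training_energy_kwh = 1_000 * 1_000          # 1,000 MWh in kWh

bill_eur = training_energy_kwh * 0.10        # -> 100,000 EUR
co2_tons = training_energy_kwh * 0.5 / 1000  # -> 500 t
round_trips = co2_tons / 1.0                 # -> 500 trips at ~1 t each

brain_kwh_per_month = 12
ratio = training_energy_kwh / brain_kwh_per_month  # -> ~83,000x

print(f"bill: {bill_eur:,.0f} EUR, CO2: {co2_tons:.0f} t, "
      f"trips: {round_trips:.0f}, brain ratio: {ratio:,.0f}x")
```

The brain ratio comes out at about 83,000, which matches the article's "roughly a hundred thousand times less."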
Unlike a mere ten years ago, and despite improvements in desktop computing power, it is no longer possible to train a modern neural network on a personal computer (it would theoretically take up to 405 years)… One may object that it is hardly surprising that deep learning algorithms are far less energy efficient than the products of three billion years of biological evolution, and that the figures instead suggest huge room for improvement… But this objection shifts the focus away from the point being made here: in a matter of ten years, the absolute energy consumption of AI training has skyrocketed, reaching hundreds of tons of CO2 equivalent for a single learning task. These levels are at stunning odds with the requirement that human society drastically reduce its carbon footprint at a rate of −7% per year, starting today.
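To make the two rates concrete, here is a quick sketch using only the article's own numbers; the month-to-year conversion and the compounding calculation are my arithmetic, not the author's:

```python
import math

# 1. If a training run takes ~1 month on a top cluster but ~405 years
#    on a personal computer, the implied speed gap is:
speed_gap = 405 * 12 / 1  # 405 years in months vs 1 month -> ~4,860x

# 2. A -7%/year reduction compounds: how long until the footprint halves?
halving_years = math.log(0.5) / math.log(1 - 0.07)  # -> ~9.6 years

print(f"PC vs cluster: ~{speed_gap:,.0f}x slower")
print(f"Halving time at -7%/year: ~{halving_years:.1f} years")
```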
Let us pursue this line of objection: "once trained," the algorithm can be reused billions of times…
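The arithmetic behind this objection, made explicit in a short sketch; the reuse counts are hypothetical parameters, not figures from the article:

```python
# Amortizing the 500 t CO2 of one training run over an assumed
# number of inference uses. Reuse counts are hypothetical.

training_co2_g = 500 * 1_000_000  # 500 t expressed in grams

for uses in (1e6, 1e9, 1e12):
    per_use_g = training_co2_g / uses
    print(f"{uses:>15,.0f} uses -> {per_use_g:,.4f} g CO2 per use")
```

At a billion uses, the training run amortizes to about half a gram of CO2 per use.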
…MUCH MORE