The immense electricity needs of AI computing were flagged early on as a bottleneck, prompting Alphabet’s Google Cloud to plan for how to source energy and how to use it, according to Google Cloud CEO Thomas Kurian.
Speaking at Fortune Brainstorm AI in San Francisco on Monday, he pointed out that the company has been working on AI since well before large language models came along, and that it took the long view.
“We also knew that the most problematic thing that was going to happen was going to be energy, because energy and data centers were going to become a bottleneck alongside chips,” Kurian told
Fortune’s Andrew Nusca. “So we designed our machines to be super efficient.”
The International Energy Agency has estimated that some AI-focused data centers consume as much electricity as 100,000 homes, and some of the largest facilities under construction could even use 20 times that amount.
At Brainstorm, Kurian laid out Google Cloud’s three-pronged approach to ensuring there will be enough energy to meet all that demand.
First, the company seeks to diversify as much as possible across the kinds of energy that power AI computation.
Second, Google Cloud aims to be as efficient as possible, including in how it reuses energy within data centers.
And third, Google Cloud is working on “some new fundamental technologies to actually create energy in new forms,” Kurian said. He didn’t elaborate further.
—Jason Ma