As companies like Microsoft, which recently unveiled its ChatGPT-powered Bing search engine, usher in an era of ubiquitous AI, there is certainly no shortage of ethical and legal issues being raised. But another worrying aspect of the technology has received much less attention: its environmental impact, Wired reports.
“Huge resources are already required to index and search internet content, but incorporating AI requires a different kind of firepower,” Alan Woodward, cybersecurity expert at the University of Surrey, told the magazine.
“It requires processing power as well as storage space and efficient searching. Every time we see a step change in online processing, we see significant increases in the power and cooling resources required by large processing centers,” he added. “I think this could be such a step.”
The computer scientist Carlos Gómez-Rodríguez from the University of Coruña told Wired that training the Large Language Models (LLMs) that ChatGPT runs on is so prohibitively resource-intensive that essentially “only the big tech companies can train them”.
Unfortunately, neither Microsoft nor Google has publicly disclosed how much processing power it took to get their chatty AIs off the ground. But an independent analysis cited by Wired found that training OpenAI’s GPT-3 model (which underlies ChatGPT) consumed 1,287 megawatt-hours, which the outlet compared to the emissions of “a single person making 550 round trips between New York and San Francisco.”
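As a rough sanity check on that comparison, the training-energy figure can be converted into flight equivalents. Note the grid carbon intensity and per-passenger flight emissions below are illustrative assumptions, not figures from the article:

```python
# Back-of-envelope: convert GPT-3's reported training energy into a
# rough CO2 estimate and NY-SF round-trip flight equivalents.
# Assumed figures (illustrative, not from the article):
#   - grid carbon intensity: ~0.43 tonnes CO2 per MWh
#   - one NY-SF round trip:  ~1 tonne CO2 per passenger

TRAINING_ENERGY_MWH = 1287           # figure cited by Wired
GRID_TONNES_CO2_PER_MWH = 0.43       # assumed average grid mix
FLIGHT_TONNES_CO2_ROUND_TRIP = 1.0   # assumed per-passenger estimate

emissions_tonnes = TRAINING_ENERGY_MWH * GRID_TONNES_CO2_PER_MWH
flight_equivalents = emissions_tonnes / FLIGHT_TONNES_CO2_ROUND_TRIP

print(f"~{emissions_tonnes:.0f} tonnes CO2, "
      f"~{flight_equivalents:.0f} NY-SF round trips")
```

Under those assumed inputs the result lands in the mid-500s of flights, in the same ballpark as the comparison Wired cites.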
Not terrible, per se. But keep in mind that “you don’t just have to train it,” says Gómez-Rodríguez. “You have to run it and serve millions of users.”
Or billions, indeed, now that Bing and Google are preparing to roll out the technology to their global user bases.
But let’s be fair. How efficient is an AI search compared to the old-fashioned search engines we use now?
“At least four to five times more computing power per search,” Martin Bouchard, co-founder of the sustainable data center company QScale, estimated in comments to Wired.
At that scale, the difference could be very significant. Bouchard notes that the computational demands will only grow once the AIs are eventually trained on current data, rather than static snapshot datasets.
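To see why a per-search multiplier matters at scale, here is a sketch using two assumed baseline figures that do not come from the article: roughly 0.3 Wh per conventional search and roughly 8.5 billion searches per day.

```python
# Rough sketch: daily energy cost of applying a 4-5x AI multiplier
# to every search. Both baseline figures are assumptions for
# illustration, not figures from the article:
#   - ~0.3 Wh per conventional search
#   - ~8.5 billion searches per day

SEARCHES_PER_DAY = 8.5e9
WH_PER_SEARCH = 0.3
AI_MULTIPLIER = 4.5  # midpoint of Bouchard's 4-5x estimate

baseline_mwh = SEARCHES_PER_DAY * WH_PER_SEARCH / 1e6   # Wh -> MWh
ai_mwh = baseline_mwh * AI_MULTIPLIER

print(f"baseline: ~{baseline_mwh:,.0f} MWh/day, "
      f"with AI: ~{ai_mwh:,.0f} MWh/day")
```

Under those assumptions, the daily energy bill jumps from the low thousands of megawatt-hours to several times that, i.e. the equivalent of multiple GPT-3 training runs every day.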
“Current data centers and the existing infrastructure will not be able to cope with [the race of generative AI],” he added. “It’s too much.”
More on generative AI: Microsoft appears to be discussing ChatGPT’s bizarre alternate personality