Artificial intelligence (AI) is the computer programming that enables a machine to do tasks that would ordinarily need a human brain. On TikTok, for example, AI sorts the postings so that the first ones you view are more likely to be the ones you want. Every Google search yields useful results thanks to AI. When you ask Siri to play Taylor Swift, AI converts your words into a command to start playing her music. However, before an AI can accomplish any of these things, developers must train it. And that training consumes a lot of energy. A whole lot. Experts are already concerned that training's voracious appetite for energy may soon become a significant problem.

Where The Energy Comes From

The energy used to create AI is derived from the electrical grid. In most regions of the globe, producing energy releases carbon dioxide (CO2) and other greenhouse gases into the atmosphere. 

To examine how various activities influence the climate, researchers frequently aggregate the effects of all greenhouse gases into what they term CO2 equivalents. In one widely reported 2019 study, training a single large AI model emitted an astounding 626,000 pounds of CO2 equivalents. That is about as much greenhouse gas as five American vehicles emit from when they are manufactured to when they are scrapped.
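
To make the idea of CO2 equivalents concrete, here is a minimal Python sketch. The emission amounts are made up for illustration; the weighting factors are the standard 100-year global warming potentials climate scientists use to compare other gases with CO2.

```python
# Hypothetical emissions (in pounds) of three greenhouse gases.
emissions_lbs = {"CO2": 500_000, "CH4": 1_500, "N2O": 200}

# 100-year global warming potentials: how much more warming each gas
# causes than the same mass of CO2 (CH4 and N2O values from the IPCC).
gwp_100yr = {"CO2": 1, "CH4": 28, "N2O": 265}

# CO2 equivalents = each gas's mass weighted by its warming potential.
co2e = sum(mass * gwp_100yr[gas] for gas, mass in emissions_lbs.items())
print(f"{co2e:,} pounds of CO2 equivalents")  # 595,000 pounds of CO2 equivalents
```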

Only the largest and most sophisticated models use that much energy. However, AI models are fast becoming larger and more power-hungry. Some AI specialists have raised the alarm about the harm these energy vampires could pose.

AI Training

One model, called the transformer, can analyze text and then translate or summarize it. It employs a kind of machine learning that is rapidly gaining popularity. Before the system can do any of this, however, it must first go through a process known as training.
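
For a sense of what using such a model looks like in practice, here is a minimal sketch with the open-source Hugging Face transformers library. The input sentence is a placeholder, and the library downloads a default pretrained model the first time it runs; this is an illustration, not the setup of any particular AI lab.

```python
from transformers import pipeline  # pip install transformers

# Load a default pretrained transformer for English-to-French translation.
translator = pipeline("translation_en_to_fr")

result = translator("Artificial intelligence uses a lot of energy.")
print(result[0]["translation_text"])
```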

To translate between English and Chinese, for example, an AI model may churn through millions, if not billions, of translated books and articles. In this way, it learns which words and phrases fit together. When later presented with fresh material, it can offer its own translation.
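
Real transformers learn these pairings with neural networks trained on enormous datasets, but the underlying statistical idea can be sketched in a few lines of Python. The three sentence pairs below stand in for the millions of translated texts a real model would see; everything here is a toy example for illustration.

```python
from collections import Counter, defaultdict

# Tiny stand-in for "millions of translated books": hypothetical
# English-French sentence pairs, chosen purely for illustration.
parallel_pairs = [
    ("the cat", "le chat"),
    ("the dog", "le chien"),
    ("a cat", "un chat"),
]

# Count how often each English word appears alongside each French word.
cooccurrence = defaultdict(Counter)
for english, french in parallel_pairs:
    for e_word in english.split():
        for f_word in french.split():
            cooccurrence[e_word][f_word] += 1

def translate_word(e_word):
    """Pick the French word seen most often with this English word."""
    counts = cooccurrence.get(e_word)
    return counts.most_common(1)[0][0] if counts else e_word

print([translate_word(w) for w in "the cat".split()])  # ['le', 'chat']
```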

Deep learning allows computers to sift through mounds of data to make rapid, useful and intelligent judgments. Engineers have created deep-learning models that can control self-driving automobiles and identify emotions in human faces. Other models detect cancer in medical images or assist researchers in discovering new medicines. This technology is altering the course of history.

It does, however, come at a cost. 

The finest deep-learning models are the AI world's behemoths. Training them requires massive amounts of computer processing. They practice on computer hardware known as graphics processing units (GPUs), the same chips that render the images in a realistic video game.

According to Lasse F. Wolff Anthony, it may take hundreds of GPUs operating for weeks or months to train a single AI model. He is a student at ETH Zurich, a technological institution in Switzerland. “The longer [the GPUs] operate, the more energy they use,” he says. 
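
His point lends itself to back-of-the-envelope arithmetic. The Python sketch below multiplies GPUs by power draw by hours to estimate energy use; every number in it is an assumption chosen only to show the scale, not a figure from this article.

```python
# Assumed figures, for illustration only.
num_gpus = 300              # "hundreds of GPUs"
watts_per_gpu = 300         # plausible draw for one data-center GPU
training_days = 30          # "weeks or months" of training

hours = training_days * 24
energy_kwh = num_gpus * watts_per_gpu * hours / 1000
print(f"{energy_kwh:,.0f} kWh")            # 64,800 kWh

# With an assumed grid intensity of roughly 0.4 kg of CO2 per kWh:
print(f"{energy_kwh * 0.4:,.0f} kg CO2")   # 25,920 kg of CO2
```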

The majority of AI work nowadays takes place in data centers. These computer-filled buildings account for around 2% of US electricity use and 1% of worldwide energy consumption. For now, AI development makes up only a small portion of any data center's load.