Toronto tech firm aims to reduce energy use in AI

As the use of artificial intelligence increases, the amount of energy required to power it will also go up exponentially. Dilshad Burman speaks with a local tech leader about managing its environmental impact.

By Dilshad Burman

Artificial Intelligence (AI) is already a part of our everyday lives, with uses ranging from the automotive industry to streaming platforms, and with applications like ChatGPT, it’s become even more mainstream. But for all its advantages, AI also has a marked impact on the environment.

Arun Iyengar, CEO of Toronto-based tech firm Untether AI, explains that traditional AI methods are a huge drain on the world’s energy resources.

“Just looking at January, if you look at the number of people that used ChatGPT and the amount of energy it took to serve them, it would be the same amount of energy to fully serve a town of 175,000 people,” he says.

That’s because it’s built on the computing architecture established as far back as the 1940s by John von Neumann, widely considered the father of modern computing, which has powered virtually all computer chips to date.

“Anytime you move data you burn energy. So the more distance you move data, the more power you’re burning to make that happen. And that’s why today’s implementations end up consuming so much power.”

Iyengar’s company, a semiconductor startup, has created a first-of-its-kind chip that aims to dramatically reduce AI’s energy use.

“We are basically putting the processing and the memory right next to each other. So the data movement is now contained in a very, very short distance … that reduces the power or the energy required to do Artificial Intelligence by a factor of six to 10 times,” he says.

“We design the chips, we get them manufactured, and we provide that to our large customers who can then implement it into either their servers if it’s in a ChatGPT type environment for example, or into an autonomous vehicle.”

He explains the chip can also expand both what companies can do with AI and how much of it they can do.

“Ninety per cent of the energy in the traditional approach was wasted for moving data. We just re-harnessed that 90 per cent of energy to actually do processing with it,” he says.

“So as an example, we’ve been working with General Motors. In a car-type environment, you’re limited by how much power you have. So you are limited now by how much processing you can actually do with AI. And so the more [processing capability] we can provide for the same amount of power, means that there’s a lot more that [companies] can deploy,” he says.

In other words, you can get a lot more bells and whistles with no increase in the amount of energy used.
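
To see how the “90 per cent” figure and the “factor of six to 10” claim fit together, here is a minimal back-of-envelope sketch in Python. The split between data movement and computation comes from the quotes above; the fraction of data-movement energy that remains after co-locating memory and processing is an illustrative assumption, not a figure from Untether AI.

```python
# Back-of-envelope sketch (illustrative assumptions, not Untether AI's numbers
# beyond the ~90% and 6-10x figures quoted above): if roughly 90% of a
# traditional chip's energy goes to moving data and only 10% to computation,
# then eliminating most of that data-movement energy shrinks total energy to a
# small fraction of the original.

def energy_reduction_factor(data_movement_share=0.90,
                            movement_energy_remaining=0.10):
    """Estimate the overall energy-reduction factor when data-movement
    energy is cut to a given fraction of its original value.

    data_movement_share: fraction of total energy spent moving data (quoted: ~0.90)
    movement_energy_remaining: assumed fraction of that movement energy still
        spent after putting processing and memory next to each other
    """
    compute_share = 1.0 - data_movement_share
    new_total = compute_share + data_movement_share * movement_energy_remaining
    return 1.0 / new_total

# If ~90% of movement energy is eliminated, total energy falls to ~19% of the
# original (about a 5x reduction); eliminating ~95% gives roughly 7x.
print(round(energy_reduction_factor(0.90, 0.10), 1))  # ~5.3
print(round(energy_reduction_factor(0.90, 0.05), 1))  # ~6.9
```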

But why should ordinary Canadians be invested in these technical aspects of AI?

Iyengar says as Artificial Intelligence becomes more widespread, the amount of power it uses will eventually affect us all.

“You’ve got to be very responsible with how you actually power it up. Today, about 1 per cent of the world’s energy goes to servers [used for AI]. With continued AI adoption, that’s going to be about 15 per cent [by the end of the decade]. Where’s that going to come from? Of course, we can’t tell a family that’s in a cold part of the country ‘Sorry, you’re not going to get power today because I’m powering ChatGPT over there,’” he jokes.

Iyengar adds that big tech companies have an important role to play in reducing AI’s environmental impact.

“Large tech companies today, what they’re trying to do is get the product out to the marketplace,” he says.

“When you start deploying it in a very large scale, then it’s important and it’s incumbent upon the large companies to have a clear view of the environmental impact. Not just say, ‘Hey look, I’m just going to deploy it because that’s how I know how to deploy it, but let me deploy it in a way that’s sustainable.'”
