
AI doesn’t have to be a power hog
Heather Clancy
Thursday, July 30, 2020

Plenty of prognostications, including this one from the World Economic Forum, tout the integral role artificial intelligence could play in “saving the planet.”

Indeed, AI is integral to all manner of technologies, ranging from autonomous vehicles to more informed disaster response systems to smart buildings and data collection networks monitoring everything from energy consumption to deforestation.

The flip side to this rosy view is that there are plenty of ethical concerns to consider. What’s more, the climate impact of AI — both in terms of power consumption and all the electronic waste that gadgets create — is a legitimate, growing concern.

Research from the University of Massachusetts Amherst suggests that the process of “training” a large neural network to make decisions, including searching for the best model design, can generate five times the lifetime emissions of the average U.S. car. Not an insignificant amount.

What does that mean if things continue on their current trajectory?

Right now, data centers use about 2 percent of the world’s electricity. At the current rate of AI adoption — with no changes in the underlying computer server hardware and software — the data centers needed to run those applications could claim 15 percent of that power load, Gary Dickerson, CEO of semiconductor equipment maker Applied Materials, predicted in August 2019. Although progress is being made, he reiterated that warning last week.


“Customized design will be critical,” he told attendees of SEMICON West, a longstanding industry conference. “New system architectures, new application-specific chip designs, new ways to connect memory and logic, new memories and in-memory compute can all drive significant improvements in compute performance per watt.”

So, what’s being done to “bend the curve,” so to speak?

Technologists from Applied Materials, Arm, Google, Intel, Microsoft and VMware last week shared insights about advances that could help us avoid the most extreme future scenarios, if the businesses investing in AI technologies start thinking differently. While much of the panel (which I helped organize) was highly technical, here are four of my high-level takeaways for those thinking about harnessing AI for climate solutions.

Get acquainted with the concept of “die stacking” in computing hardware design. There is concern that Moore’s Law, the idea that the number of transistors on an integrated circuit will double every two years, is slowing down. That’s why more semiconductor engineers are talking up designs that stack multiple chips on top of each other within a system, allowing more processing capability to fit in a given space.

Rob Aitken, a research fellow with microprocessor firm Arm, predicts these designs will show up first in computing infrastructure that couples high-performance processing with very localized memory. “The vertical stacking essentially allows you to get more connectivity bandwidth, and it allows you to get that bandwidth at lower capacitance for lower power use, and also a lower delay, which means improved performance,” he said during the panel.

So, definitely look for far more specialized hardware.

Remember this acronym, MRAM. It stands for magnetic random-access memory, a format that uses far less power in standby mode than existing technologies, which require energy to maintain the “state” of their information and respond quickly to processing requests when they pop up. Among the big-name players eyeing this market: Intel, Micron, Qualcomm, Samsung and Toshiba. Plenty of R&D power there.

Consider running AI applications in cloud data centers using carbon-free energy. That could mean deferring certain workloads to times of day when a facility is more likely to be running on renewable energy.

“If we were able to run these workloads when we had this excess of green, clean, energy, right now we have these really high compute workloads running clean, which is exactly what we want,” said Samantha Alt, cloud solution architect at Intel. “But what if we take this a step further, and we only had the data center running when this clean energy was available? We have a data center that’s awake when we have this excess amount of green, clean energy, and then asleep when it’s not.”

This is a technique that Google talked up in April, but it’s not yet widely used, and it will require attention to new cooling designs to keep facilities from running too hot, as well as memory components that can respond dynamically when a facility goes in and out of sleep mode.
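To make the idea concrete, here is a minimal sketch, in Python, of what “run the workload only when clean energy is available” can look like. The carbon-intensity lookup, the threshold and the job itself are illustrative placeholders rather than any vendor’s actual scheduler; a real deployment would pull live grid data for the region where the facility sits.

import random
import time

# Illustrative values; both are assumptions for this sketch, not real operating limits.
CARBON_THRESHOLD_G_PER_KWH = 200   # treat the grid as "clean" below this level
CHECK_INTERVAL_SECONDS = 15 * 60   # how often to re-check grid conditions


def get_grid_carbon_intensity() -> float:
    """Stand-in for a real-time carbon-intensity feed, in grams of CO2 per kWh.

    A real deployment would query the grid operator or a data provider for
    the facility's region; here the value is simulated.
    """
    return random.uniform(50, 600)


def run_training_job() -> None:
    """Stand-in for the deferrable workload, such as a batch of model training."""
    print("Running the deferred AI workload now.")


def wait_for_clean_energy() -> None:
    """Hold the job until the grid is likely to be running on cleaner energy."""
    while True:
        intensity = get_grid_carbon_intensity()
        if intensity <= CARBON_THRESHOLD_G_PER_KWH:
            run_training_job()
            return
        print(f"Grid at {intensity:.0f} gCO2/kWh; deferring and checking again later.")
        time.sleep(CHECK_INTERVAL_SECONDS)


if __name__ == "__main__":
    wait_for_clean_energy()

Extending that same loop to put an entire facility to sleep, as Alt describes, is where the cooling and memory challenges mentioned above come into play.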


Live on the edge. That could mean using specialized AI-savvy processors in some of the gadgets or systems you’re trying to make smarter, such as automotive systems, smartphones or building controls. Rather than sending all the data to a massive, centralized cloud service, the processing (at least some of it) happens locally. Hey, if energy systems can be distributed, why not data centers?

“We have a lot of potential to move forward, especially when we bring AI to the edge,” said Moe Tanabian, general manager for intelligent devices at Microsoft. “Why is edge important? There are lots of AI-driven tasks and benefits that we derive from AI that are local in nature. You want to know how many people are in a room: people counting. This is very valuable because when the whole HVAC system of the whole building can be more efficient, you can significantly lower the balance of energy consumption in major buildings.”
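As a rough illustration of that division of labor, here is a small Python sketch of the people-counting example. The camera capture, the on-device model and the building-management call are hypothetical placeholders; the point is simply that raw frames stay on the local device and only a tiny summary ever needs to leave it.

from dataclasses import dataclass


@dataclass
class Frame:
    """Stand-in for an image captured by a room camera."""
    pixels: bytes


def capture_frame() -> Frame:
    """Placeholder for reading a frame from the local camera."""
    return Frame(pixels=b"")


def count_people(frame: Frame) -> int:
    """Placeholder for an on-device detection model running on an edge AI chip."""
    return 3  # simulated head count


def set_hvac_level(occupancy: int) -> None:
    """Placeholder for the building-management system interface."""
    print(f"Adjusting ventilation for {occupancy} occupants.")


def edge_loop() -> None:
    frame = capture_frame()          # raw image data never leaves the device
    occupancy = count_people(frame)  # inference happens locally, not in the cloud
    set_hvac_level(occupancy)        # act on the result right where it is needed
    # Only the small summary (a head count) would be reported upstream, if at all.


if __name__ == "__main__":
    edge_loop()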

The point of all this is that getting to a nirvana in which AI can take on the many tasks we’d love it to handle in addressing the climate crisis will require some pretty substantial upgrades to the computing infrastructure that underlies it.

The environmental implications of those system overhauls need to be part of data center procurement criteria immediately, and the semiconductor industry needs to step up with the right answers. Intel and AMD have been leading the way, and Applied Materials last week threw down the gauntlet, but more of the industry needs to wake up.

This article first appeared in GreenBiz’s weekly newsletter, VERGE Weekly, running Wednesdays. Subscribe here. Follow me on Twitter: @greentechlady.
