Artificial intelligence (AI) continues to make breakthroughs every year. In fact, market research firm Tractica projects that the AI market will be worth $118.6 billion by 2025, a far cry from 2018's $9.5 billion, which suggests that AI adoption is indeed growing.
However, while AI continues to grow in adoption, there are still certain kinks in the technology that researchers have yet to work out. One of the major obstacles is AI's extreme energy usage. Evan Sparks has noted that training an AI algorithm to manipulate a Rubik's Cube consumed about 2.8 gigawatt-hours of electricity, roughly the output of three nuclear power plants running for an hour.
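For scale, here is a quick back-of-the-envelope check of that comparison in Python; the roughly 1 gigawatt of electrical output per plant is an assumed ballpark figure, not something stated above.

# Rough sanity check of the "three nuclear power plants for an hour" comparison.
training_energy_gwh = 2.8      # energy figure cited above
plant_output_gw = 1.0          # assumed output of a typical nuclear plant (ballpark)
hours_from_three_plants = training_energy_gwh / (3 * plant_output_gw)
print(f"about {hours_from_three_plants:.2f} hours of output from three plants")  # ~0.93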
Is an energy-efficient AI possible?
For years, scientists have looked into the possibility of energy-efficient AI. It's a difficult problem; after all, training and running AI models can require hundreds, if not thousands, of computers, and those consume huge amounts of energy. Fortunately, 2020 may be the year that pushes this research forward.
In a previous article, we reported on Purdue University's studies in quantum materials. By taking advantage of quantum mechanical effects, manufacturers could, theoretically, offset the energy needed to run this kind of technology. The work is still in its theory phase, though, so it might take a while to build a fully working prototype.
What's out of the theory box, however, is the new circuitry developed by researchers at the Institute of Industrial Science at the University of Tokyo. They recently found a way to decrease energy usage in supercomputers by creating “spiraling circuits”: specialized computer hardware made up of stacked memory modules arranged in a three-dimensional spiral.
“For these applications, each layer’s output is typically connected to the next layer’s input,” explains one of the study’s authors, Jixuan Wu. “Our architecture greatly reduces the need for interconnecting wiring.”
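To make that quote concrete, here is a minimal Python sketch (an illustration, not the authors' code) of the dataflow it describes: each layer's output becomes the next layer's input, so when every layer keeps its weights in nonvolatile memory placed right next to its processor, only the activations have to travel between layers. The layer sizes and weights below are made up for the example.

import numpy as np

rng = np.random.default_rng(0)

# Hypothetical per-layer weights, standing in for each layer's local memory chip.
layer_weights = [rng.standard_normal((8, 8)) for _ in range(4)]

def run_stack(x, weights):
    # Layer i's output is fed straight into layer i+1, as the quote describes.
    for w in weights:
        x = np.maximum(w @ x, 0.0)  # a simple ReLU layer, purely for illustration
    return x

print(run_stack(rng.standard_normal(8), layer_weights))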
When algorithms are trained, many operations need to run in tandem, which is why stacking multiple layers of circuits is essential to making the AI process as efficient as possible. However, as more layers are added to the stack, the PCB stackup impedance needs to be calculated to prevent noise coupling and EMI issues, among other problems. The neat thing about a spiraling circuit is that, even though it is stacked, each layer contains a nonvolatile memory chip placed close to the processors. Together, the layers create the impression of a “spiral” energy flow, ensuring that fewer electrical signals have to travel between boards.
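As a rough illustration of that stackup-impedance point (separate from the University of Tokyo work), the sketch below uses the widely cited IPC-2141 approximation for the characteristic impedance of a surface microstrip trace; the board dimensions and dielectric constant are assumed values chosen only for the example.

import math

def microstrip_impedance(h_mm, w_mm, t_mm, er):
    # IPC-2141 surface-microstrip approximation, in ohms:
    # Z0 = 87 / sqrt(er + 1.41) * ln(5.98 * h / (0.8 * w + t))
    # h: dielectric height, w: trace width, t: trace thickness, er: relative permittivity.
    # Good only for rough estimates; real stackups are checked with a field solver.
    return (87.0 / math.sqrt(er + 1.41)) * math.log(5.98 * h_mm / (0.8 * w_mm + t_mm))

# Assumed FR-4 stackup values, purely illustrative:
print(round(microstrip_impedance(h_mm=0.2, w_mm=0.35, t_mm=0.035, er=4.3), 1))  # ~48.6 ohms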
Spiraling boards and their applications
If mass-produced, spiraling circuits could be what finally makes AI more accessible. For example, if devices consume less energy, then modern AI-powered technology like Amazon Alexa and smartwatches could become more affordable. The same goes for more powerful AI machines like self-driving cars, virtual reality headsets, and robots.
Future smartphones could have longer battery lives too, since most of them are already equipped with the components needed to run AI-powered apps like Waze and Uber.
All in all, more energy-efficient AI is good news not only for manufacturers, but for consumers as well. With the University of Tokyo's working proof that it is achievable, we can hope to see more of its applications in the near future.