Researchers at the Massachusetts Institute of Technology (MIT) have built a low-power artificial intelligence (AI) chip that could enable future smartphones and other mobile devices to implement neural networks. The chip is roughly ten times as energy efficient as a typical smartphone GPU (graphics processing unit), which could allow mobile devices to run powerful AI algorithms locally rather than uploading data to the Internet for processing. A GPU is a specialised circuit designed to accelerate the creation of images in a frame buffer intended for output to a display.
Vivienne Sze, an assistant professor in MIT's department of electrical engineering and computer science, said, "Deep learning is useful for many applications, such as object recognition, speech and face detection." The new chip, which the researchers have named Eyeriss, could also help bring about a world in which vehicles, appliances, civil-engineering structures, manufacturing equipment and even livestock carry sensors that report information directly to networked servers, aiding maintenance and task coordination. With powerful AI algorithms on board, networked devices could make important decisions locally, entrusting only their conclusions, rather than raw personal data, to the Internet.
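The privacy benefit described above can be sketched in a few lines of Python. This is a hypothetical illustration, not MIT's actual software: the trivial brightness-threshold "classifier" stands in for an on-device neural network (such as one running on an accelerator like Eyeriss), and the report format is invented. The point is simply that only the conclusion, never the raw sensor data, is packaged for upload.

```python
# Sketch of on-device inference: classify locally, upload only the label.
# The classifier and payload format below are hypothetical stand-ins.
import json

def classify_locally(image_pixels):
    # Stand-in for an on-device neural network: a trivial
    # brightness threshold over 8-bit pixel values.
    brightness = sum(image_pixels) / len(image_pixels)
    return "day" if brightness > 127 else "night"

def build_report(image_pixels):
    # Only the conclusion is serialised for the server;
    # the raw pixels never leave the device.
    label = classify_locally(image_pixels)
    return json.dumps({"conclusion": label})

print(build_report([200, 210, 190, 220]))  # {"conclusion": "day"}
```

In the cloud-based alternative, `image_pixels` itself would be transmitted for remote processing; here the network traffic is reduced to a short label, which is both cheaper and more private.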
The MIT researchers presented their findings at the International Solid-State Circuits Conference in San Francisco, where they used Eyeriss to implement a neural network that performs an image-recognition task. It was the first time a state-of-the-art neural network had been demonstrated on a custom chip.
Mike Polley, a senior vice president at Samsung's mobile processor innovations lab, said, "This work is very important, showing how embedded processors for deep learning can provide power and performance optimizations that will bring these complex computations from the cloud to mobile devices."