
System brings deep learning to IoT devices

Credit: Pixabay / CC0 Public Domain

Deep learning is everywhere. This branch of artificial intelligence curates your social media feeds and serves up your Google search results. Soon, deep learning could also check your vital signs or set your thermostat. MIT researchers have developed a system that could bring deep learning neural networks to new, and much smaller, places, like the tiny computer chips in wearable medical devices, household appliances, and the 250 billion other objects that constitute the "internet of things" (IoT).

The system, called MCUNet, designs compact neural networks that deliver unprecedented speed and accuracy for deep learning on IoT devices, despite limited memory and processing power. The technology could facilitate the expansion of the IoT universe while saving energy and improving data security.

The internet of things

The internet of things was born in the early 1980s, when graduate students at Carnegie Mellon University, including Mike Kazar '78, connected a Coca-Cola machine to the internet. The group's motivation was simple: laziness. They wanted to use their computers to confirm the machine was stocked before trekking from their offices to make a purchase. It was the world's first internet-connected appliance. "It was pretty much treated as a joke," says Kazar, who is now a Microsoft engineer. "No one expected billions of devices on the internet."

Since that Coke machine, everyday objects have become increasingly networked into the growing internet of things. That includes everything from wearable heart monitors to smart refrigerators that tell you when you're low on milk. IoT devices often run on microcontrollers: simple computer chips with no operating system, minimal processing power, and less than one thousandth of the memory of a typical smartphone. So pattern-recognition tasks like deep learning are difficult to run locally on IoT devices. For complex analysis, IoT-collected data is often sent to the cloud, making it vulnerable to hacking.

"How do we deploy neural networks directly onto these small devices? It's a new research area that's getting very hot," says Song Han, an assistant professor in MIT's Department of Electrical Engineering and Computer Science. "Companies like Google and ARM are working in this direction." So is Han's group.

With MCUNet, Han's group codesigned two components needed for "tiny deep learning," the operation of neural networks on microcontrollers. One component is TinyEngine, an inference engine that directs resource management, akin to an operating system. TinyEngine is optimized to run a particular neural network structure, which is selected by MCUNet's other component: TinyNAS, a neural architecture search algorithm.
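
At a high level, the division of labor can be pictured with the short Python sketch below. It is only an illustration under assumed names (DeviceBudget, Architecture, search_architecture, and generate_engine_code are not MIT's actual API): a TinyNAS-style step selects a network that fits a given microcontroller's memory, and a TinyEngine-style step emits code only for the operations that network uses.

# Hypothetical sketch of the MCUNet workflow; all names are illustrative,
# not MIT's released API.
from dataclasses import dataclass
from typing import List

@dataclass
class DeviceBudget:
    sram_kb: int    # on-chip RAM available for activations
    flash_kb: int   # flash available for weights and code

@dataclass
class Architecture:
    layers: List[str]       # e.g. ["conv3x3_16", "dwconv3x3_16"]
    peak_sram_kb: float     # estimated peak activation memory
    weights_kb: float       # estimated model size
    accuracy: float = 0.0   # validation accuracy found during the search

def search_architecture(budget: DeviceBudget,
                        candidates: List[Architecture]) -> Architecture:
    # TinyNAS-style step: discard candidates that do not fit the budget,
    # then keep the most accurate of those that remain.
    feasible = [c for c in candidates
                if c.peak_sram_kb <= budget.sram_kb
                and c.weights_kb <= budget.flash_kb]
    if not feasible:
        raise ValueError("no candidate architecture fits this microcontroller")
    return max(feasible, key=lambda c: c.accuracy)

def generate_engine_code(arch: Architecture) -> str:
    # TinyEngine-style step: emit specialized code only for the operations
    # this network uses, so no dead-weight kernels end up in the binary.
    unique_ops = sorted(set(arch.layers))
    return "\n".join(f"// specialized kernel for {op}" for op in unique_ops)

A caller would describe a board (say, 320 kilobytes of SRAM and one megabyte of flash), pass in a pool of candidate networks, and flash the device with the generated code plus the chosen model's weights.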

System-algorithm codesign

Designing a deep network for microcontrollers isn't easy. Existing neural architecture search techniques start with a big pool of possible network structures based on a predefined template, then gradually find the ones with high accuracy and low cost. While the method works, it's not the most efficient. "It can work well for GPUs or smartphones," says Ji Lin, a PhD student in Han's lab. "But it has been difficult to apply these techniques directly to tiny microcontrollers, because they are too small."

So Lin developed TinyNAS, a neural architecture search method that creates custom-sized networks. "We have a lot of microcontrollers that come with different power capacities and different memory sizes," says Lin. "So we developed the algorithm [TinyNAS] to optimize the search space for different microcontrollers." The customized nature of TinyNAS means it can generate compact neural networks with the best possible performance for a given microcontroller, with no unnecessary parameters. "Then we deliver the final, efficient model to the microcontroller," says Lin.
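
One way to picture that two-stage idea is the simplified sketch below. It is an assumption-laden illustration rather than TinyNAS itself: the memory and compute estimates are crude stand-ins, and the heuristic of preferring search spaces whose in-budget models do the most compute is used here only to show how a space can be matched to a device before any per-network search begins.

# Simplified, hypothetical sketch of device-aware search-space selection.
# Stage 1 picks an (input resolution, width multiplier) search space suited
# to the microcontroller's memory; stage 2 (not shown) would then search
# for the best network inside that space.
import random

def estimate_peak_sram_kb(resolution: int, width_mult: float) -> float:
    # Crude stand-in: activation memory grows with resolution^2 and width.
    return resolution * resolution * width_mult * 3 * 4 / 1024

def estimate_flops_m(resolution: int, width_mult: float) -> float:
    # Crude stand-in: compute grows with resolution^2 and width^2.
    return resolution * resolution * width_mult ** 2 / 1000

def score_search_space(resolution: int, width_mult: float,
                       sram_kb: float, samples: int = 100) -> float:
    # Average compute of randomly perturbed models that still fit in SRAM.
    fitting = []
    for _ in range(samples):
        jitter = random.uniform(0.8, 1.2)   # stand-in for per-model variation
        if estimate_peak_sram_kb(resolution, width_mult) * jitter <= sram_kb:
            fitting.append(estimate_flops_m(resolution, width_mult) * jitter)
    return sum(fitting) / samples if fitting else 0.0

def pick_search_space(sram_kb: float) -> tuple:
    # Choose the search space whose feasible models promise the most compute.
    spaces = [(r, w) for r in (96, 128, 160, 192, 224)
                     for w in (0.35, 0.5, 0.75, 1.0)]
    return max(spaces, key=lambda s: score_search_space(s[0], s[1], sram_kb))

# A board with less SRAM ends up with a smaller space than a roomier one.
print(pick_search_space(320.0), pick_search_space(512.0))

The point of the sketch is only the ordering of the two stages: constrain the search space to the device first, so the later per-network search never wastes effort on models that could not run there anyway.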

Credit: Massachusetts Institute of Technology

To run that tiny neural network, a microcontroller also needs a lean inference engine. A typical inference engine carries some dead weight: instructions for tasks it may rarely run. The extra code poses no problem for a laptop or smartphone, but it could easily overwhelm a microcontroller. "It doesn't have off-chip memory, and it doesn't have a disk," says Han. "Everything put together is just one megabyte of flash, so we have to really carefully manage such a small resource." Cue TinyEngine.

The researchers developed their inference engine in conjunction with TinyNAS. TinyEngine generates the essential code necessary to run TinyNAS' customized neural network. Any deadweight code is discarded, which cuts down on compile time. "We keep only what we need," says Han. "And since we designed the neural network, we know exactly what we need. That's the advantage of system-algorithm codesign." In the group's tests of TinyEngine, the size of the compiled binary code was between 1.9 and five times smaller than comparable microcontroller inference engines from Google and ARM. TinyEngine also contains innovations that reduce runtime, including in-place depth-wise convolution, which cuts peak memory usage nearly in half (see the sketch below). After codesigning TinyNAS and TinyEngine, Han's team put MCUNet to the test.
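
The in-place depth-wise convolution idea lends itself to a short illustration. Because a depth-wise convolution processes each channel independently, a channel's output can be written back over that channel's input as soon as it is computed, so the working set is roughly one activation buffer plus a small scratch plane rather than two full buffers. The NumPy sketch below shows the principle only; it is not TinyEngine's C implementation, and the function name is made up.

# Each channel's result overwrites its own slice of the input buffer, so the
# peak extra memory is two small per-channel planes (a padded copy and an
# accumulator) instead of a whole second (C, H, W) activation tensor.
import numpy as np

def depthwise_conv3x3_inplace(x: np.ndarray, kernels: np.ndarray) -> np.ndarray:
    # x: (C, H, W) activation buffer; kernels: (C, 3, 3) per-channel filters.
    # Each channel is replaced by its own 3x3 convolution with 'same' padding.
    C, H, W = x.shape
    padded = np.zeros((H + 2, W + 2), dtype=x.dtype)   # reusable scratch plane
    for c in range(C):
        padded[1:-1, 1:-1] = x[c]                       # copy one channel only
        acc = np.zeros((H, W), dtype=x.dtype)
        for i in range(3):
            for j in range(3):
                acc += kernels[c, i, j] * padded[i:i + H, j:j + W]
        x[c] = acc                                      # overwrite the input channel
    return x

activations = np.random.rand(16, 32, 32).astype(np.float32)
filters = np.random.rand(16, 3, 3).astype(np.float32)
depthwise_conv3x3_inplace(activations, filters)   # no second full-size buffer

On a microcontroller it is usually the peak activation memory, rather than the weights, that overflows the few hundred kilobytes of SRAM, which is why cutting it roughly in half matters so much.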

MCUNet's first challenge was image classification. The researchers used the ImageNet database to train the system with labeled images, then to test its ability to classify novel ones. On a commercial microcontroller they tested, MCUNet successfully classified 70.7 percent of the novel images; the previous state-of-the-art neural network and inference engine combination was just 54 percent accurate. "Even a 1 percent improvement is significant," says Lin. "So this is a giant leap for microcontroller settings."

The team found similar results in ImageNet tests of three other microcontrollers. On both speed and accuracy, MCUNet beat the competition for audio and visual "wake-word" tasks, where a user initiates an interaction with a computer using vocal cues (think: "Hey, Siri") or simply by entering a room. The experiments highlight MCUNet's adaptability to numerous applications.

‘Enormous potential’

The promising test results give Han hope that MCUNet will become the new industry standard for microcontrollers. "It has enormous potential," he says.

Kurt Keutzer, a computer scientist at the University of California, Berkeley, who was not involved in the work, says the advance "extends the frontier of deep neural network design even farther into the computational domain of small, energy-efficient microcontrollers." He adds that MCUNet could "bring intelligent computer vision capabilities to even the simplest kitchen appliances, or enable more intelligent motion sensors."

MCUNet could also make IoT devices more secure. "A key advantage is preserving privacy," says Han. "You don't need to transmit the data to the cloud."

Analyzing data locally reduces the risk of personal information being stolen, including personal health data. Han envisions smartwatches with MCUNet that don't just sense users' heartbeat, blood pressure, and oxygen levels, but also analyze that information and help them understand it. MCUNet could also bring deep learning to IoT devices in vehicles and rural areas with limited internet access.

Plus, MCUNet's slim computing footprint translates into a slim carbon footprint. "Our big dream is green AI," says Han, adding that training a large neural network can burn carbon equivalent to the lifetime emissions of about five cars. MCUNet on a microcontroller would require a small fraction of that energy. "Our end goal is to enable efficient, tiny AI with less computational resources, less human resources, and less data," says Han.




More information:

Ji Lin et al. MCUNet: Tiny Deep Learning on IoT Devices. arXiv:2007.10319 [cs.CV] arxiv.org/abs/2007.10319

This story is republished courtesy of MIT News (web.mit.edu/newsoffice/), a popular site that covers news about MIT research, innovation, and teaching.

Citation:
System brings deep learning to IoT devices (2020, November 13)
retrieved 13 November 2020
from https://techxplore.com/news/2020-11-deep-internet-devices.html

This document is subject to copyright. Apart from any fair dealing for the purpose of private study or research, no part may be reproduced without the written permission. The content is provided for information purposes only.


