MIT researchers enable local compute on edge devices using MCUs

Two research groups at MIT have developed a way to reduce the memory a neural network needs while running on a microcontroller unit (MCU). This could allow AI tasks such as person detection to run on a far greater number of inexpensive edge devices rather than being handed off to a remote server or public cloud, opening up a whole new range of edge use cases.

The amount of processing power and memory available on MCUs is tiny — typically 256KB of RAM and 1MB of flash storage, against the gigabytes of RAM and hundreds of gigabytes of storage in a modern smartphone. For microcontroller units, then, memory is a precious resource. Two MIT research groups, at the MIT-IBM Watson AI Lab and the computer science department, had the idea of profiling how neural networks use MCU memory, and they found an imbalance in usage across the network that created a bottleneck.
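The imbalance can be illustrated with a back-of-the-envelope calculation. The sketch below is not the researchers' profiler; it uses hypothetical layer shapes for a small MobileNet-style CNN to show how early, high-resolution layers dominate activation memory on an int8-quantized network.

```python
def activation_bytes(h, w, c, bytes_per_value=1):
    """Memory for one activation map, assuming int8 quantization (1 byte/value)."""
    return h * w * c * bytes_per_value

# (height, width, channels) of each layer's output feature map --
# illustrative values, not the actual MCUNetV2 architecture
layers = [
    (112, 112, 16),   # early layer: high resolution, few channels
    (56, 56, 24),
    (28, 28, 40),
    (14, 14, 96),
    (7, 7, 320),      # late layer: low resolution, many channels
]

for i, (h, w, c) in enumerate(layers):
    kb = activation_bytes(h, w, c) / 1024
    print(f"layer {i}: {h}x{w}x{c} -> {kb:.1f} KB")

# The first layer alone (~196 KB) nearly exhausts a 256 KB MCU's RAM,
# while the later layers each need under 75 KB.
```

On a device with only 256KB of RAM, the first layer's activations alone leave almost no headroom, while the rest of the network is comfortably small: that skew is the bottleneck the profiling revealed.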

[Image] An example of an MCU implementation. Source: MIT

With a new neural network architecture design, the teams were able to reduce peak memory usage by four to eight times. They then applied this in their person detection system, MCUNetV2, and found it outperformed other MCU-based implementations. This, they argue, will open the door to video recognition applications that were not possible on such hardware before.
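The article does not detail the design, but the published MCUNetV2 work is based on patch-based inference: rather than materializing an entire high-resolution feature map, the early layers process small spatial patches one at a time, so only a single patch's activations are live in RAM at once. A minimal sketch of that idea, using the same hypothetical 112x112x16 early layer as an example:

```python
def full_peak_memory(h, w, c):
    # conventional layer-by-layer inference: the whole output
    # feature map must be resident in RAM at once (int8, 1 byte/value)
    return h * w * c

def patched_peak_memory(h, w, c, patches_per_side, overlap=0):
    # patch-based inference: only one patch's activations are live;
    # `overlap` models the extra border rows/cols the receptive
    # field forces each patch to recompute
    ph = h // patches_per_side + overlap
    pw = w // patches_per_side + overlap
    return ph * pw * c

full = full_peak_memory(112, 112, 16)           # 200704 bytes (~196 KB)
patched = patched_peak_memory(112, 112, 16, 4)  # run as a 4x4 grid of patches
print(f"full: {full / 1024:.0f} KB, patched: {patched / 1024:.0f} KB")

# In practice, receptive-field overlap between patches erodes the ideal
# 16x saving, which is consistent with the reported 4-8x reduction.
```

The patch grid size and overlap here are illustrative parameters, not values from the paper; the point is that peak memory scales with one patch rather than the whole feature map.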

This new TinyML design is faster and cheaper (IoT devices can cost as little as $1 or $2) than running deep learning on remote servers fed with sensor data. Keeping the data on the device rather than handing it off to a public cloud also offers additional security.

“We really push forward for these larger-scale, real-world applications,” says Song Han, assistant professor at MIT. “Without GPUs or any specialized hardware, our technique is so tiny it can run on these small cheap IoT devices and perform real-world applications like these visual wake words, face mask detection, and person detection. This opens the door for a brand-new way of doing tiny AI and mobile vision.”

As the researchers say, it’s an important step down the path of bringing AI to the mass market.
