Revolutionizing on-device training: Achieving artificial intelligence with less than a quarter of a megabyte of memory

A new technique enables on-device learning with less than a quarter of a megabyte of memory

Microcontrollers are tiny computers that run simple commands. They are the foundation of millions of connected devices, from internet-of-things (IoT) gadgets to automobile sensors. Because they are low-power, have limited memory, and lack an operating system, it is difficult to train machine-learning models on these "edge" devices, which operate independently of central computing resources.

Training a machine-learning model on an edge device lets the model adapt to new data and make better predictions. A smart keyboard, for example, could continuously learn from a user's writing. But the training process is so memory-intensive that it is typically done on powerful computers in a data center before the model is ever deployed. That approach is more expensive, and it raises privacy concerns because users' data must be sent to central servers.
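To make the general idea concrete, the sketch below shows one common pattern for low-memory, on-device fine-tuning: a hypothetical tiny classifier whose backbone stays frozen while only its final layer is updated, one new sample at a time, so the memory needed for gradients stays small. The model sizes and update rule here are illustrative assumptions, not the MIT team's technique.

```python
import numpy as np

# Hypothetical tiny classifier: a frozen feature extractor followed by one
# trainable output layer. Freezing most of the network keeps the memory
# needed for gradients and activations small enough for on-device updates.
rng = np.random.default_rng(0)

INPUTS, FEATURES, CLASSES = 32, 64, 10
frozen_weights = rng.normal(size=(INPUTS, FEATURES))  # never updated on device
head_weights = np.zeros((FEATURES, CLASSES))          # the only trainable part

def extract_features(x):
    # Stand-in for the frozen backbone (e.g., a small pre-trained network).
    return np.tanh(x @ frozen_weights)

def train_on_sample(x, label, lr=0.05):
    """One incremental update from a single new observation."""
    global head_weights
    feats = extract_features(x)
    logits = feats @ head_weights
    probs = np.exp(logits - logits.max())
    probs /= probs.sum()
    target = np.zeros(CLASSES)
    target[label] = 1.0
    # Cross-entropy gradient with respect to the head only: tiny footprint.
    grad = np.outer(feats, probs - target)
    head_weights -= lr * grad

# Simulated stream of new user data arriving on the device.
for _ in range(100):
    x = rng.normal(size=INPUTS)
    train_on_sample(x, label=rng.integers(CLASSES))
```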

Researchers at MIT and the MIT-IBM Watson AI Lab have developed a technique that addresses this problem, enabling on-device training with less than a quarter of a megabyte of memory. Other training solutions designed for connected devices can require more than 500 megabytes of memory, far exceeding the 256 kilobytes available on most microcontrollers. (There are 1,024 kilobytes in one megabyte.)
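For scale, a quick back-of-the-envelope comparison using the article's own figures shows the size of that gap:

```python
# Memory needed by typical training solutions (about 500 MB, per the article)
# versus a common microcontroller budget of 256 KB.
framework_kb = 500 * 1024          # 500 megabytes expressed in kilobytes
microcontroller_kb = 256           # typical on-chip memory budget
print(framework_kb / microcontroller_kb)  # -> 2000.0, a roughly 2,000x gap
```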

Source:
https://techxplore.com/news/2022-10-technique-enables-on-device-quarter-megabyte.html