Title: Resource-efficient ML in 2 KB RAM for the Internet of Things
Speaker: Prateek Jain (MSR & IIT Kanpur)
Details: Thu, 21 Dec 2017, 11:00 AM @ CSB 36
Abstract: We propose an alternative paradigm for the Internet of Things (IoT) in which machine learning algorithms run locally on severely resource-constrained edge and endpoint devices, without necessarily needing cloud connectivity. This enables many scenarios beyond the reach of the traditional paradigm, including low-latency brain implants, precision agriculture on disconnected farms, and privacy-preserving smart spectacles. Towards this end, we develop novel tree- and kNN-based algorithms, called Bonsai and ProtoNN, for efficient prediction on IoT devices -- such as those based on the Arduino Uno board, which has an 8-bit ATmega328P microcontroller operating at 16 MHz with no native floating-point support, 2 KB of RAM, and 32 KB of read-only flash memory. Bonsai and ProtoNN maintain prediction accuracy while minimizing model size and prediction cost by: (a) developing novel compressed yet expressive models; (b) sparsely projecting all data into a low-dimensional space in which the models are learnt; and (c) jointly learning all model and projection parameters. Experimental results on multiple benchmark datasets demonstrate that Bonsai and ProtoNN can make predictions in milliseconds even on slow microcontrollers, fit in a few KB of memory, and consume less battery than all other algorithms evaluated, while achieving prediction accuracies that can be as much as 30% higher than state-of-the-art methods for resource-efficient machine learning. Bonsai and ProtoNN are also shown to generalize to resource-constrained settings beyond IoT, producing significantly better search results than Bing's L3 ranker when the model size is restricted to 300 bytes.
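To make the recipe in the abstract concrete, here is a minimal NumPy sketch of ProtoNN-style prediction under the stated ingredients: a sparse low-dimensional projection, a small set of learned prototypes with per-label score vectors, and an RBF similarity. The parameter names, shapes, and kernel form below are illustrative assumptions for exposition, not the speaker's released implementation.

```python
import numpy as np

def protonn_predict(x, W, B, Z, gamma=1.0):
    """Predict a label for one input x (illustrative sketch).

    x : (d,)        raw feature vector
    W : (d_hat, d)  sparse projection into the low-dimensional space
    B : (m, d_hat)  m prototypes living in the projected space
    Z : (m, L)      score contribution of each prototype to each of L labels
    """
    z = W @ x                                             # project into d_hat dimensions
    sim = np.exp(-gamma**2 * np.sum((B - z)**2, axis=1))  # RBF similarity to each prototype
    scores = sim @ Z                                      # aggregate per-label scores
    return int(np.argmax(scores))

# Tiny usage example with random (hypothetical) parameters; small d_hat and m
# keep the total parameter count in the KB range targeted by the talk.
rng = np.random.default_rng(0)
d, d_hat, m, L = 64, 10, 20, 2
W = rng.standard_normal((d_hat, d)) * (rng.random((d_hat, d)) < 0.1)  # ~90% sparse projection
B = rng.standard_normal((m, d_hat))
Z = rng.standard_normal((m, L))
print(protonn_predict(rng.standard_normal(d), W, B, Z))
```

In an actual deployment on a device like the Arduino Uno, the same arithmetic would be written in fixed point C rather than NumPy, since the ATmega328P has no native floating-point support; the sketch is only meant to show how the sparse projection and compact prototype model combine at prediction time.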