One of the machine learning researchers I follow closely is @petewarden, who writes extensively about the hardware side of ML, something we hear very little about beyond vague accounts of the power consumption and carbon footprints of cloud-based GPUs.
1/
Last month, Warden presented a talk predicting that there would be "tens or hundreds of billions of [embedded ML] devices over the next few years."
https://docs.google.com/presentation/d/1yV0rLchNUvlLeVdNCOUvStxCSBlwbzx6rvaiiOgnpYE/edit
2/
In a followup post, Warden explains why: the rise of TinyML (machine learning frameworks built for low-powered, embedded processors) and current hardware trends point to a near-future scenario in which a $0.50 CPU replicates today's high-powered, network-based speech recognition systems.
3/
All powered by a coin battery that lasts a year. That brings standalone voice interfaces to all kinds of appliances (he also predicts "sensor applications for logistics, agriculture, and health" built on low-powered, standalone embedded processors).
https://petewarden.com/2020/11/14/why-do-i-think-there-will-be-hundreds-of-billions-of-tinyml-devices-within-a-few-years/
4/
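To make that concrete, here's a hedged sketch of the usual TinyML workflow (my illustration, not Warden's code): train a tiny keyword-spotting-sized Keras model, then use TensorFlow Lite's post-training quantization to shrink it into an 8-bit model small enough for a microcontroller. The model shape, labels and random "representative" data are placeholders.

import numpy as np
import tensorflow as tf

# A toy keyword-spotting-sized network (a few thousand parameters).
model = tf.keras.Sequential([
    tf.keras.layers.Input(shape=(49, 40, 1)),        # e.g. 49 frames x 40 audio features
    tf.keras.layers.Conv2D(8, 3, activation="relu"),
    tf.keras.layers.GlobalAveragePooling2D(),
    tf.keras.layers.Dense(4, activation="softmax"),  # e.g. "yes"/"no"/silence/unknown
])
model.compile(optimizer="adam", loss="sparse_categorical_crossentropy")
# ... model.fit(...) on real audio features would go here ...

# Representative samples drive post-training full-integer quantization.
def representative_data():
    for _ in range(100):
        yield [np.random.rand(1, 49, 40, 1).astype(np.float32)]

converter = tf.lite.TFLiteConverter.from_keras_model(model)
converter.optimizations = [tf.lite.Optimize.DEFAULT]
converter.representative_dataset = representative_data
converter.target_spec.supported_ops = [tf.lite.OpsSet.TFLITE_BUILTINS_INT8]
converter.inference_input_type = tf.int8
converter.inference_output_type = tf.int8

tflite_model = converter.convert()
open("keyword_model.tflite", "wb").write(tflite_model)
print(f"Quantized model size: {len(tflite_model)} bytes")   # typically tens of KB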
The infrastructure for these systems is coming from multiple quarters. Take MCUNet, which "designs compact neural networks that deliver unprecedented speed and accuracy for deep learning on IoT devices, despite limited memory and CPU."
https://arxiv.org/pdf/2007.10319.pdf
5/
At the heart of MCUNet is TinyNAS, which tailors the size and shape of a neural network to the memory and compute limits of a small, low-powered processor, pruning the search space down to architectures the chip can actually run.
https://news.mit.edu/2020/iot-deep-learning-1113
6/
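The paper and the MIT writeup describe the approach at a high level; as a rough illustration of that constraint-first idea (not the authors' code), a search can enumerate candidate network shapes, estimate each one's memory needs, and discard anything that won't fit the target chip. The cost model and memory limits below are made-up placeholders.

import itertools

FLASH_LIMIT_BYTES = 1_000_000   # hypothetical 1 MB of flash for weights
SRAM_LIMIT_BYTES = 320_000      # hypothetical 320 KB of SRAM for activations

def estimate_candidate(width_multiplier, resolution, depth):
    """Very rough proxy costs for a small conv net (illustrative only)."""
    params = int(250_000 * width_multiplier ** 2 * depth / 12)         # weight count
    peak_activation = int(resolution * resolution * 16 * width_multiplier)
    return params, peak_activation   # 8-bit weights/activations: one byte each

search_space = list(itertools.product(
    [0.35, 0.5, 0.75, 1.0],   # width multipliers
    [96, 128, 160, 192],      # input resolutions
    [10, 12, 14, 16],         # layer depths
))

feasible = []
for width, res, depth in search_space:
    params, peak = estimate_candidate(width, res, depth)
    if params <= FLASH_LIMIT_BYTES and peak <= SRAM_LIMIT_BYTES:
        feasible.append((width, res, depth))

# A real search would then train and evaluate only the feasible candidates
# and keep the most accurate one; here we just show the pruned search space.
print(f"{len(feasible)} of {len(search_space)} candidate architectures fit the budget")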
Early results are promising. Previous ML image classifiers at this scale capped out at 54% accuracy; with MCUNet, an MIT team brought that up to 70.7%, in a field where 1% accuracy bumps are considered impressive.
7/
All of this is in service to a better version of machine learning, one where the classifier lives on your device and doesn't transmit observations to a vendor: surveillance-free decision-support.
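In practice, "lives on your device" looks something like this hedged sketch: load the quantized model with the TensorFlow Lite interpreter and classify locally, with no network I/O anywhere in the loop. The model file and label names are the hypothetical ones from the earlier sketch.

import numpy as np
import tensorflow as tf

# Load the quantized model produced earlier (hypothetical filename).
interpreter = tf.lite.Interpreter(model_path="keyword_model.tflite")
interpreter.allocate_tensors()
input_details = interpreter.get_input_details()[0]
output_details = interpreter.get_output_details()[0]

LABELS = ["yes", "no", "silence", "unknown"]   # placeholder label set

def classify(features: np.ndarray) -> str:
    """Run one inference entirely on-device; nothing leaves the machine."""
    # (A real pipeline would also apply the model's quantization scale/zero-point.)
    interpreter.set_tensor(input_details["index"],
                           features.astype(input_details["dtype"]))
    interpreter.invoke()
    scores = interpreter.get_tensor(output_details["index"])[0]
    return LABELS[int(np.argmax(scores))]

# Example call with dummy features shaped to match the model's input tensor.
dummy = np.zeros(input_details["shape"], dtype=input_details["dtype"])
print(classify(dummy))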
Cryteria (modified)
https://commons.wikimedia.org/wiki/File:HAL9000.svg
CC BY:
https://creativecommons.org/licenses/by/3.0/deed.en
eof/