
Machine learning (ML) on low-power edge devices is fundamentally changing the wearables market, and it is developing at a phenomenal pace. So far, ML has mostly focused on image and text processing, but we see huge potential in applying it to biometric data. Edge ML (sometimes referred to as TinyML) allows small, low-power devices to make ‘intelligent’ inferences from biometric sensors in a way that was not possible until very recently. This is a tipping point for health and wellbeing wearables.
We recently used ARM’s uTensor to make inferences from live sensor data on microcontrollers, focusing on Cortex-M4s. While the accuracy was promising, we found power consumption to be a key concern.
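For a flavour of what that looks like in practice, here is a minimal sketch modelled on uTensor’s published MNIST example at the time. The generated header, the context function name (get_deep_mlp_ctx) and the output tensor name ("y_pred:0") all depend on the model you compile with utensor-cli, so treat them as placeholders rather than a fixed API.

```cpp
#include <cstdint>

#include "models/deep_mlp.hpp"  // graph code generated by utensor-cli (placeholder name)
#include "tensor.hpp"

// Classify one window of sensor samples. Shapes and tensor names are
// placeholders: utensor-cli generates the equivalents for your own graph.
int classify(float* samples, uint32_t n_samples) {
  Context ctx;
  // Wrap the raw sample buffer as the model's input tensor (no copy).
  Tensor* input = new WrappedRamTensor<float>({1, n_samples}, samples);
  get_deep_mlp_ctx(ctx, input);  // wire up the generated graph
  ctx.eval();                    // run inference on the Cortex-M4
  S_TENSOR pred = ctx.get("y_pred:0");
  return *(pred->read<int>(0, 0));  // predicted class index
}
```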
When processing real-time data, the best accuracy is achieved by processing as much of the data as possible, as often as possible, with the largest model that fits on the device. That approach drives up CPU utilisation, which is a significant concern for battery-powered wearable devices. We’re excited about developments in edge ML libraries because more efficient models, optimised inference and intelligent trade-offs can all reduce CPU utilisation, and therefore power consumption.
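One common trade-off is duty-cycling: buffer incoming samples, run the model once per window, and keep the CPU asleep the rest of the time. Below is a minimal sketch of the idea; read_sensor, run_inference and sleep_until_next_sample are hypothetical platform hooks, and the window size is an arbitrary assumption.

```cpp
#include <cstdint>

// Hypothetical platform hooks: not part of any real API.
extern float read_sensor();                           // one sample from the biometric sensor
extern void  run_inference(const float* buf, int n);  // the expensive model invocation
extern void  sleep_until_next_sample();               // low-power wait (e.g. WFI on Cortex-M)

constexpr int kWindowSize = 128;  // samples per inference window (assumption)

void sampling_loop() {
  static float window[kWindowSize];
  static int count = 0;

  window[count++] = read_sensor();
  if (count == kWindowSize) {
    // The only high-CPU step: one inference per window, not per sample.
    run_inference(window, kWindowSize);
    count = 0;
  }
  // Between samples the core can sleep, cutting average power draw.
  sleep_until_next_sample();
}
```

Shrinking the model or widening the window moves the same dial towards lower power at some cost in accuracy or latency.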
Last week, two behemoths of machine learning announced that they will consolidate their efforts to bring neural networks to edge devices in a single platform. uTensor (also known as microTensor), ARM’s early entrant into edge ML, takes TensorFlow models and compiles them into highly efficient code for edge processing. Google’s TensorFlow Lite for Microcontrollers is a port of the popular TensorFlow Lite that optimises TensorFlow models for constrained devices and ships its own model interpreter. This means more of the progress that has already been made in machine learning will be available to those of us processing data on the edge.
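To make the contrast concrete, here is roughly what running a converted model looks like with the TensorFlow Lite for Microcontrollers interpreter. This sketch follows the project’s public hello-world example at the time of writing; the header paths, the all-ops resolver, the arena size and the g_model_data symbol are assumptions that vary by version and model.

```cpp
#include <cstdint>

#include "tensorflow/lite/micro/kernels/all_ops_resolver.h"
#include "tensorflow/lite/micro/micro_error_reporter.h"
#include "tensorflow/lite/micro/micro_interpreter.h"
#include "tensorflow/lite/schema/schema_generated.h"

// The .tflite flatbuffer, compiled into flash (placeholder symbol).
extern const unsigned char g_model_data[];

namespace {
tflite::MicroErrorReporter error_reporter;
tflite::ops::micro::AllOpsResolver resolver;  // registers every op; pick ops individually to save flash

constexpr int kArenaSize = 10 * 1024;  // scratch RAM for activations; model-dependent assumption
uint8_t tensor_arena[kArenaSize];
tflite::MicroInterpreter* interpreter = nullptr;
}  // namespace

void setup() {
  const tflite::Model* model = tflite::GetModel(g_model_data);
  static tflite::MicroInterpreter static_interpreter(
      model, resolver, tensor_arena, kArenaSize, &error_reporter);
  interpreter = &static_interpreter;
  interpreter->AllocateTensors();  // carve input/output/scratch out of the arena
}

float infer(float sample) {
  interpreter->input(0)->data.f[0] = sample;  // feed one sensor reading
  interpreter->Invoke();                      // run the whole graph on-device
  return interpreter->output(0)->data.f[0];
}
```

Note how the model stays a flatbuffer interpreted at runtime, whereas uTensor compiles the graph itself into C++ ahead of time: two different routes to the same constrained targets.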
Exactly how the two projects will merge remains to be seen, but we’re already excited by the direction in which they are heading. The official announcement suggests that tighter integration between the target device and model training could be key.
There’s more to be positive about: developers won’t have to choose between two competing platforms. Soon there will be one clear choice, making it easier to share code and models than ever before.
The ability to share a platform and ‘talk the same language’ becomes increasingly important as development of edge machine learning accelerates. More importantly, this is a significant commitment from both ARM and Google to work together on tools that the rest of the community benefits from too.
Our focus is on making human-centred, intelligent wearable devices, and machine learning is becoming a big part of that. If you’re also interested in applying machine learning to wearable devices, get in touch.