Guest: Pete Warden
Title: Machine Learning Everywhere
Abstract: When I first joined Google in 2014, I was amazed to discover they were using 13 kilobyte neural network models to recognize "OK Google" on tiny embedded chips on Android phones. This felt like deep magic, and it made me wonder how many other problems these kinds of minuscule ML models could solve. Over the past few years I've been helping Google ship products using this approach with TensorFlow Lite Micro, and helping external developers create new applications. While it's still early days for "TinyML", we're already seeing interesting impacts on how engineers compose systems, including software-defined sensors, cascades of ML models, air-gapped ambient computing, and ubiquitous on-device voice interfaces. In this talk I'll cover the past, present, and future of embedded ML systems.
Bio: Pete Warden is the technical lead of TensorFlow Lite Micro, Google's open source embedded machine learning framework. He was previously CTO and founder of Jetpac, acquired in 2014, and is the author of the TinyML O'Reilly book. He blogs at petewarden.com, and is @petewarden on Twitter.