Project update 2 of 3
In this tutorial, our partners at QuickLogic and SensiML build a Magic Wand application for detecting spells that can run entirely on the SparkFun Thing Plus - QuickLogic EOS S3 using the SensiML Analytics Toolkit.
Enjoy!
The SparkFun Team
In this tutorial, we are going to build a magic wand which can recognize different spell incantations in real time using the SparkFun Thing Plus - QuickLogic EOS S3 microcontroller, the SensiML Analytics Toolkit, and TensorFlow Lite for Microcontrollers.
This project re-creates the Magic Wand demo developed to showcase TensorFlow Lite for Microcontrollers. We will teach our device to recognize three gestures.
This tutorial focuses on gesture recognition, but these technologies apply to a variety of tinyML applications such as predictive maintenance, activity recognition, sound classification, and keyword spotting.
For every machine learning project, the quality of the final product depends on the quality of your curated data set. Unlike image and audio data, time series sensor data is often unique to the application, since the combination of sensor placement, sensor type, and event type greatly affects the data created. Because of this, a relevant dataset is unlikely to already exist, meaning you will need to collect and annotate your own.
We are going to use the SensiML Data Capture Lab to collect and annotate data for the different gestures. We have created a template for this project which will get you started. The project has been pre-populated with the labels and metadata information, along with some pre-recorded examples. To add this project to your account:
The project will be synced to your SensiML account. In the next few sections, we are going to walk through how we used the Data Capture Lab to collect and label this dataset.
To enable wireless data capture, we are going to attach the SparkFun Thing Plus - ESP32 WROOM, loaded with this data streaming firmware, to stream the IMU sensor data over Wi-Fi to the Data Capture Lab. If you don't have an ESP32 available, you can capture data over a USB serial connection instead. We have configured the sensor to capture IMU accelerometer data at a sample rate of 100 Hz.
To set up the hardware for data collection:
To capture data in the Data Capture Lab over Wi-Fi:
You should now see the accelerometer data streaming into the Data Capture Lab. See the documentation for detailed Wi-Fi streaming instructions.
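If you want a feel for what happens under the hood, the sketch below shows one way a host could consume a raw 100 Hz accelerometer stream over TCP. The IP address, port, and packed little-endian int16 frame format here are illustrative assumptions, not the actual protocol used by the streaming firmware; the Data Capture Lab handles all of this for you.

```python
# Hypothetical sketch: read a stream of 16-bit accelerometer samples over TCP.
# The host, port, and raw little-endian int16 XYZ frame format are assumptions;
# consult the data streaming firmware's documentation for the actual protocol.
import socket
import struct

HOST = "192.168.1.42"   # assumed IP of the ESP32 on your network
PORT = 5555             # assumed port used by the streaming firmware
SAMPLE_RATE_HZ = 100    # matches the sensor configuration in this tutorial

with socket.create_connection((HOST, PORT)) as sock:
    buffer = b""
    while True:
        buffer += sock.recv(4096)
        # Each assumed sample frame: three little-endian int16 values (X, Y, Z) = 6 bytes
        while len(buffer) >= 6:
            x, y, z = struct.unpack_from("<hhh", buffer)
            buffer = buffer[6:]
            print(f"accel x={x} y={y} z={z}")
```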
Before you record data, select the appropriate label and metadata for this capture. Maintaining good records of sensor metadata is critical to building a good machine learning model.
To start recording sensor data, hit Begin Recording. When you hit Stop Recording, the sensor data will be saved locally to your machine and synced with the SensiML Cloud project.
Along with sensor data, it is also useful to capture video data to help us during the data annotation process. We record the video with our webcam and will sync it to the sensor data when we start labeling the dataset.
Now that we have captured enough sensor data, the next step is to add labels to the dataset to build out our ground truth. To add a label to your data:
You have now labeled a region of the capture file which we can use when building our model. The video below walks through how to label the events of a captured file in the SensiML Data Capture Lab.
We are going to use Google Colab to train our machine learning model using the data we collected from the SparkFun Thing Plus - QuickLogic EOS S3 in the previous section. Colab provides a Jupyter notebook that allows us to run our TensorFlow training in a web browser. Open the notebook here and follow along to train your own magic spell detection model.
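The notebook defines the actual model and training steps; as a rough sketch of the general shape of the task, the snippet below trains a small convolutional classifier on windowed accelerometer data and converts it to TensorFlow Lite. The window length (2 s at 100 Hz), layer sizes, and randomly generated placeholder data are assumptions for illustration only.

```python
# Rough sketch of training a three-gesture classifier on windowed accelerometer
# data; the Colab notebook linked above defines the actual architecture. The
# window length and layer sizes here are illustrative assumptions.
import numpy as np
import tensorflow as tf

WINDOW = 200      # 2 seconds at 100 Hz (assumed window length)
CHANNELS = 3      # accelerometer X, Y, Z
NUM_CLASSES = 3   # three spell gestures

# Placeholder data; in practice, load the labeled windows exported from the project.
x_train = np.random.randn(600, WINDOW, CHANNELS).astype("float32")
y_train = np.random.randint(0, NUM_CLASSES, size=(600,))

model = tf.keras.Sequential([
    tf.keras.layers.Conv1D(16, 8, activation="relu", input_shape=(WINDOW, CHANNELS)),
    tf.keras.layers.MaxPooling1D(4),
    tf.keras.layers.Conv1D(32, 4, activation="relu"),
    tf.keras.layers.GlobalAveragePooling1D(),
    tf.keras.layers.Dense(NUM_CLASSES, activation="softmax"),
])
model.compile(optimizer="adam",
              loss="sparse_categorical_crossentropy",
              metrics=["accuracy"])
model.fit(x_train, y_train, epochs=10, batch_size=32)

# Convert to TensorFlow Lite for deployment on the microcontroller.
tflite_model = tf.lite.TFLiteConverter.from_keras_model(model).convert()
open("magic_wand.tflite", "wb").write(tflite_model)
```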
For the next part of the tutorial, you will need to log into the Analytic Studio, where we will validate and download the model that can be flashed to the device.
After you log into the Analytic Studio, click on your project to set it as the active project. Then, go to the Test Model tab. You can test against any of the captured data files as follows:
The model is compiled in the SensiML Cloud, and the capture files are passed through it to emulate the edge device's classifications against the selected sensor data.
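To build intuition for what that emulation does, here is a hedged local analogue: slide a fixed-length window across a recorded capture and classify each window with the trained TensorFlow Lite model. The CSV column names, window length, and stride below are assumptions.

```python
# Illustrative local analogue of the Test Model emulation: slide a window over
# a recorded capture and classify each window with the trained TFLite model.
# The CSV column names, window length, and stride are assumptions.
import numpy as np
import pandas as pd
import tensorflow as tf

WINDOW, STRIDE = 200, 100   # samples per window, hop size (assumed)

capture = pd.read_csv("capture.csv")                          # hypothetical capture export
data = capture[["AccX", "AccY", "AccZ"]].to_numpy("float32")  # assumed column names

interpreter = tf.lite.Interpreter(model_path="magic_wand.tflite")
interpreter.allocate_tensors()
inp = interpreter.get_input_details()[0]
out = interpreter.get_output_details()[0]

for start in range(0, len(data) - WINDOW + 1, STRIDE):
    window = data[start:start + WINDOW][np.newaxis, ...]   # shape (1, WINDOW, 3)
    interpreter.set_tensor(inp["index"], window)
    interpreter.invoke()
    scores = interpreter.get_tensor(out["index"])[0]
    print(f"samples {start}-{start + WINDOW}: class {int(np.argmax(scores))}")
```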
Now that we have validated our model, it is time for a live test. To build the firmware for your specific device, go to the Download Model tab of the Analytic Studio. To download the firmware for this tutorial:
NOTE: Instructions for flashing are found in the QORC SDK or here.
Being able to rapidly iterate on your model is critical when developing an application that uses machine learning. To facilitate validating in the field, we provide the SensiML Open Gateway. The Gateway allows you to connect to your microcontroller over Serial, Bluetooth, or TCP/IP and see the classification results live as they are generated by the Knowledge Pack running on the microcontroller. Documentation on how to use the Open Gateway can be found here. The following clip shows the SparkFun Thing Plus - QuickLogic EOS S3 recognizing magic incantations in real time.
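As a minimal sketch of what consuming those live results can look like, the snippet below reads newline-delimited JSON from a serial port, assuming the device emits one result object per line. The port name, baud rate, and field name are assumptions; see the Open Gateway documentation for the actual result format.

```python
# Minimal sketch of watching live classification results over a serial port.
# The port name, baud rate, and JSON field names are assumptions; the Open
# Gateway documentation describes the actual result format.
import json
import serial  # pyserial

PORT = "/dev/ttyUSB0"   # assumed serial port of the device
BAUD = 115200           # assumed baud rate

with serial.Serial(PORT, BAUD, timeout=1) as conn:
    while True:
        line = conn.readline().decode("utf-8", errors="ignore").strip()
        if not line:
            continue
        try:
            result = json.loads(line)
        except json.JSONDecodeError:
            continue  # skip non-JSON debug output
        print(f"Classification: {result.get('Classification')}")
```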
We hope you enjoyed this tutorial using the SensiML Analytics Toolkit. In this tutorial we have covered how to:
Collect and annotate a gesture dataset with the SensiML Data Capture Lab
Train a gesture recognition model in Google Colab using TensorFlow
Validate the model against captured data in the Analytic Studio
Download the firmware and flash it to the SparkFun Thing Plus - QuickLogic EOS S3
Watch live classification results with the SensiML Open Gateway
For more information about SensiML, visit https://sensiml.com. If you want our help to build your own application, please get in touch.
QuickLogic is a fabless semiconductor company that develops low power, multi-core MCUs, FPGAs, embedded FPGA Intellectual Property (IP), and voice and sensor processing solutions. The Analytics Toolkit from our subsidiary, SensiML Corporation, completes the end-to-end solution by using AI technology to provide a full development platform spanning data collection, labeling, algorithm and firmware auto-generation, and testing. The full range of solutions is underpinned by open source hardware and software tools to enable the practical and efficient adoption of AI, voice, and sensor processing across mobile, wearable, hearable, consumer, industrial, edge, and endpoint IoT applications.