The content of this page is still being written.
Simple machine-learning gesture detection with an Arduino Nano 33 BLE Sense.
Links
- View the GitHub repo: LINK
Aims
- Collect movement gesture data from the Arduino
- Detect movement patterns with an ML model
- Run inference in real time
- Deploy the model onto the Arduino
Next Steps
- Refine existing gestures and incorporate more complex ones
- Make gestures trigger events in applications
- Apply optimisation techniques (e.g. mixed-precision inference)
- Remove the reliance on proximity activation
Learning Objectives/Takeaways
- Handling of MCUs/Arduino
- Working with models on the edge
- ML implementations in C++
Accelerometer and Gyroscope Data
Accelerometer and gyroscope data is printed to the Arduino's serial port. A local Python script reads from the serial port and formats the data as required. This produces 6 features: accel x, y, z and gyro x, y, z.
Data is recorded only while a condition on the Arduino's proximity sensor is true: the user places a finger over the proximity sensor to activate recording.
Three events are initially recorded: background, tilt/move down, and tilt left. All recorded samples are clipped to a length of 100 readings.
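A minimal sketch of the capture script is below. It assumes pyserial, a port name of /dev/ttyACM0, a 9600 baud rate, and one comma-separated reading per serial line; none of these details are specified above.

```python
# Minimal capture sketch: read one gesture sample from the Arduino's serial port.
# Port name, baud rate, and the comma-separated line format are assumptions.
import csv
import serial  # pyserial

PORT = "/dev/ttyACM0"
BAUD = 9600
SAMPLE_LEN = 100  # each sample is clipped to 100 readings

def record_sample(ser):
    """Read serial lines until 100 readings of the 6 features are collected."""
    sample = []
    while len(sample) < SAMPLE_LEN:
        line = ser.readline().decode("utf-8", errors="ignore").strip()
        parts = line.split(",")
        if len(parts) != 6:  # expect accel x, y, z + gyro x, y, z
            continue
        try:
            sample.append([float(v) for v in parts])
        except ValueError:
            continue  # skip partial or corrupted lines
    return sample

if __name__ == "__main__":
    with serial.Serial(PORT, BAUD, timeout=1) as ser:
        sample = record_sample(ser)
        with open("sample.csv", "w", newline="") as f:
            csv.writer(f).writerows(sample)
```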
2-Layer Dense Model
A dense model with 2 layers is used for classification. Each sample is flattened from a shape of [100, 6] to a shape of [600], so the first layer takes 600 input values, one for each data value in a sample.
The model is trained for a few epochs, reaching 100% accuracy on a test set.
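A sketch of the classifier in PyTorch is shown below. The 600-value flattened input, the two dense layers, and the three classes follow from the description above; the hidden width and the optimiser settings are assumptions.

```python
import torch
import torch.nn as nn

class GestureModel(nn.Module):
    """Two dense layers: 600 flattened input values -> hidden layer -> 3 gesture classes."""
    def __init__(self, n_inputs=600, n_hidden=64, n_classes=3):
        super().__init__()
        self.fc1 = nn.Linear(n_inputs, n_hidden)   # first dense layer
        self.fc2 = nn.Linear(n_hidden, n_classes)  # output layer

    def forward(self, x):
        x = x.view(x.size(0), -1)    # flatten [batch, 100, 6] -> [batch, 600]
        x = torch.relu(self.fc1(x))  # ReLU between the two layers
        return self.fc2(x)           # class scores (logits)

model = GestureModel()
optimiser = torch.optim.Adam(model.parameters(), lr=1e-3)
criterion = nn.CrossEntropyLoss()

# Training over a few epochs (dataset loading omitted):
# for epoch in range(5):
#     for x, y in train_loader:
#         optimiser.zero_grad()
#         loss = criterion(model(x), y)
#         loss.backward()
#         optimiser.step()
```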
Deployment
The forward pass of the dense model is implemented on the Arduino in C++. The weights of the PyTorch model are extracted and saved as arrays to a .h file.
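One way to extract the weights into a header is sketched below; the checkpoint filename ("gesture_model.pt") and the generated array names are illustrative, not the project's actual naming.

```python
import torch

# Dump saved PyTorch weights as C float arrays in a header file.
state_dict = torch.load("gesture_model.pt", map_location="cpu")

def write_array(f, name, tensor):
    values = tensor.flatten().tolist()
    f.write(f"const float {name}[{len(values)}] = {{\n  ")
    f.write(", ".join(f"{v:.8f}f" for v in values))
    f.write("\n};\n\n")

with open("weights.h", "w") as f:
    f.write("#pragma once\n\n")
    for name, tensor in state_dict.items():
        write_array(f, name.replace(".", "_"), tensor)  # e.g. fc1.weight -> fc1_weight
```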
The Arduino loads the arrays from the .h file into the two dense layers. After each proximity activation and release, the collected input data is passed through the first layer, then a ReLU activation function, and then the output layer.
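The on-device pass is equivalent to the NumPy computation below, which can be used off-device to sanity-check the exported weights against the PyTorch model; the array names follow the hypothetical export above, and this is not the Arduino code itself.

```python
import numpy as np

def forward(x, fc1_weight, fc1_bias, fc2_weight, fc2_bias):
    """Mirror of the on-device forward pass: dense -> ReLU -> dense.
    x is a flattened sample of shape [600]; weights use PyTorch's [out, in] layout."""
    hidden = np.maximum(fc1_weight @ x + fc1_bias, 0.0)  # first layer + ReLU
    return fc2_weight @ hidden + fc2_bias                # output layer (class scores)

# Usage: scores = forward(sample.reshape(-1), W1, b1, W2, b2)
#        gesture = int(np.argmax(scores))
```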