The content of this page is still being written.

Simple machine-learning gesture detection with an Arduino Nano 33 BLE Sense.

  • View the GitHub repo: LINK

Aims

  1. Collect movement gesture data from the Arduino
  2. Detect movement patterns with an ML model
  3. Run real-time inference
  4. Deploy the model onto the Arduino

Next Steps

  1. Refine existing gestures and incorporate more complex ones
  2. Make gestures trigger events in applications
  3. Apply optimisation techniques (mixed-precision inference, etc.)
  4. Make it work without proximity activation

Learning Objectives/Takeaways

  1. Handling of MCUs/Arduino
  2. Working with models on the edge
  3. ML implementations in C++

Accelerometer and Gyroscope Data

Accelerometer and gyroscope data are printed to the Arduino’s serial port. A local Python script reads the serial stream and formats the data as required. This produces 6 features → accelerometer x, y, z and gyroscope x, y, z.

Data is recorded only while a condition on the Arduino’s proximity sensor is true: the user activates recording by placing a finger over the proximity sensor and stops it by lifting the finger off.

Three events are initially recorded → background, tilt/move down, and tilt left. Every recorded sample is clipped to a length of 100 readings.
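
A minimal host-side collection sketch is shown below. The serial port name, baud rate, and the assumption that each reading arrives as one comma-separated line of six values are illustrative; the actual script in the repo may differ.

```python
import csv
import serial  # pyserial

PORT = "/dev/ttyACM0"   # assumed serial port name
BAUD = 115200           # assumed baud rate
SAMPLE_LEN = 100        # each sample is clipped to 100 readings
LABEL = "tilt_left"     # one of: background, tilt_down, tilt_left

def read_sample(ser):
    """Collect rows of 6 floats until SAMPLE_LEN readings are gathered."""
    rows = []
    while len(rows) < SAMPLE_LEN:
        line = ser.readline().decode(errors="ignore").strip()
        parts = line.split(",")
        if len(parts) != 6:
            continue  # skip empty or partial lines
        try:
            rows.append([float(p) for p in parts])
        except ValueError:
            continue  # skip malformed values
    return rows

with serial.Serial(PORT, BAUD, timeout=1) as ser, \
        open(f"{LABEL}.csv", "a", newline="") as f:
    writer = csv.writer(f)
    # The Arduino only prints data while the proximity sensor is covered,
    # so this call simply blocks until a gesture is performed.
    sample = read_sample(ser)
    for row in sample:
        writer.writerow(row + [LABEL])
```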

2-Layer Dense Model

A dense model with 2 layers is used for classification. The first dense layer takes 600 inputs, one for each data value in a sample, so each sample is flattened from a shape of [100, 6] to a shape of [600].
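
A minimal PyTorch sketch of such a model is below. The hidden-layer size and the class/variable names are assumptions; the 600-value input and three output classes follow the description above.

```python
import torch
import torch.nn as nn

class GestureNet(nn.Module):
    def __init__(self, n_features=600, n_hidden=32, n_classes=3):
        super().__init__()
        self.fc1 = nn.Linear(n_features, n_hidden)   # first dense layer
        self.fc2 = nn.Linear(n_hidden, n_classes)    # output layer

    def forward(self, x):
        x = x.view(x.size(0), -1)       # flatten [batch, 100, 6] -> [batch, 600]
        x = torch.relu(self.fc1(x))     # ReLU between the two dense layers
        return self.fc2(x)              # one logit per gesture class

model = GestureNet()
```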

The model is trained for a few epochs, reaching 100% accuracy on a test set.
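
A rough training-loop sketch, continuing from the GestureNet definition above and assuming a train_loader that yields batches of [100, 6] samples with integer class labels (optimiser, learning rate, and epoch count are illustrative):

```python
import torch
import torch.nn as nn

criterion = nn.CrossEntropyLoss()
optimiser = torch.optim.Adam(model.parameters(), lr=1e-3)

for epoch in range(10):                  # "a few epochs"
    for x, y in train_loader:            # x: [batch, 100, 6] floats, y: class indices
        optimiser.zero_grad()
        loss = criterion(model(x), y)    # model flattens x internally
        loss.backward()
        optimiser.step()
```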

Deployment

The forward pass of the dense model is implemented on the Arduino in C++. The weights of the PyTorch model are extracted and saved as arrays to a .h file.
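
One way this export can look, continuing from the trained GestureNet above (array names and file name are made up for illustration; the repo's export script may differ):

```python
def to_c_array(name, tensor):
    """Flatten a tensor into a row-major C float array declaration."""
    values = ", ".join(f"{v:.6f}f" for v in tensor.detach().numpy().flatten())
    return f"const float {name}[] = {{ {values} }};\n"

state = model.state_dict()   # trained GestureNet from above
with open("weights.h", "w") as f:
    f.write(to_c_array("FC1_WEIGHTS", state["fc1.weight"]))  # shape [hidden, 600]
    f.write(to_c_array("FC1_BIAS",    state["fc1.bias"]))
    f.write(to_c_array("FC2_WEIGHTS", state["fc2.weight"]))  # shape [3, hidden]
    f.write(to_c_array("FC2_BIAS",    state["fc2.bias"]))
```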

The Arduino loads the arrays from the .h file into 2 dense layers. After each proximity activation and release, the collected input data is passed through the first layer, then a ReLU activation function, and then the output layer.
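
The on-device implementation itself is C++; as a reference only, the equivalent computation in NumPy (assuming the weights keep PyTorch's nn.Linear layout, i.e. y = Wx + b) looks like this:

```python
import numpy as np

def forward(x, w1, b1, w2, b2):
    """x: flattened sample of 600 floats; weights use PyTorch's nn.Linear layout."""
    h = np.maximum(w1 @ x + b1, 0.0)   # first dense layer followed by ReLU
    logits = w2 @ h + b2               # output layer, one logit per gesture class
    return int(np.argmax(logits))      # index of the predicted gesture
```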