Motion Identification with Arduino Machine Learning

In this Arduino Machine Learning project, we will use an accelerometer sensor to identify your movements. This content is a reconstruction of a project from the TensorFlow blog. Unlike the original TensorFlow post, we'll use the older-generation and weaker Arduino Nano instead of the Arduino Nano 33 BLE. The Arduino Nano is a development board equipped with 32 KB of flash and only 2 KB of RAM.

Decision boundaries, 99% accuracy

Defining the Features

To understand what movement we are making, we will use the accelerations along the 3 axes (X, Y, Z) from an IMU, that is, the accelerometer sensor. Starting from the first detected motion, we will record a fixed number of readings, NUM_SAMPLES.

This means that our feature vectors will be NUM_SAMPLES * 3 in size, so they must be small enough to fit in the memory of the Arduino Nano. We will start with a low value for NUM_SAMPLES to keep the footprint as lean as possible. If your classifications suffer from insufficient accuracy, you can increase this number.
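As a rough sanity check of that claim (assuming NUM_SAMPLES = 30, the value used later in this post, and 4-byte values, since avr-gcc treats both float and double as 32-bit on the ATmega328):

```python
NUM_SAMPLES = 30     # readings recorded per gesture (value used later in this post)
NUM_AXES = 3         # X, Y, Z accelerations
BYTES_PER_VALUE = 4  # avr-gcc: float and double are both 32-bit on the ATmega328

feature_vector_bytes = NUM_SAMPLES * NUM_AXES * BYTES_PER_VALUE
print(feature_vector_bytes)  # 360 of the Nano's 2048 bytes of RAM
```

So a single feature vector takes well under a fifth of the available RAM, leaving room for the classifier's working memory.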

Saving Sample Data

Reading Data From Sensor (IMU)

First of all, we need to read the raw data from the sensor. This piece of code will differ depending on the specific sensor you are using. We will use the 9-axis MPU9250 sensor; alternatively, you can use a different accelerometer, such as the MPU6050 from the same family. To keep things simple and understandable, we will wrap sensor initialization and reading in two functions: imu_setup and imu_read.

You can find several sample applications for the MPU6050 and MPU9250. Whatever code you use, save the sensor code in a file named imu.h so it can be called from the project's main sketch. Keeping all the code and files in one folder will also make your job easier.

For MPU6050

#include "Wire.h"
// library: https://github.com/jrowberg/i2cdevlib/tree/master/Arduino/MPU6050
#include "MPU6050.h"

#define OUTPUT_READABLE_ACCELGYRO

MPU6050 imu;

void imu_setup() {
    Wire.begin();
    imu.initialize();
}

void imu_read(float *ax, float *ay, float *az) {
    int16_t _ax, _ay, _az, _gx, _gy, _gz;

    imu.getMotion6(&_ax, &_ay, &_az, &_gx, &_gy, &_gz);

    *ax = _ax;
    *ay = _ay;
    *az = _az;
}



For MPU9250

#include "Wire.h"
// library: https://github.com/bolderflight/MPU9250
#include "MPU9250.h"

MPU9250 imu(Wire, 0x68);

void imu_setup() {
    Wire.begin();
    imu.begin();
}

void imu_read(float *ax, float *ay, float *az) {
    imu.readSensor();

    *ax = imu.getAccelX_mss();
    *ay = imu.getAccelY_mss();
    *az = imu.getAccelZ_mss();
}


In the main .ino file, we print the sensor values to the serial monitor/plotter:

#include "imu.h"

#define NUM_SAMPLES 30
#define NUM_AXES 3
// sometimes you can get "spikes" in the readings,
// so set a sensible value to truncate very large values
#define TRUNCATE_AT 20

double features[NUM_SAMPLES * NUM_AXES];

void setup() {
    Serial.begin(115200);
    imu_setup();
}

void loop() {
    float ax, ay, az;

    imu_read(&ax, &ay, &az);

    ax = constrain(ax, -TRUNCATE_AT, TRUNCATE_AT);
    ay = constrain(ay, -TRUNCATE_AT, TRUNCATE_AT);
    az = constrain(az, -TRUNCATE_AT, TRUNCATE_AT);

    Serial.print(ax);
    Serial.print('\t');
    Serial.print(ay);
    Serial.print('\t');
    Serial.println(az);
}








Open the serial plotter and move it a little to get an idea of the range of your readings:

Calibration

Due to gravity, we read a constant value of about -9.8 on the Z axis while the sensor is static (you can see this in the animation at the top). To eliminate this constant, we record a baseline during calibration and subtract it from every reading, so the Z axis is no longer dominated by gravity.

double baseline[NUM_AXES];
double features[NUM_SAMPLES * NUM_AXES];

void setup() {
    Serial.begin(115200);
    imu_setup();
    calibrate();
}

void loop() {
    float ax, ay, az;

    imu_read(&ax, &ay, &az);

    ax = constrain(ax - baseline[0], -TRUNCATE_AT, TRUNCATE_AT);
    ay = constrain(ay - baseline[1], -TRUNCATE_AT, TRUNCATE_AT);
    az = constrain(az - baseline[2], -TRUNCATE_AT, TRUNCATE_AT);
}

void calibrate() {
    float ax, ay, az;

    // read a few samples so the sensor settles, then keep the last one
    for (int i = 0; i < 10; i++) {
        imu_read(&ax, &ay, &az);
        delay(100);
    }

    baseline[0] = ax;
    baseline[1] = ay;
    baseline[2] = az;
}

After Z-axis calibration, the serial plotter will look like this:

Detecting the First Motion

Now we need to check for movement. To keep it simple, we will use a naive approach that looks for a high acceleration value: if a threshold is exceeded, it means a movement has begun.

If you have done the calibration step, a threshold of 5 should work well. If you have not calibrated, you need to find a value that suits your setup.

#include "imu.h"

#define ACCEL_THRESHOLD 5

void loop() {
    float ax, ay, az;

    imu_read(&ax, &ay, &az);

    ax = constrain(ax - baseline[0], -TRUNCATE_AT, TRUNCATE_AT);
    ay = constrain(ay - baseline[1], -TRUNCATE_AT, TRUNCATE_AT);
    az = constrain(az - baseline[2], -TRUNCATE_AT, TRUNCATE_AT);

    if (!motionDetected(ax, ay, az)) {
        delay(10);
        return;
    }
}

bool motionDetected(float ax, float ay, float az) {
    return (abs(ax) + abs(ay) + abs(az)) > ACCEL_THRESHOLD;
}




Recording the Samples

If there is no motion, we do nothing and keep monitoring. If there is motion, we print the next NUM_SAMPLES readings to the serial monitor.

// delay between consecutive samples, in milliseconds; adjust to match the speed of your gestures
#define INTERVAL 30

void loop() {
    float ax, ay, az;

    imu_read(&ax, &ay, &az);

    ax = constrain(ax - baseline[0], -TRUNCATE_AT, TRUNCATE_AT);
    ay = constrain(ay - baseline[1], -TRUNCATE_AT, TRUNCATE_AT);
    az = constrain(az - baseline[2], -TRUNCATE_AT, TRUNCATE_AT);

    if (!motionDetected(ax, ay, az)) {
        delay(10);
        return;
    }

    recordIMU();
    printFeatures();
    delay(2000);
}

void recordIMU() {
    float ax, ay, az;

    for (int i = 0; i < NUM_SAMPLES; i++) {
        imu_read(&ax, &ay, &az);

        ax = constrain(ax - baseline[0], -TRUNCATE_AT, TRUNCATE_AT);
        ay = constrain(ay - baseline[1], -TRUNCATE_AT, TRUNCATE_AT);
        az = constrain(az - baseline[2], -TRUNCATE_AT, TRUNCATE_AT);

        features[i * NUM_AXES + 0] = ax;
        features[i * NUM_AXES + 1] = ay;
        features[i * NUM_AXES + 2] = az;

        delay(INTERVAL);
    }
}

void printFeatures() {
    const uint16_t numFeatures = sizeof(features) / sizeof(float);

    for (int i = 0; i < numFeatures; i++) {
        Serial.print(features[i]);
        Serial.print(i == numFeatures - 1 ? '\n' : ',');
    }
}

Let's record 15-20 samples for each gesture, saving each gesture's samples to its own file. Since we are dealing with high-dimensional data, you should collect as many samples as possible to average out the noise.
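One way to capture those recordings is a small host-side script that reads the comma-separated lines printed by printFeatures() and appends them to one CSV file per gesture. A minimal sketch, assuming the pyserial package; the port name and gesture name below are only examples:

```python
def parse_features(line):
    """Parse one comma-separated line printed by printFeatures()."""
    return [float(value) for value in line.strip().split(',')]

def record(port, gesture_name, num_recordings=20):
    """Append num_recordings feature vectors to dataset/<gesture_name>.csv."""
    import serial  # pyserial

    with serial.Serial(port, 115200, timeout=5) as conn, \
            open('dataset/%s.csv' % gesture_name, 'a') as out:
        for _ in range(num_recordings):
            line = conn.readline().decode()
            features = parse_features(line)
            out.write(','.join('%.2f' % f for f in features) + '\n')

# example usage (port name is an assumption, adjust for your system):
# record('/dev/ttyUSB0', 'punch')
```

The 115200 baud rate matches the Serial.begin(115200) call in the sketch above.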

Training and Exporting the Classifier

If the code below doesn't mean anything to you, or if you want to learn how to run a Python-trained model as C++ on an Arduino or other development board, you can find the details in this article.

from sklearn.svm import SVC
from micromlgen import port

# put your recordings in the "dataset" folder,
# one class per file, one feature vector per line, in CSV format
features, classmap = load_features('dataset/')
X, y = features[:, :-1], features[:, -1]
classifier = SVC(kernel='rbf', gamma=0.001, C=10).fit(X, y)
c_code = port(classifier, classmap=classmap)
print(c_code)


At this point, you need to copy the printed code and save it as model.h in your Arduino project folder.
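The snippet above calls load_features, which is a helper rather than part of micromlgen. A minimal sketch of what it could look like (the exact layout of your dataset folder is an assumption: one CSV file per class, one feature vector per line):

```python
import os
import numpy as np

def load_features(folder):
    """Load one CSV file per class from `folder`.

    Returns (dataset, classmap): dataset holds one feature vector per row
    with the numeric class label appended as the last column; classmap
    maps that label back to the file name.
    """
    classmap = {}
    dataset = None
    for class_idx, filename in enumerate(sorted(os.listdir(folder))):
        classmap[class_idx] = filename.replace('.csv', '')
        samples = np.loadtxt(os.path.join(folder, filename), delimiter=',', ndmin=2)
        labels = np.full((len(samples), 1), class_idx)
        labeled = np.hstack((samples, labels))
        dataset = labeled if dataset is None else np.vstack((dataset, labeled))
    return dataset, classmap
```

Slicing `features[:, :-1]` and `features[:, -1]` in the training snippet then separates the vectors from the labels again.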

Unlike previous, simpler machine learning projects, we can't easily achieve 100% accuracy here. The movements are quite noisy, so you should try a few parameters for the classifier and choose the combination that performs best. Let's show a few examples:

Decision boundaries of the 2 PCA components of the sensor features, linear kernel

Decision boundaries of the 2 PCA components of the sensor features, polynomial kernel

Decision boundaries of the 2 PCA components of the sensor features, RBF kernel, gamma = 0.01

Decision boundaries of the 2 PCA components of the sensor features, RBF kernel, gamma = 0.001

Choosing the Right Model

Now that we have chosen the best model, we must port it to C code. Here comes the problem: not all models will fit on the development board.

The core of an SVM (Support Vector Machine) is its support vectors: each trained classifier is characterized by a certain number of them. The problem is that if there are too many, the generated code will be too large to fit in the microcontroller's flash.
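You can read the support-vector count straight from scikit-learn before porting anything. A sketch on toy data (the random data and the parameters here are stand-ins for your real recordings):

```python
import numpy as np
from sklearn.svm import SVC

# Toy data standing in for the real recordings: 2 classes, 90 features each
rng = np.random.RandomState(0)
X = np.vstack((rng.randn(40, 90), rng.randn(40, 90) + 2))
y = np.array([0] * 40 + [1] * 40)

clf = SVC(kernel='rbf', gamma=0.001, C=10).fit(X, y)

# Each support vector ends up as 90 stored coefficients in the ported
# C code, so fewer vectors means a smaller flash footprint.
n_vectors = int(sum(clf.n_support_))
print(n_vectors)
```

Comparing this count across candidate models gives a quick proxy for flash size before you even run the compiler.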

Therefore, instead of simply choosing the model with the best accuracy, you should sort the models from best performer to worst. Starting from the top, transfer each one to your Arduino project and try to compile it: if it fits, use it without problems. Otherwise, pick the next one and try again.

It may seem like a tedious process, but keep in mind that we are trying to classify 90 features with 2 KB of RAM and 32 KB of flash.

Here are a few numbers for the different combinations we tested:

Kernel  | C   | Gamma | Degree | Vectors | Flash size | RAM (b) | Avg. accuracy
RBF     | 10  | 0.001 | -      | 37      | 53 Kb      | 1228    | 99%
Poly    | 100 | 0.001 | 2      | 12      | 25 Kb      | 1228    | 99%
Poly    | 100 | 0.001 | 3      | 25      | 40 Kb      | 1228    | 97%
Linear  | 50  | -     | 1      | 40      | 55 Kb      | 1228    | 95%
RBF     | 100 | 0.01  | -      | 61      | 80 Kb      | 1228    | 95%

As you can see, we achieved very high accuracy on the test set for all the classifiers; we deployed only one on the Arduino Nano. Of course, if you use a larger microcontroller, you can use the others as well.

Reminder

As a side note, take a look at the RAM column: all the values are equal. This is because RAM usage is independent of the number of support vectors and depends only on the number of features.

Running Inference

#include "model.h"

void loop() {
    float ax, ay, az;

    imu_read(&ax, &ay, &az);

    ax = constrain(ax - baseline[0], -TRUNCATE_AT, TRUNCATE_AT);
    ay = constrain(ay - baseline[1], -TRUNCATE_AT, TRUNCATE_AT);
    az = constrain(az - baseline[2], -TRUNCATE_AT, TRUNCATE_AT);

    if (!motionDetected(ax, ay, az)) {
        delay(10);
        return;
    }

    recordIMU();
    classify();
    delay(2000);
}

void classify() {
    Serial.print("Detected gesture: ");
    Serial.println(classIdxToName(predict(features)));
}

We've now finished all the work. You can classify gestures with an Arduino Nano and 2 KB of RAM. Without deep neural networks, TensorFlow, or 32-bit ARM processors, we performed SVM-based machine learning at 97% accuracy on an 8-bit microcontroller.

Targeting the Arduino Nano (older generation), the program requires 25310 bytes (82%) of program space and 1228 bytes (59%) of RAM. This means you can run machine learning in even less space than the Arduino Nano provides. So we have shown that the answer to the question "can I run machine learning on Arduino?" is clearly YES.