Adding a sensor backpack to a toy robot and using it to control a 3D model displayed in a browser.
This project uses a 3D-printed ‘backpack’ containing an ESP32, a 9-axis movement sensor, a battery and a touch switch, attached to a toy robot.
The ESP32 reads data from the movement sensor as the robot is rotated and tilted. This data is sent to a web page served by the ESP32 itself, where the three.js JavaScript library makes a previously captured 3D model of the robot follow the movement.
Oh… and there’s 3D generated lightning!
For this project I used an Iron Man toy from eBay, an ESP32 board with a microSD card reader, an MPU9250 9-axis board and a TTP223 capacitive touch button from AliExpress, plus some cables and a spare microSD card. You also need a little 500mAh 43x25x8.5mm LiPo if you want to power it from a battery. You can either print the robot backpack using a 3D printer or just use thick cardboard and some elastic bands.
You might prefer to use your own action toy or other item as the basis for the 3D model. To do this you need to turn your physical object into a 3D file to be displayed in a browser.
Creating a 3D Model from Photos
Creating a 3D model from photographs, often called photogrammetry, involves taking many photos of your object from many different angles and using software to reconstruct the object in 3D.
There are many ways of doing this and for my project I used a combination of Regard3D, Meshmixer, Creators3D online viewer and Microsoft’s 3D Builder. The workflow is a little convoluted but worked for my application.
To start I took about 50 images of the robot from three different heights.
In Regard3D I used the following settings to create the 3D data:
Matches
Detector(s): AKAZE Threshold: 0.0001 Dist ratio: 0.9 Camera model: Pinhole radial 3 Matching algorithm: FLANN
Triangulation
New Incremental, MaxPair initialization, intrinsic camera parameters refined
Densification
CMVS/PMVS UseVis: no Level: 1 Cell size: 2 Threshold: 0.7 wsize: 7 Min image num: 3 Max cluster size: 100
Surface
Type: Poisson Parameters: Depth: 9 Samples per Node: 8.6 Point weight: 4 Trim threshold: 5 Colorization: Textures Color Params: GeomVisTest: yes Global seam level: yes Local seam level: yes Outlier removal: None
The result isn’t perfect, but it’s usable as a starting point. There are many tutorials for Regard3D, and other settings may work better for your case.
At this point I was exporting the surface as an OBJ file to clean up in MeshLab, but I couldn’t find any way of re-exporting it from MeshLab with the textures included.
I turned to Meshmixer and used Select > Unwrap Brush to remove all the extraneous material you can see in the Regard3D screenshot above. I saved the cleaned-up file in 3MF format because this format contains all the 3D data and textures in one file.
At this point your 3D model is probably the wrong way up and too small to be used in the second part of the project. I found Microsoft’s 3D Builder worked really well for re-orientating and enlarging the model using a combination of the rotation and movement controls, especially with the Object > Settle command to make the model sit flat.
Making the model similar to the dimensions shown below worked well for three.js.
So that the model rotates around its centre, set the location as shown below. You can see the model will sink halfway into the floor.
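If you prefer to do the scaling in code rather than in 3D Builder, the arithmetic is simple: scale the model uniformly so that its largest bounding-box dimension matches a target size. A minimal sketch (the function name and the 2-unit target are my own, not values from the project files):

```javascript
// Uniform scale factor that fits a model's bounding box to a target size.
// In three.js the box dimensions could come from
// new THREE.Box3().setFromObject(model).getSize(new THREE.Vector3()).
function fitScale(sizeX, sizeY, sizeZ, targetSize) {
  return targetSize / Math.max(sizeX, sizeY, sizeZ);
}

// e.g. model.scale.setScalar(fitScale(4, 2, 1, 2)); // factor of 0.5
```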
The model is now ready to be exported as a GLB file and used in three.js, but the file size is large and will take ages to download and display from the SD card on the ESP32 board. The easiest solution I found to reduce the file size is the Creators3D online viewer and converter found here: https://www.creators3d.com/online-viewer. Drag and drop the GLB file and choose the following options to export a new GLB file:
Assembling the Backpack
Once you have your 3D file ready you can assemble the backpack with the ESP32 and motion sensor. I used Tinkercad to create the parts and then just glued them together. You can see in the Tinkercad screen below that I used an STL export of the robot to make a robot shaped hole in the backpack so it fits over the robot.
The 3D prints look like this:
Assembled, the backpack looks like this:
The Arduino Code
The sketch consists of two parts. The first part reads the movement sensor and converts the raw readings into calibrated data. The second part is a web server hosting a web page. The web page contains JavaScript, including the three.js library, which receives the calibrated data over WebSockets and updates the orientation of the 3D model.
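As a sketch of the browser side of that flow, assuming the ESP32 sends each reading as a comma-separated "yaw,pitch,roll" string in degrees (the actual message format used in the repository may differ):

```javascript
// Convert one WebSocket message into radians, which is what three.js expects.
function parseOrientation(message) {
  const [yaw, pitch, roll] = message.split(',').map(Number);
  const d2r = Math.PI / 180;
  return { yaw: yaw * d2r, pitch: pitch * d2r, roll: roll * d2r };
}

// In the page, a WebSocket handler would then update the model, e.g.:
// ws.onmessage = (event) => {
//   const { yaw, pitch, roll } = parseOrientation(event.data);
//   model.rotation.set(pitch, yaw, roll); // axis mapping depends on the model
// };
```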
There are several Arduino libraries for the MPU-9250. I used the refactored version of kriswiner’s library found here: https://github.com/hideakitai/MPU9250. This can be installed using the Arduino Library Manager:
To increase the sensitivity of the readings, I changed line 20 in Documents\Arduino\libraries\MPU9250\MPU9250.h to:
template <typename WireType, AFS AFSSEL = AFS::A2G, GFS GFSSEL = GFS::G250DPS, MFS MFSSEL = MFS::M16BITS>
The ESPAsyncWebServer library also needs to be installed: download it using the ‘Download ZIP’ link on GitHub and add it in the IDE with Sketch > Include Library > Add .ZIP Library. (On the ESP32 it also depends on the AsyncTCP library, installed the same way.)
Connect the ESP32 and MPU9250 as below, copy the files in the SD folder from the GitHub repository here – https://github.com/robotzero1/esp32-mpu9250-three.js – onto the SD card, insert it and connect via USB. Before assembling the backpack, the MPU9250 should be calibrated.
The first time you upload the sketch, make sure you have the Serial Monitor open in the Arduino IDE because a calibration routine will start. During the first part of the calibration the MPU9250 sensor needs to be flat. During the second part, move it in a figure-of-8 motion using the full extent of your arm.
After the calibration has completed you can assemble the backpack.
The project is now complete. If you open the IP address shown in the Serial Monitor (or in your router’s list of connections) in a browser, you will see the 3D model of the robot. The on-screen robot should follow the movements of the physical robot, and pressing the button on the backpack should result in lightning between the robot’s hands.
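If the button state is sent over the same WebSocket as the sensor data (the repository may do this differently), the page needs to tell the two message types apart. A sketch of that dispatch, where the "btn:" prefix and the handler names are hypothetical, for illustration only:

```javascript
// Dispatch incoming WebSocket messages: a "btn:" prefix marks the touch-button
// state; anything else is treated as a comma-separated orientation reading.
function handleMessage(data, handlers) {
  if (data.startsWith('btn:')) {
    handlers.onButton(data.slice(4) === '1'); // true while the button is touched
  } else {
    handlers.onOrientation(data.split(',').map(Number));
  }
}

// A button handler could then show or hide the lightning effect, e.g.:
// handleMessage(event.data, { onButton: (on) => { lightning.visible = on; },
//                             onOrientation: applyRotation });
```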
Project Demonstration
Here’s a quick demo of the 3D robot following the movement of the physical robot:
I’ve uploaded another demo, but without the ESP32 and sensor, so you can see the effect without building the project: https://robotzero.one/examples/threejs/robot/orbit-control.html. Click on the robot to move it.
The three.js Library and ESPAsyncWebServer
If you want to recreate the code from scratch rather than use the files in the SD card folder, you can install three.js in a few ways. For me the easiest was to download the ZIP file from GitHub, extract it, and copy the build and examples folders to my app folder, referring to them like this in my HTML file:
import * as THREE from './build/three.module.js';
import { GLTFLoader } from './examples/jsm/loaders/GLTFLoader.js';
For a simple application with only a few dependencies the ESP32 works fine, but I discovered that the lightning code requested too many JS files for the ESP32 web server to cope with, so I used Parcel to bundle all the JS files into one. The bundled file needs only a single request handler:
webserver.on("/parcel.e31bb0bc.js", HTTP_GET, [](AsyncWebServerRequest * request) {
  request->send(SD_MMC, "/parcel.e31bb0bc.js", "application/javascript");
});
I found Parcel easier to set up and use than Webpack. The GitHub repository contains the files that I used to build the bundled script file here: https://github.com/robotzero1/esp32-mpu9250-three.js/tree/master/parcel
MPU-9250 Libraries for the ESP32
There are quite a few MPU-9250 libraries for the Arduino IDE. Some of them work with the ESP32.
Probably the most comprehensive library which includes sensor fusion algorithms – https://github.com/kriswiner/MPU9250
Port of kriswiner’s library for the GY-91 (the board I have) that adds the BMP280 sensor – https://github.com/TheChapu/GY-91
Refactored version of kriswiner library (used in this project) – https://github.com/hideakitai/MPU9250
An ESP32 library that allows control over the 9250’s DMP features – https://github.com/rupin/SparkFun_MPU-9250-DMP_Arduino_Library
An alternative library – https://github.com/bolderflight/MPU9250/
Easiest library I found – https://github.com/asukiaaa/MPU9250_asukiaaa – just works with an ESP32, but doesn’t do sensor fusion. You need to run the GetMagOffset sketch to see the magnetometer data.
For my purposes it came down to a choice between the refactored kriswiner library, where the correction algorithms run in the library code, and the ported SparkFun library, where the correction algorithms are done on the chip.
Other Sensor Options
BNO080 – a replacement for the poorly received BNO055 devices – https://es.aliexpress.com/item/32915488261.html (and the library – https://github.com/jps2000/BNO080)
Links
Great article about 9-DoF sensor fusion – https://github.com/kriswiner/MPU6050/wiki/Affordable-9-DoF-Sensor-Fusion