Self-Balancing Robot

Two-wheel self-balancing platform with IMU sensor fusion, PID control, and ROS2 nodes for mapping and person-following.


Overview

Built as a team project: I led system integration and implemented key modules: MQTT telemetry (Raspberry Pi), computer vision, ROS2 SLAM, and mechanical styling.
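As a sketch of the MQTT side mentioned above: the Pi could serialise telemetry samples and publish them through a connected `paho-mqtt` client. The payload fields and topic name here are assumptions, not the project's actual schema.

```python
# Hypothetical telemetry publisher for the Raspberry Pi side.
# Payload fields and the "robot/telemetry" topic are assumptions.
import json

def make_telemetry(tilt_deg, speed_mps, battery_v):
    """Serialise one telemetry sample as a JSON string."""
    return json.dumps({
        "tilt_deg": round(tilt_deg, 2),
        "speed_mps": round(speed_mps, 3),
        "battery_v": round(battery_v, 2),
    })

def publish_telemetry(client, sample, topic="robot/telemetry"):
    """client is a connected paho.mqtt.client.Client instance."""
    client.publish(topic, sample, qos=0)

if __name__ == "__main__":
    print(make_telemetry(1.234, 0.5, 11.1))
```

Keeping serialisation separate from the publish call makes the payload format testable without a broker.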

Project Goals
  • Stable balancing via IMU (gyro + accel) with complementary/Kalman fusion.
  • Tunable PID for tilt → motor PWM with safe saturation and anti-windup.
  • ROS2 nodes: sensor stream, command, SLAM (LiDAR + Cartographer).
  • Person detection + follow mode (vision or LiDAR tracking).
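The first two goals can be sketched together: a complementary filter fusing gyro and accelerometer into a tilt estimate, feeding a PID with output saturation and integral anti-windup. All gains, limits, and the blend factor below are placeholder assumptions, not the tuned values used on the robot.

```python
# Sketch of the balance loop: complementary filter -> PID -> saturated PWM.
# Gains, limits, and alpha are placeholders, not the project's tuned values.
import math

def complementary_filter(angle, gyro_rate, ax, az, dt, alpha=0.98):
    """Blend the integrated gyro rate with the accelerometer tilt angle (deg)."""
    accel_angle = math.degrees(math.atan2(ax, az))
    return alpha * (angle + gyro_rate * dt) + (1.0 - alpha) * accel_angle

class PID:
    def __init__(self, kp, ki, kd, out_limit):
        self.kp, self.ki, self.kd = kp, ki, kd
        self.out_limit = out_limit      # symmetric PWM saturation
        self.integral = 0.0
        self.prev_error = 0.0

    def update(self, error, dt):
        derivative = (error - self.prev_error) / dt
        self.prev_error = error
        unsat = self.kp * error + self.ki * self.integral + self.kd * derivative
        out = max(-self.out_limit, min(self.out_limit, unsat))
        # Anti-windup: only accumulate the integral while the output
        # is not saturated (conditional integration).
        if out == unsat:
            self.integral += error * dt
        return out
```

Conditional integration is one common anti-windup scheme; back-calculation is another option when the saturation limits vary.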
Role
  • Team manager: I deployed organisational tools and methods to improve efficiency and smooth teamwork: Gantt charts, interactive collaborative diagrams, and an agile strategy.
  • Computer vision: I implemented Haar-cascade and MediaPipe pipelines for person following.
  • ROS2/LiDAR: I integrated ROS2 with an RPLIDAR A1M8 to enable live mapping.
  • Mechanical design: I contributed eye and hat elements to make the design stand out.
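For the LiDAR integration above, the mapping node would subscribe to `sensor_msgs/LaserScan` via rclpy; to keep this sketch self-contained, only the range-processing helper is shown, with field names mirroring the LaserScan message. The validity threshold is an assumption.

```python
# Helper that would run inside a rclpy LaserScan callback (e.g. on /scan)
# to pick out the closest obstacle. max_valid is an assumed cutoff.
import math

def nearest_obstacle(ranges, angle_min, angle_increment, max_valid=12.0):
    """Return (distance, bearing_rad) of the closest valid LiDAR return,
    or None if every reading is invalid."""
    best = None
    for i, r in enumerate(ranges):
        if math.isfinite(r) and 0.0 < r < max_valid:
            if best is None or r < best[0]:
                best = (r, angle_min + i * angle_increment)
    return best
```

In the real node, Cartographer consumes the raw scan for SLAM; a helper like this would only serve follow-mode or safety checks.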
Key Specs
  • Chassis: edited from a given base and 3D printed
  • Motors: 12 V DC gearmotors with encoders
  • IMU: MPU-6050 / ICM-20948
  • Controller: ESP32 / STM32 (1 kHz PID loop)
  • Companion: Raspberry Pi 5 (ROS2)
  • Sensing: 2D LiDAR (Cartographer)
  • Power: 3S Li-ion with BMS
  • Software: C++/Python, ROS2, MQTT

Demo Videos

Balancing tests, mapping runs, and follow-mode.

Notes - Results

ROS2 Integration
  • Although we had to cut some planned ROS2 features (Nav2), the project was highly successful (first-class mark) and was the only project to implement a creative design, as seen in the last video.
  • A team member compared the performance of MediaPipe and YOLOv8: YOLOv8 was slightly more effective but far more resource-intensive, so MediaPipe was implemented. The live UI camera feed was disabled to prevent power spikes.
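Once MediaPipe yields a person detection, follow mode reduces to steering: centre the detection horizontally and hold a target apparent size. The mapping below is a minimal sketch; the gains, target height fraction, and (forward, turn) command convention are all assumptions.

```python
# Sketch of follow-mode steering from a person detection.
# box_cx: detection centre x in pixels; box_h: box height as a fraction
# of the frame. Gains and target_h are assumed values.

def follow_command(box_cx, box_h, frame_w, target_h=0.4,
                   k_turn=1.5, k_fwd=0.8):
    """Map a detection to (forward, turn) commands clamped to [-1, 1]."""
    turn = k_turn * (box_cx / frame_w - 0.5)   # centre the person
    forward = k_fwd * (target_h - box_h)       # keep a set distance
    clamp = lambda v: max(-1.0, min(1.0, v))
    return clamp(forward), clamp(turn)
```

A person centred at the target size yields zero commands; a person to the right and far away yields positive turn and forward commands.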