Robot Arm Ecosystem
A comprehensive EMG-controlled robotic platform combining embedded firmware, machine learning, and mobile app development — funded through the YTÜ Yıldız Kaşifleri innovation program.
Overview
Secured funding to develop a complete robot arm control ecosystem featuring:
- Embedded control firmware (C++) for precise servo actuation
- React Native companion app for real-time operation and monitoring
- Machine learning pipeline mapping EMG bio-signals to motion
Architecture
Embedded Firmware (C++)
├─ Inverse Kinematics solver
├─ Servo control protocol
├─ Serial communication interface
└─ Safety & calibration modes
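The inverse kinematics solver listed above lives in the C++ firmware; as a rough illustration of the math it solves, here is a minimal 2-link planar IK sketch in Python. The link lengths and the solution branch are hypothetical, not taken from the actual arm geometry.

```python
# Conceptual 2-link planar inverse kinematics (illustration only; the real
# solver runs in the C++ firmware). Link lengths are hypothetical.
import math

L1, L2 = 10.0, 8.0  # hypothetical link lengths in cm

def solve_ik(x: float, y: float) -> tuple[float, float]:
    """Return (shoulder, elbow) angles in radians for a target point (x, y)."""
    r2 = x * x + y * y
    # Law of cosines gives the elbow angle from the target distance.
    cos_elbow = (r2 - L1 * L1 - L2 * L2) / (2 * L1 * L2)
    if not -1.0 <= cos_elbow <= 1.0:
        raise ValueError("target out of reach")
    elbow = math.acos(cos_elbow)  # one of the two valid elbow configurations
    # Shoulder angle = angle to target minus the offset introduced by link 2.
    shoulder = math.atan2(y, x) - math.atan2(L2 * math.sin(elbow),
                                             L1 + L2 * math.cos(elbow))
    return shoulder, elbow

if __name__ == "__main__":
    s, e = solve_ik(12.0, 6.0)
    print(f"shoulder={math.degrees(s):.1f} deg, elbow={math.degrees(e):.1f} deg")
```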
React Native App
├─ Live EMG visualization
├─ Control panel (mode, sensitivity)
├─ Calibration workflow screens
└─ Haptic / visual feedback
Python Backend
├─ EMG signal acquisition & filtering
├─ Feature extraction (RMS, thresholds)
├─ Random Forest gesture classification
└─ Serial bridge to embedded system
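A minimal sketch of the feature-extraction and gesture-classification steps in the backend tree above, assuming windowed, pre-filtered multi-channel EMG arrays. The window shape, integer gesture labels, tree count, and 0.7 confidence threshold are illustrative placeholders, not confirmed project parameters.

```python
# Sketch of the gesture-classification step, assuming pre-filtered EMG windows
# of shape (n_windows, window_len, n_channels). All constants are placeholders.
import numpy as np
from sklearn.ensemble import RandomForestClassifier

CONFIDENCE_THRESHOLD = 0.7  # assumed value; below this, no command is sent

def rms_features(windows: np.ndarray) -> np.ndarray:
    """Per-channel RMS of each window -> shape (n_windows, n_channels)."""
    return np.sqrt(np.mean(np.square(windows), axis=1))

def train(windows: np.ndarray, labels: np.ndarray) -> RandomForestClassifier:
    """Fit a Random Forest on RMS features with integer gesture labels."""
    clf = RandomForestClassifier(n_estimators=100, random_state=0)
    clf.fit(rms_features(windows), labels)
    return clf

def classify(clf: RandomForestClassifier, window: np.ndarray):
    """Return a gesture id, or None if the model is not confident enough."""
    feats = rms_features(window[np.newaxis, ...])
    probs = clf.predict_proba(feats)[0]
    best = int(np.argmax(probs))
    return int(clf.classes_[best]) if probs[best] >= CONFIDENCE_THRESHOLD else None
```

Gating on the predicted probability keeps low-confidence windows from actuating the arm, which is the idea behind the confidence-threshold highlight below.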
Key Contributions
- Designed embedded control firmware (C++) for precise servo actuation
- Implemented an inverse kinematics solver for natural arm movement
- Trained Random Forest models to map EMG bio-signals to servo motion
- Built a Python serial bridge synchronizing the mobile app and the embedded system (sketched after this list)
- Developed intuitive React Native UI for real-time control and feedback
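To make the serial-bridge contribution concrete, here is a minimal pyserial sketch. The port name, baud rate, and newline-terminated "G,<gesture_id>" command format are assumptions for illustration; the firmware's actual protocol may differ.

```python
# Minimal sketch of the Python-to-firmware serial bridge using pyserial.
# Port, baud rate, and command framing are illustrative assumptions.
import serial  # pip install pyserial

def open_bridge(port: str = "/dev/ttyUSB0", baud: int = 115200) -> serial.Serial:
    # A short timeout keeps reads from blocking the control loop indefinitely.
    return serial.Serial(port, baudrate=baud, timeout=0.05)

def send_gesture(link: serial.Serial, gesture_id: int) -> None:
    """Frame a classified gesture as a newline-terminated ASCII command."""
    link.write(f"G,{gesture_id}\n".encode("ascii"))

def read_status(link: serial.Serial) -> str:
    """Read one status line echoed back by the firmware (may be empty)."""
    return link.readline().decode("ascii", errors="ignore").strip()

if __name__ == "__main__":
    with open_bridge() as link:
        send_gesture(link, 2)
        print(read_status(link))
```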
Technical Highlights
- Bio-signal processing: Notch filtering for mains interference, band-pass filtering, and RMS feature extraction (see the sketch after this list)
- ML pipeline: Gesture classification with confidence thresholds
- Real-time sync: Low-latency communication between app, Python backend, and firmware
- Safety features: Emergency stop, calibration modes, sensitivity controls
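As a sketch of the filtering stage mentioned above, the snippet below combines a 50 Hz notch with a 20-450 Hz Butterworth band-pass using SciPy. The 1 kHz sampling rate, mains frequency, and band edges are common surface-EMG defaults, not confirmed project values.

```python
# Sketch of the EMG pre-filtering stage. Sampling rate, notch frequency, and
# band edges are common defaults, not the project's confirmed parameters.
import numpy as np
from scipy.signal import iirnotch, butter, filtfilt

FS = 1000.0  # assumed sampling rate in Hz

def clean_emg(raw: np.ndarray, fs: float = FS) -> np.ndarray:
    # Suppress mains interference with a narrow notch at 50 Hz.
    b_notch, a_notch = iirnotch(w0=50.0, Q=30.0, fs=fs)
    x = filtfilt(b_notch, a_notch, raw)
    # Keep the usable surface-EMG band with a 4th-order Butterworth band-pass.
    b_bp, a_bp = butter(4, [20.0, 450.0], btype="bandpass", fs=fs)
    return filtfilt(b_bp, a_bp, x)
```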
Impact
Demonstrates end-to-end integration of biosignal processing, embedded systems, and mobile development for accessible human-machine interaction.