# EMG Robot Arm Companion App
Prototype companion application for an EMG-controlled robotic arm. Bridges biosignal processing (Python) with an intuitive mobile UI (React Native) to deliver responsive, accessible control.
Supported by the Yıldız Kaşifleri program run by YTÜ Teknopark, which provided mentorship and innovation backing during prototyping.
## Objectives
- Provide real-time visualization & control feedback
- Abstract low-level EMG signal handling away from the user
- Ensure consistent multi-platform (Android/iOS) UX
## Architecture
```
React Native App
├─ Live EMG value stream (WebSocket/BLE)
├─ Control panel (mode, sensitivity)
├─ Calibration workflow screens
└─ Haptic / visual feedback layer

Python Backend
├─ Signal acquisition & filtering
├─ Feature extraction (RMS, thresholds)
├─ Gesture classification / intent mapping
└─ Control command relay to actuator layer
```
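The backend's feature-extraction and intent-mapping stages could look roughly like this: compute a windowed RMS over the incoming EMG stream and map amplitude bands to coarse intents. This is a minimal sketch, assuming threshold-based classification; the class name, threshold values, and intent labels are illustrative, not the project's actual API.

```python
import math
from collections import deque

def rms(window):
    """Root-mean-square amplitude of one EMG window."""
    return math.sqrt(sum(s * s for s in window) / len(window))

class IntentMapper:
    """Maps windowed RMS amplitude to a coarse intent label.

    Thresholds here are placeholders; in practice they would come
    from the calibration workflow, not hard-coded constants.
    """

    def __init__(self, rest_threshold=0.1, grip_threshold=0.5, window_size=64):
        self.window = deque(maxlen=window_size)
        self.rest_threshold = rest_threshold
        self.grip_threshold = grip_threshold

    def push(self, sample):
        """Feed one raw sample; return the current intent label."""
        self.window.append(sample)
        if len(self.window) < self.window.maxlen:
            return "calibrating"  # not enough data for a full window yet
        level = rms(self.window)
        if level < self.rest_threshold:
            return "rest"
        if level < self.grip_threshold:
            return "hold"
        return "grip"
```

A sliding-window RMS is a common EMG envelope feature because raw EMG is zero-mean and noisy; the window length trades latency against stability, which is exactly the UI latency constraint noted under Learnings.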
## Key Features
- Real-time EMG waveform & classification overlay
- Calibration wizard for muscle groups
- Adaptive sensitivity & noise filtering controls
- Safety lock state & emergency stop UI
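A calibration wizard like the one above typically records a short relaxed baseline and a voluntary contraction per muscle group, then derives per-channel thresholds from those two levels. A hedged sketch under that assumption (the function name and the `margin` dead-band heuristic are illustrative, not the project's real calibration logic):

```python
import statistics

def calibrate_channel(rest_samples, flex_samples, margin=0.2):
    """Derive rest/grip thresholds for one muscle group.

    rest_samples: EMG amplitudes recorded while the muscle is relaxed
    flex_samples: amplitudes recorded during voluntary contraction
    margin:       fraction of the rest-to-flex span reserved as a
                  dead band on each side, to absorb noise and drift
    """
    rest_level = statistics.mean(abs(s) for s in rest_samples)
    flex_level = statistics.mean(abs(s) for s in flex_samples)
    span = flex_level - rest_level
    return {
        "rest_threshold": rest_level + margin * span,
        "grip_threshold": rest_level + (1 - margin) * span,
    }
```

Placing both thresholds strictly inside the measured span gives hysteresis-friendly bands, so small noise around the rest level does not flicker the classifier between states.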
## Future Enhancements
- Offline model quantization
- Cloud session sync for multi-device calibration
- Haptic profile customization
## Learnings
Building this prototype deepened our understanding of biosignal noise handling, UI latency constraints, and the classification feedback loops needed to earn user trust.