A wearable embedded system designed to assist deaf-blind individuals in communication by translating hand gestures into audible speech and tactile feedback.
To bridge the communication gap for deaf-blind individuals by providing a real-time, intuitive, and accessible interaction system.
- Controller: ESP32-S3 microcontroller
- Sensors: MPU6050 IMU, Flex Sensors
- Communication: Bluetooth Low Energy (BLE)
- Output: DFPlayer Mini (audio), vibration motors (tactile feedback)
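Flex sensor readings from the ESP32-S3's 12-bit ADC (raw values 0-4095) must be calibrated per finger before gestures can be compared. As a minimal sketch, assuming a simple two-point calibration (raw readings recorded with the finger straight and fully bent; the `FlexCalibration` struct and the specific numbers are hypothetical, not the project's actual values):

```cpp
#include <cassert>

// Per-sensor calibration: raw ADC readings recorded once with the
// finger held straight and fully bent (illustrative values only).
struct FlexCalibration {
    int rawStraight;  // ADC reading, finger straight
    int rawBent;      // ADC reading, finger fully bent
};

// Linearly map a raw 12-bit ADC reading (0-4095) to a bend fraction
// in [0.0, 1.0], clamped to the calibrated range.
float bendFraction(int raw, const FlexCalibration& cal) {
    float f = static_cast<float>(raw - cal.rawStraight) /
              static_cast<float>(cal.rawBent - cal.rawStraight);
    if (f < 0.0f) f = 0.0f;
    if (f > 1.0f) f = 1.0f;
    return f;
}
```

Clamping keeps noisy readings outside the calibrated range from producing bend values below 0 or above 1, which would otherwise confuse downstream gesture matching.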
- Hand gestures are captured using flex sensors and an IMU
- Sensor data is processed by the ESP32-S3
- Gestures are mapped to predefined letters and commands
- Output is generated as:
  - Audible speech (via speaker)
  - Tactile vibration feedback
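The mapping step above can be sketched as a lookup from binarized finger states to a predefined symbol. This is an illustrative sketch only: the letter assignments and the five-finger patterns below are hypothetical, not the project's actual gesture table.

```cpp
#include <array>
#include <string>

// Binarized finger states: true = finger bent past its threshold.
// Order: thumb, index, middle, ring, pinky.
using FingerStates = std::array<bool, 5>;

// Map a finger-state pattern to a predefined letter or command.
// The patterns and letters are hypothetical examples.
std::string classifyGesture(const FingerStates& fingers) {
    // All fingers bent (fist) -> "A"
    if (fingers == FingerStates{true, true, true, true, true}) return "A";
    // All fingers straight (open palm) -> "B"
    if (fingers == FingerStates{false, false, false, false, false}) return "B";
    // Only index straight -> "D"
    if (fingers == FingerStates{true, false, true, true, true}) return "D";
    return "?";  // unrecognized gesture
}
```

A table-driven version (patterns stored in an array alongside their labels) scales better as the gesture vocabulary grows, but the explicit comparisons keep the control flow obvious here.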
- Real-time gesture recognition
- Wearable and portable system
- Dual feedback (audio + vibration)
- Wireless communication using BLE
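Dual feedback means each recognized gesture drives two outputs at once: a numbered track on the DFPlayer Mini (which plays files such as 0001.mp3 by index) and a vibration pulse. A minimal sketch of that dispatch, assuming hypothetical track numbers and pulse durations:

```cpp
#include <string>

// Output plan for one recognized gesture. Track numbers and pulse
// durations below are illustrative assumptions, not project values.
struct FeedbackPlan {
    int audioTrack;  // DFPlayer Mini track index (0001.mp3, 0002.mp3, ...)
    int vibrateMs;   // vibration motor pulse length in milliseconds
};

FeedbackPlan planFeedback(const std::string& gesture) {
    if (gesture == "A") return {1, 150};
    if (gesture == "B") return {2, 150};
    // Track 0 = play nothing; a longer buzz signals "unrecognized"
    // so the wearer still gets tactile confirmation of the attempt.
    return {0, 400};
}
```

Keeping the plan as plain data separates the decision (which feedback) from the hardware drivers that execute it, which makes the mapping easy to unit-test off-device.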
- Embedded system design
- Sensor fusion and calibration
- Wearable technology development
- Real-time processing and control
- Assistive technology for disabled individuals
- Human-computer interaction systems
- Wearable IoT devices
This project demonstrates the practical implementation of embedded systems, wearable technology, and real-time gesture-based communication in an impactful real-world application.