Gesture Recognition
Our gesture-controlled home automation system uses a TinyML gesture-classification model to map hand movements to precise control signals, enabling seamless, intuitive control of smart devices.
System Architecture
The system integrates hardware and software components for efficient data capture, processing, and execution of smart device commands:
- Fingerprint authentication on the ESP32 unlocks gesture control, enhancing security.
- The MPU6050 sensor captures motion data from wrist movements on the wearable.
- A session timeout mechanism ensures automatic lockout after inactivity.
- Motion data is processed locally using an Edge Impulse TinyML model on the ESP32 to classify gestures.
- Recognized gestures are published as MQTT messages to AWS IoT Core.
- Smart home devices (lights, locks, sockets) subscribed to MQTT topics respond instantly to gestures.
- AWS IoT Core triggers AWS Lambda functions which update Firebase with new device states.
- The Flutter mobile app listens to Firebase in real time to reflect device status updates in the UI.
- User-triggered actions from the mobile app update Firebase directly (e.g., toggle light).
- Google Cloud Functions detect these Firebase updates and publish corresponding MQTT messages back to IoT Core.
- This ensures mobile commands reach the smart devices via the same MQTT pipeline without loops or conflicts.
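The round trip between Firebase and IoT Core stays loop-free only if each side can tell who originated an update. A minimal sketch of that guard in Python, assuming a hypothetical `origin` field on each device-state record (the field name is an illustration, not taken from the project):

```python
def should_republish_to_mqtt(update: dict) -> bool:
    """Cloud Function guard: only app-originated updates go back to IoT Core.

    Device-originated updates were written by the Lambda that already handled
    the MQTT message, so republishing them would create a loop.
    """
    return update.get("origin") == "app"


def firebase_state_from_mqtt(payload: dict) -> dict:
    """Lambda side: tag device-originated states so the Cloud Function skips them."""
    return {"state": payload["state"], "origin": "device"}
```

With this convention, a Lambda write such as `firebase_state_from_mqtt({"state": "on"})` is ignored by the Cloud Function, while app toggles (tagged `"origin": "app"`) are republished.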
Key Features
- Hands-free operation with intuitive gestures.
- Dual authentication for door locks (fingerprint and gesture-based).
- Real-time device control and status updates via a Flutter mobile app.
- Secure communication using MQTT and AWS IoT Core.
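The fingerprint-unlock plus inactivity-timeout behaviour described above is essentially a small session state machine. A sketch in Python (the timeout value and method names are illustrative, not taken from the firmware):

```python
import time


class GestureSession:
    """Gesture control unlocks after fingerprint auth and auto-locks on inactivity."""

    def __init__(self, timeout_s: float = 30.0):
        self.timeout_s = timeout_s
        self._last_activity = None  # None means locked

    def unlock(self) -> None:
        """Called after a successful fingerprint match on the wearable."""
        self._last_activity = time.monotonic()

    def gesture_allowed(self) -> bool:
        """True only while an authenticated session is still fresh."""
        if self._last_activity is None:
            return False
        if time.monotonic() - self._last_activity > self.timeout_s:
            self._last_activity = None  # session expired: lock out
            return False
        self._last_activity = time.monotonic()  # activity refreshes the timer
        return True
```

Each recognized gesture refreshes the timer, so the lockout only triggers after a genuinely idle period.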
Hardware Components
Component | Description |
---|---|
ESP32 Dev Module | Processes sensor data, runs the Edge Impulse ML model, and communicates with AWS IoT Core via MQTT. |
MPU6050 Sensor | Captures precise motion data for gesture recognition. |
Fingerprint Door Lock | Enables secure fingerprint-based door unlocking. |
Smart Sockets & Light Modules | Controlled via ESP32 to automate appliances and lighting based on gesture commands. |
R502 Fingerprint Scanner | Authenticates the user by fingerprint on the wristband. |
Software Components
Component | Description |
---|---|
Edge Impulse ML Model | Classifies gestures in real time on the ESP32 microcontroller. |
MQTT Communication | Enables reliable message delivery between ESP32 and AWS IoT Core. |
AWS IoT Core | Acts as the central MQTT broker for managing device commands. |
Firebase | Provides real-time database updates and role-based access control (RBAC). |
Flutter Mobile App | Displays real-time device status, logs, and gesture configuration settings. |
GCP Function | Catches Firebase update triggers from the mobile app and publishes them to AWS IoT Core. |
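The table above names the moving parts; the glue between them is the MQTT topic layout that the ESP32, the devices, and the cloud functions all agree on. One plausible convention, sketched in Python (the topic prefix and payload fields are assumptions, not taken from the project):

```python
import json


def command_topic(device_id: str) -> str:
    """Topic a device subscribes to for its commands (naming is illustrative)."""
    return f"flicknest/devices/{device_id}/cmd"


def command_payload(action: str, source: str) -> str:
    """JSON body published over MQTT; 'source' marks gesture vs. app origin."""
    return json.dumps({"action": action, "source": source})
```

Keeping the topic scheme and payload shape identical for gesture-originated and app-originated commands is what lets both paths share the same MQTT pipeline.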
System Workflow
- Gesture Captured: MPU6050 records wrist movements and sends data to the ESP32.
- Gesture Classified: Edge Impulse ML model identifies the gesture and maps it to a specific command.
- MQTT Message Published: ESP32 sends the command to AWS IoT Core via MQTT.
- Device Receives Command: Subscribed devices (door lock, smart socket, or light socket) perform the corresponding action.
- Firebase & UI Update: AWS Lambda updates Firebase, and the Flutter app reflects the status in real time.
Budget Breakdown
Item | Quantity | Unit Cost (LKR) | Total (LKR) |
---|---|---|---|
Seeed XIAO ESP32 Board | 1 | 3,200 | 3,200 |
ESP32 Dev Module Board | 4 | 2,400 | 9,600 |
IMU Sensor | 1 | 1,000 | 1,000 |
R502 Fingerprint Sensor | 1 | 6,300 | 6,300 |
Battery Pack | 1 | 200 | 200 |
Plug Sockets | 1 | 1,000 | 1,000 |
Electronic Door Lock | 1 | 2,500 | 2,500 |
230V to 5V Converters | 4 | 300 | 1,200 |
Relays, Triacs, Resistors, etc. | 4 | 300 | 1,200 |
Wires, Soldering Components | 1 | 3,000 | 3,000 |
Other Expenses | 1 | 2,000 | 2,000 |
Flexible 3D-Printed Wearable Band | 1 | 1,800 | 1,800 |
Total | | | 33,000 |
Conclusion
This design ensures a seamless, low-latency, and secure gesture-controlled home automation experience. By integrating ESP32 modules, a fingerprint scanner, smart sockets, and MQTT-based communication, our system enables effortless automation and real-time device control.
Future Developments
- Integration with voice assistants like Alexa and Google Assistant.
- Expansion to support more devices and appliances.
- Enhanced machine learning models for improved gesture recognition accuracy.
Commercialization Plans
- Partnering with smart home device manufacturers for integration.
- Launching a subscription-based service for advanced features.
- Expanding to international markets with localized support.
GitHub Repository
Explore our project on GitHub: FlickNest GitHub Repo