Welcome to
FlickNest

Transforming intuitive hand gestures into commands for controlling smart devices.
Hands-free, inclusive, and privacy-focused automation.

Gesture-Controlled Smart Automation

Introduction

Smart home automation has transformed the way we interact with our living spaces, but traditional control methods like smartphone apps and voice commands often fall short in terms of efficiency, accessibility, and convenience. Our project introduces a gesture-controlled home automation system, enabling users to seamlessly control devices using simple hand movements. By integrating machine learning, IoT, and cloud computing, we provide an intuitive, hands-free solution for smart living.

Real-World Problem & Solution

Existing smart home control systems lack inclusivity for individuals with physical disabilities, elderly users, or those engaged in activities where accessing a phone or voice assistant is inconvenient. Moreover, many gesture-based systems depend on external cameras, limiting their reliability and practicality for home automation.

Our solution utilizes a wearable FlickNest band equipped with an MPU6050 sensor and ESP32 microcontroller to capture and process hand gestures. The data is processed locally using Edge Impulse’s TinyML model, then transmitted via MQTT to AWS IoT Core, where a Lambda function updates Firebase in real time. This allows connected devices—such as smart locks, lights, and appliances—to respond instantly. A Flutter mobile app keeps users informed of device states, ensuring a seamless and intelligent home automation experience.
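To make the cloud step concrete, here is a minimal Python sketch of what the Lambda function between AWS IoT Core and Firebase might look like. The gesture names, device paths, and confidence threshold are illustrative assumptions, not the project's actual values; the real function would write through the Firebase Admin SDK, whereas this sketch simply returns the update it would apply.

```python
# Hypothetical mapping from recognized gestures to device state updates.
# Topic paths and gesture labels are illustrative only.
GESTURE_ACTIONS = {
    "flick_up":   ("devices/living_room_light/state", "on"),
    "flick_down": ("devices/living_room_light/state", "off"),
    "rotate_cw":  ("devices/front_door_lock/state", "locked"),
}

def lambda_handler(event, context=None):
    """Translate an AWS IoT Core MQTT message into a Firebase update.

    `event` is assumed to be the JSON payload published by the band,
    e.g. {"gesture": "flick_up", "confidence": 0.94}.
    """
    gesture = event.get("gesture")
    confidence = event.get("confidence", 0.0)

    # Ignore unknown or low-confidence classifications rather than
    # toggling devices on noise.
    if gesture not in GESTURE_ACTIONS or confidence < 0.8:
        return {"statusCode": 204, "updated": None}

    path, value = GESTURE_ACTIONS[gesture]
    # In the deployed function this would be a Firebase Admin SDK call,
    # e.g. db.reference(path).set(value); here we return the update.
    return {"statusCode": 200, "updated": {path: value}}
```

Keeping the handler a pure function of its input event makes the gesture-to-device mapping easy to unit-test without any AWS or Firebase connectivity.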

Impact

🔹 Enhanced Accessibility – Empowers individuals with disabilities by providing a hands-free way to interact with smart devices.
🔹 Greater Convenience – Eliminates the need for phones or voice assistants, offering faster and more natural smart home control.
🔹 Improved Security – Enables gesture-based authentication for secure control of smart locks and home automation systems.
🔹 Energy Efficiency – Reduces power consumption by allowing users to control lights and appliances with simple gestures.
🔹 Scalability & Adaptability – Supports multiple environments, making it ideal for homes, offices, and industrial applications.

By combining AI-driven gesture recognition, IoT connectivity, and cloud-based automation, this project is redefining the future of smart living with a highly responsive, secure, and inclusive solution.

System Architecture

The system runs TinyML gesture classification locally on the ESP32 using Edge Impulse, enabling offline detection and low-latency control. Detected gestures are published via MQTT to AWS IoT Core. AWS Lambda functions then update Firebase in real time. For frontend-initiated state changes, Firebase triggers a Google Cloud Function, which publishes updates to the ESP32 without creating feedback loops. Home automation devices subscribe to these MQTT topics for instant response. This event-driven, serverless architecture ensures fast, secure, and scalable smart home control.
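The "without feedback loops" step can be sketched as a simple origin check in the Cloud Function: only state changes written by the app are pushed back out over MQTT, while changes the device itself caused (written by the Lambda function) are not echoed back to it. The `origin` field and its values here are an assumed convention, not a documented part of the project.

```python
def should_republish(change: dict) -> bool:
    """Decide whether a Firebase state change is pushed back over MQTT.

    Assumed convention: every write is tagged with an "origin" field.
    Writes made by the Lambda function on behalf of the band carry
    origin="device" and must NOT be republished, or the ESP32 would
    receive its own update back, creating a loop. Only app-initiated
    writes (origin="app") go out to the device.
    """
    return change.get("origin") == "app"
```

In the deployed Cloud Function, this predicate would gate the MQTT publish call; anything it rejects is simply absorbed by Firebase.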

Data Path

Gesture Recognition

The data flow begins with the MPU6050 capturing motion, which the ESP32 processes using Edge Impulse's TinyML model to classify gestures. Before any gesture control, fingerprint authentication is required on the band, adding a layer of user-specific security. Once the user is authenticated and a valid gesture is detected, the ESP32 publishes data via MQTT. Smart devices and AWS IoT Core subscribe to this topic. AWS IoT Core triggers a Lambda function to update Firebase, while Google Cloud Functions handle mobile-triggered updates by publishing to MQTT, avoiding feedback loops. Session timeouts on the ESP32 ensure access is revoked after inactivity. The Flutter app stays in sync via Firebase, and devices react instantly to MQTT messages, achieving secure, low-latency home automation.
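The session-timeout behavior described above would live in the ESP32 firmware; the sketch below mirrors the same logic in Python for readability. The 120-second default and the class shape are assumptions for illustration, not the firmware's actual values.

```python
import time

class GestureSession:
    """Sketch of the band's authentication session.

    After a successful fingerprint scan, gestures are accepted until
    `timeout_s` seconds pass with no activity; then access is revoked
    and re-authentication is required. The 120 s default is illustrative.
    """
    def __init__(self, timeout_s: float = 120.0, clock=time.monotonic):
        self.timeout_s = timeout_s
        self.clock = clock          # injectable for deterministic tests
        self.last_activity = None   # None means "not authenticated"

    def authenticate(self):
        """Called after the fingerprint sensor confirms the user."""
        self.last_activity = self.clock()

    def gesture_allowed(self) -> bool:
        """Check (and refresh) the session before publishing a gesture."""
        if self.last_activity is None:
            return False
        if self.clock() - self.last_activity > self.timeout_s:
            self.last_activity = None   # revoke access on inactivity
            return False
        self.last_activity = self.clock()
        return True
```

Injecting the clock keeps the timeout logic testable without real waiting, which is the same property you would want in the firmware's unit tests.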

Testing

We conducted hardware, software, manual, and integration tests to ensure the reliability of our gesture-controlled home automation system.

1. Hardware Testing

• Verified gesture motion capture using the MPU6050 and ESP32.

• Ensured high accuracy (>90%) in gesture recognition with Edge Impulse.

• Measured response time for real-time device control.

2. Software Testing

• Used Flutter Fix to resolve UI issues and tested real-time Firebase synchronization.

• Validated MQTT communication using the AWS MQTT test client, ensuring stable, low-latency messaging.

• Tested the AWS Lambda function, confirming efficient data processing and Firebase updates.

• Tested the GCP Firebase Cloud Function to confirm that frontend inputs trigger the appropriate Firebase updates.

3. Manual & Integration Tests

• Performed 100+ gesture recognition tests to validate ML accuracy.

• Conducted end-to-end system tests, ensuring seamless interaction between ESP32, MQTT, AWS, Firebase, GCP, and Flutter.

• Tested network stability, multi-user scenarios, and smart device compatibility.
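When inspecting MQTT traffic during tests like these, a lightweight payload check helps catch malformed messages before they reach device-control logic. The field names below are illustrative assumptions; the real payload format is defined by the band's firmware.

```python
def valid_gesture_payload(payload: dict) -> bool:
    """Minimal schema check for a gesture MQTT payload.

    Assumes a payload of the form
    {"gesture": "<label>", "confidence": <0.0-1.0>}; the actual
    message format used by the band may differ.
    """
    return (
        isinstance(payload.get("gesture"), str)
        and isinstance(payload.get("confidence"), (int, float))
        and 0.0 <= payload["confidence"] <= 1.0
    )
```

A check like this can be asserted on every message captured by the MQTT test client during integration runs.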