Welcome to FlickNest

Transforming intuitive hand gestures into commands for controlling smart devices.
Hands-free, inclusive, and privacy-focused automation.

FlickNest: Gesture-Controlled Smart Automation

Introduction

Smart home automation has transformed the way we interact with our living spaces, but traditional control methods like smartphone apps and voice commands often fall short in terms of efficiency, accessibility, and convenience. Our project introduces a gesture-controlled home automation system, enabling users to seamlessly control devices using simple hand movements. By integrating machine learning, IoT, and cloud computing, we provide an intuitive, hands-free solution for smart living.

Real-World Problem & Solution

Existing smart home control systems lack inclusivity for individuals with physical disabilities, elderly users, or those engaged in activities where accessing a phone or voice assistant is inconvenient. Moreover, many gesture-based systems depend on external cameras, limiting their reliability and practicality for home automation.

Our solution utilizes a wearable FlickNest band equipped with an MPU6050 sensor and ESP32 microcontroller to capture and process hand gestures. The data is processed locally using Edge Impulse’s TinyML model, then transmitted via MQTT to AWS IoT Core, where a Lambda function updates Firebase in real time. This allows connected devices—such as smart locks, lights, and appliances—to respond instantly. A Flutter mobile app keeps users informed of device states, ensuring a seamless and intelligent home automation experience.
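To make the transport step concrete, below is a minimal sketch of how the band's ESP32 might publish a classified gesture to AWS IoT Core over MQTT. It assumes the PubSubClient and WiFiClientSecure Arduino libraries; the endpoint, certificates, client ID, topic name (flicknest/gestures), and payload fields are illustrative placeholders rather than the project's actual configuration.

```cpp
// Hypothetical publisher sketch: send a classified gesture from the ESP32 band to AWS IoT Core.
// Endpoint, certificates, client ID, and the topic "flicknest/gestures" are placeholders.
#include <WiFi.h>
#include <WiFiClientSecure.h>
#include <PubSubClient.h>

static const char* WIFI_SSID     = "your-ssid";
static const char* WIFI_PASS     = "your-password";
static const char* AWS_ENDPOINT  = "xxxxxxxx-ats.iot.us-east-1.amazonaws.com";  // placeholder
static const char* GESTURE_TOPIC = "flicknest/gestures";                        // assumed topic

// Amazon root CA, device certificate, and private key issued by AWS IoT Core (paste real PEMs).
static const char AWS_ROOT_CA[] = "-----BEGIN CERTIFICATE-----\n...\n-----END CERTIFICATE-----\n";
static const char DEVICE_CERT[] = "-----BEGIN CERTIFICATE-----\n...\n-----END CERTIFICATE-----\n";
static const char DEVICE_KEY[]  = "-----BEGIN RSA PRIVATE KEY-----\n...\n-----END RSA PRIVATE KEY-----\n";

WiFiClientSecure tlsClient;
PubSubClient mqtt(tlsClient);

void connectAws() {
  WiFi.begin(WIFI_SSID, WIFI_PASS);
  while (WiFi.status() != WL_CONNECTED) delay(250);

  // Mutual TLS with the device certificate, as required by AWS IoT Core.
  tlsClient.setCACert(AWS_ROOT_CA);
  tlsClient.setCertificate(DEVICE_CERT);
  tlsClient.setPrivateKey(DEVICE_KEY);

  mqtt.setServer(AWS_ENDPOINT, 8883);
  while (!mqtt.connected()) {
    mqtt.connect("flicknest-band");   // client ID is a placeholder
    delay(500);
  }
}

// Publish one recognized gesture with its confidence score as a small JSON payload.
void publishGesture(const char* label, float confidence) {
  char payload[96];
  snprintf(payload, sizeof(payload),
           "{\"gesture\":\"%s\",\"confidence\":%.2f}", label, confidence);
  mqtt.publish(GESTURE_TOPIC, payload);
}

void setup() { connectAws(); }

void loop() { mqtt.loop(); }  // keep the session alive; publishGesture() is called from the inference code
```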

Impact

🔹 Enhanced Accessibility – Empowers individuals with disabilities by providing a hands-free way to interact with smart devices.
🔹 Greater Convenience – Eliminates the need for phones or voice assistants, offering faster and more natural smart home control.
🔹 Improved Security – Enables gesture-based authentication for secure control of smart locks and home automation systems.
🔹 Energy Efficiency – Reduces power consumption by allowing users to control lights and appliances with simple gestures.
🔹 Scalability & Adaptability – Supports multiple environments, making it ideal for homes, offices, and industrial applications.

By combining AI-driven gesture recognition, IoT connectivity, and cloud-based automation, this project is redefining the future of smart living with a highly responsive, secure, and inclusive solution.

System Architecture

The system leverages Edge Impulse's ML capabilities to classify gestures locally on the ESP32, reducing latency and reliance on cloud processing. Using MQTT as the primary communication protocol, the ESP32 efficiently transmits gesture data to AWS IoT Core, ensuring seamless integration with cloud services. AWS Lambda functions process incoming data and update Firebase, enabling real-time synchronization with the Flutter frontend. The home automation devices, such as smart locks, lights, and appliances, act as MQTT subscribers, allowing immediate response to recognized gestures. This architecture ensures a scalable, low-latency, and secure IoT ecosystem for home automation.
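To illustrate the subscriber side of this architecture, here is a hedged sketch of an actuator node (for example, a relay-driven light) that listens on the same assumed gesture topic and toggles its output when a matching gesture message arrives. The broker address, GPIO pin, topic name, and gesture label ("flick_up") are all assumptions for illustration, not the project's actual values.

```cpp
// Hypothetical subscriber node: a relay-controlled light that reacts to gesture messages.
// Broker address, topic, pin, and gesture label mirror the assumptions in the publisher sketch.
#include <WiFi.h>
#include <PubSubClient.h>

static const int   RELAY_PIN     = 5;                     // GPIO driving the light relay (assumed)
static const char* GESTURE_TOPIC = "flicknest/gestures";  // assumed topic

WiFiClient net;
PubSubClient mqtt(net);

// Toggle the light when a "flick_up" gesture arrives; the label is a placeholder.
void onMessage(char* topic, byte* payload, unsigned int length) {
  String msg;
  for (unsigned int i = 0; i < length; i++) msg += (char)payload[i];
  if (msg.indexOf("\"flick_up\"") >= 0) {
    digitalWrite(RELAY_PIN, !digitalRead(RELAY_PIN));
  }
}

void setup() {
  pinMode(RELAY_PIN, OUTPUT);
  WiFi.begin("your-ssid", "your-password");
  while (WiFi.status() != WL_CONNECTED) delay(250);

  mqtt.setServer("broker.local", 1883);   // placeholder broker address
  mqtt.setCallback(onMessage);
  while (!mqtt.connected()) { mqtt.connect("flicknest-light"); delay(500); }
  mqtt.subscribe(GESTURE_TOPIC);
}

void loop() {
  mqtt.loop();   // service incoming MQTT messages
}
```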

Data Path

Gesture Recognition

The data flow begins with MPU6050 capturing motion data, which the ESP32 processes using Edge Impulse's TinyML model to classify gestures. Once a valid gesture is detected, the ESP32 publishes the processed data to an MQTT broker, where multiple subscribers, including AWS IoT Core and smart devices, receive updates. AWS IoT Core routes the data to an AWS Lambda function, which updates Firebase in real time. The Flutter app listens for Firebase updates, ensuring UI synchronization with device states. Meanwhile, smart devices subscribed to the MQTT broker react instantly, enabling responsive home automation.
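The on-device classification step could look roughly like the sketch below, which fills a window of accelerometer samples from the MPU6050 (here via the Adafruit_MPU6050 library) and passes it to the Edge Impulse classifier exported as an Arduino library. The inferencing header name, the accelerometer-only (3-axis) window, and the 0.85 confidence threshold are assumptions; the actual model may use different axes, labels, and thresholds.

```cpp
// Hypothetical inference loop: sample the MPU6050, run the Edge Impulse classifier on-device,
// and report confident predictions (which would then be handed to the MQTT publish step).
#include <Wire.h>
#include <Adafruit_MPU6050.h>
#include <Adafruit_Sensor.h>
#include <FlickNest_inferencing.h>   // placeholder: header generated by the Edge Impulse Arduino export

Adafruit_MPU6050 mpu;
static float features[EI_CLASSIFIER_DSP_INPUT_FRAME_SIZE];  // one window of accel x/y/z samples

void setup() {
  Serial.begin(115200);
  Wire.begin();
  mpu.begin();   // MPU6050 on the default I2C pins
}

void loop() {
  // Fill one window of accelerometer samples at the model's sampling interval.
  for (size_t i = 0; i < EI_CLASSIFIER_DSP_INPUT_FRAME_SIZE; i += 3) {
    sensors_event_t a, g, temp;
    mpu.getEvent(&a, &g, &temp);
    features[i + 0] = a.acceleration.x;
    features[i + 1] = a.acceleration.y;
    features[i + 2] = a.acceleration.z;
    delay((uint32_t)EI_CLASSIFIER_INTERVAL_MS);
  }

  // Wrap the buffer in a signal and run the TinyML classifier locally.
  signal_t signal;
  numpy::signal_from_buffer(features, EI_CLASSIFIER_DSP_INPUT_FRAME_SIZE, &signal);
  ei_impulse_result_t result;
  if (run_classifier(&signal, &result, false) != EI_IMPULSE_OK) return;

  // Report only confident predictions; the 0.85 threshold is an assumption.
  for (size_t i = 0; i < EI_CLASSIFIER_LABEL_COUNT; i++) {
    if (result.classification[i].value > 0.85f) {
      Serial.printf("Gesture: %s (%.2f)\n",
                    result.classification[i].label, result.classification[i].value);
      // Hand off to the MQTT publish step here (see the AWS IoT Core sketch above).
    }
  }
}
```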

Testing

We conducted hardware, software, manual, and integration tests to ensure the reliability of our gesture-controlled home automation system.

1. Hardware Testing

• Verified gesture motion capture using MPU6050 & ESP32.

• Ensured high accuracy (>85%) in gesture recognition with Edge Impulse.

• Measured response time for real-time device control.

2. Software Testing

• Used Flutter Fix to resolve UI issues and tested real-time Firebase synchronization.

• Validated MQTT communication using test clients, ensuring stable, low-latency messaging.

• Tested AWS Lambda function, confirming efficient data processing and Firebase updates.

3. Manual & Integration Tests

• Performed 100+ gesture recognition tests to validate ML accuracy.

• Conducted end-to-end system tests, ensuring seamless interaction between ESP32, MQTT, AWS, Firebase, and Flutter.

• Tested network stability, multi-user scenarios, and smart device compatibility.
