AirWave: Hands-Free Cursor Navigation with Face and Voice


Authors : Thippeswamy G; Akash R; Ankit Suresh Savalagi; Dayanidhi M K; Dileep R

Volume/Issue : Volume 10 - 2025, Issue 5 - May


Google Scholar : https://tinyurl.com/33j5x9cb

DOI : https://doi.org/10.38124/ijisrt/25may1183

Note : A published paper may take 4-5 working days from the publication date to appear in PlumX Metrics, Semantic Scholar, and ResearchGate.


Abstract : Context: Hands-free cursor navigation is indispensable for improving computer accessibility for physically disabled individuals. Most existing systems control mouse actions through facial gestures and voice commands, but these solutions often face constraints such as high sensitivity to the environment, user fatigue, and reliance on advanced hardware. This paper therefore describes a lightweight yet scalable system optimized for real-world conditions.  Objectives: AirWave develops a hands-free cursor navigation system driven by facial gestures and offline voice commands. The focus is on real-time performance with minimal hardware requirements, making the system accessible and user-friendly for people with disabilities. The application maps facial gestures such as head tilts and blinks to cursor actions and integrates voice commands for advanced controls.  Method: The system uses Dlib for facial landmark detection, OpenCV for video processing, and Vosk for offline speech recognition. It reads real-time video input from a webcam, detects facial gestures, and maps them to cursor movements using PyAutoGUI. Predefined voice commands trigger mouse actions and higher-level operations such as opening applications or scrolling.  Result: A study of the surveyed research papers provided critical insights into the current advancements and limitations of hands-free cursor navigation systems. The findings highlight the need for larger datasets and more sophisticated models to improve accuracy.  Conclusion: AirWave demonstrates the potential for accessible hands-free computing by addressing environmental sensitivity and hardware constraints. It provides a scalable, efficient solution, with future scope in multilingual commands, adaptive recognition, and IoT integration.
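Illustrative sketch: The Method above outlines a gesture-to-cursor pipeline (OpenCV capture, Dlib landmark detection, PyAutoGUI cursor control). The minimal Python loop below illustrates that pipeline under stated assumptions: it relies on the standard Dlib 68-point model file (shape_predictor_68_face_landmarks.dat), the blink threshold and movement gain are placeholder values rather than those used by AirWave, and the Vosk voice-command listener described in the abstract would run in a separate thread and is omitted here.

import cv2
import dlib
import numpy as np
import pyautogui

EAR_THRESHOLD = 0.21   # assumed blink threshold, not the paper's value
MOVE_GAIN = 2.5        # assumed head-movement-to-cursor gain

detector = dlib.get_frontal_face_detector()
predictor = dlib.shape_predictor("shape_predictor_68_face_landmarks.dat")

def eye_aspect_ratio(eye):
    # EAR = (||p2-p6|| + ||p3-p5||) / (2 * ||p1-p4||); drops sharply during a blink.
    a = np.linalg.norm(eye[1] - eye[5])
    b = np.linalg.norm(eye[2] - eye[4])
    c = np.linalg.norm(eye[0] - eye[3])
    return (a + b) / (2.0 * c)

cap = cv2.VideoCapture(0)
prev_nose = None
while True:
    ok, frame = cap.read()
    if not ok:
        break
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    for rect in detector(gray, 0):
        shape = predictor(gray, rect)
        pts = np.array([[shape.part(i).x, shape.part(i).y] for i in range(68)])

        # Nose-tip displacement (landmark 30) drives relative cursor movement.
        nose = pts[30]
        if prev_nose is not None:
            dx, dy = (nose - prev_nose) * MOVE_GAIN
            pyautogui.moveRel(int(dx), int(dy))
        prev_nose = nose

        # Left-eye blink (landmarks 36-41) is mapped to a left click.
        if eye_aspect_ratio(pts[36:42]) < EAR_THRESHOLD:
            pyautogui.click()

    cv2.imshow("AirWave sketch", frame)
    if cv2.waitKey(1) & 0xFF == 27:   # press Esc to quit
        break

cap.release()
cv2.destroyAllWindows()

Using relative nose-tip displacement, rather than absolute position, keeps the cursor steady when the user's head drifts slowly and avoids a per-user calibration step.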

Keywords : Facial Gesture Recognition, Voice Command Integration, Eye Aspect Ratio (EAR), Mouth Aspect Ratio (MAR), Contrast Limited Adaptive Histogram Equalization (CLAHE).
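Illustrative sketch: The keywords EAR, MAR, and CLAHE refer to standard computer-vision quantities. The helpers below show one common way to compute them from the Dlib 68-point landmarks (the EAR helper already appears in the pipeline sketch above); the CLAHE parameters and the MAR landmark indexing are typical conventions from the literature, not values confirmed by the paper.

import cv2
import numpy as np

def enhance_contrast(gray_frame):
    # CLAHE equalizes contrast locally so landmark detection stays stable under
    # uneven lighting; clipLimit and tileGridSize here are commonly used defaults.
    clahe = cv2.createCLAHE(clipLimit=2.0, tileGridSize=(8, 8))
    return clahe.apply(gray_frame)

def mouth_aspect_ratio(inner_lip):
    # inner_lip: Dlib landmarks 60-67 as an (8, 2) array.
    # MAR averages three vertical lip gaps over the horizontal mouth width;
    # a sustained high value can be bound to an action such as scrolling.
    a = np.linalg.norm(inner_lip[1] - inner_lip[7])   # landmarks 61-67
    b = np.linalg.norm(inner_lip[2] - inner_lip[6])   # landmarks 62-66
    c = np.linalg.norm(inner_lip[3] - inner_lip[5])   # landmarks 63-65
    d = np.linalg.norm(inner_lip[0] - inner_lip[4])   # landmarks 60-64
    return (a + b + c) / (3.0 * d)

In the pipeline sketch above, enhance_contrast(gray) would be applied before the detector call, and mouth_aspect_ratio(pts[60:68]) would be checked alongside the EAR test to trigger an additional gesture.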

References :

  1. “Hands-Free Mouse Control Using Facial Feature,” IEEE Conference Publication, 2024. [Online]. Available: IEEE Xplore.
  2. “Gesture Based Mouse Control,” IEEE Conference Publication, 2018. [Online]. Available: IEEE Xplore.
  3. “Mouse Cursor Control Using Facial Movements,” IEEE Conference Publication, 2022. [Online]. Available: IEEE Xplore.
  4. “Identifying Facial Gestures to Emulate a Mouse,” IEEE Conference Publication, 2017. [Online]. Available: IEEE Xplore.
  5. “Mouse Cursor Movement and Control Using Eye Gaze,” IEEE Conference Publication, 2023. [Online]. Available: IEEE Xplore.
  6. “Gesture-Driven Virtual Mouse with a Voice Assistant,” 2023 6th International Conference on Recent Trends in Advance Computing (ICRTAC).
  7. “Computer Navigation Using Audio and Video Aid for Amputees and Parkinson’s Patients,” 2022 5th International Conference on Advances in Science and Technology (ICAST).
  8. “Comparative Analysis of Hands-Free Mouse Controlling Based on Face Tracking,” 2021 13th International Conference on Information & Communication Technology and System (ICTS).
  9. “Virtual Mouse Using Hand and Eye Gestures,” 2023 International Conference on Data Science, Agents, and Artificial Intelligence (ICDSAAI), IEEE. DOI: 10.1109/ICDSAAI59313.2023.10452550.
  10. “An Analysis on Virtual Mouse Control using Human Eye,” 2024 5th International Conference on Image Processing and Capsule Networks (ICIPCN), IEEE. DOI: 10.1109/ICIPCN63822.2024.00045.
  11. “Mouse Cursor Control with Eye Gestures,” 7th International Conference on Inventive Computation Technologies (ICICT 2024), IEEE.
  12. “Face Gesture Based Virtual Mouse Using Mediapipe,” IEEE 8th International Conference for Convergence in Technology (I2CT 2023).
  13. “Human Computer Interaction Based Eye-Controlled Mouse,” 3rd International Conference on Electronics Communication and Aerospace Technology (ICECA 2019), IEEE.
  14. “Cursor Control Based on Eyeball Movement Using Deep Learning,” 2023 Intelligent Computing and Control for Engineering and Business Systems (ICCEBS).
  15. “Computer Mouse Control Using Iris Tracking: An Accessible and Cost-Effective Approach for People with Mobility Disabilities,” 2023 42nd IEEE International Conference of the Chilean Computer Science Society (SCCC).
  16. “Mouse Cursor Controlled by Eye Movement for Individuals with Disabilities,” 2023 7th International Conference on Intelligent Computing and Control Systems (ICICCS).
  17. “Controlling Mouse Motions Using Eye Tracking Using Computer Vision,” Proceedings of the International Conference on Intelligent Computing and Control Systems (ICICCS 2020), IEEE.
  18. “Control the Movement of Mouse Using Computer Vision Technique,” Proceedings of the Sixth International Conference on Electronics, Communication, and Aerospace Technology (ICECA 2022), IEEE.
  19. “EyeGaze Control: Enhancing Mouse Cursor Precision Through Eyeball Movements,” 2024 IEEE Students Conference on Engineering and Systems (SCES).
  20. “Gesture and Voice Controlled Virtual Mouse for Elderly People,” 2024 IEEE International Conference on Networking and Communications (ICNWC).
  21. “Facial Movement and Voice Recognition Based Mouse Cursor Control,” 2023 IEEE International Conference on Smart Electronics and Communication (ICOSEC).
  22. “An Efficient Mouse Tracking System Using Facial Gestures,” 2022 8th International Conference on Advanced Computing and Communication Systems (ICACCS).
  23. “Assisting the Differently Abled Person Using Eye Mouse,” 2024 9th International Conference on Science Technology Engineering and Mathematics (ICONSTEM).
  24. “Computer Cursor Control Using Eye and Face Gestures,” 2020 11th ICCCNT.
  25. “Eyeball-Based Cursor Movement Control,” International Conference on Communication and Signal Processing, July 28-30, 2020, India.
  26. “BLINK-CON: A Hands-Free Mouse Pointer Control with Eye Gaze Tracking,” 2021 IEEE Mysore Subsection International Conference (MysuruCon).
  27. “Touchless Head-Control (THC): Head Gesture Recognition for Cursor and Orientation Control,” IEEE Transactions on Neural Systems and Rehabilitation Engineering, Vol. 30, 2022.
  28. “Facial Movements Based Mouse Cursor Control for Physically Disabled Individuals,” International Journal of Engineering & Science Research (IJESR), Sep 2023, Vol-13, Issue-3.
  29. “Hands-Free Gesture and Voice Control for System Interfacing,” International Journal on Recent and Innovative Trends in Computing and Communication (IJRITCC), ICEMTE-2017.
  30. “Hands-Free Gesture and Voice Control for System Interfacing,” International Conference on Emanations in Modern Technology and Engineering (ICEMTE-2017).
  31. “A Prototype System for Controlling a Computer by Head Movements and Voice Commands,” Technical Report, 2017.
  32. “A Overview on Designing of Hands-Free Mouse Pointer for Motor Impairment People Using Motion Tracking and Speech Recognition,” International Journal of Engineering Research & Technology (IJERT), June 2013, Volume 2, Issue 6.
  33. “Hands-Free PC Control: Controlling of Mouse Cursor Using Eye Movement,” International Journal of Scientific and Research Publications, Volume 2, Issue 4, April 2012.
  34. “Cursor Control with Facial Gestures Using CNN,” Technical Report, Tribhuvan University, May 2023.
  35. “Face Gesture and Speech Based Virtual Mouse and Virtual Assistant,” International Journal of Creative Research Thoughts, Volume 12, Issue 5, May 2024.
  36. “Mouse Cursor Control Using Facial Movements,” International Journal of Creative Research Thoughts (IJCRT), Volume 12, Issue 4, April 2024.
  37. “Computer Cursor Tracking Using Eye Movement, Gesture Sign Language, and Voice Commands,” International Journal of Research Publication and Reviews, Volume 5, Issue 1, January 2024.
  38. “Facial-Expression Based Mouse Cursor Control for Physically Challenged Individuals,” International Research Journal of Engineering and Technology (IRJET), Volume 10, Issue 4, April 2023.
  39. “Mouse Cursor’s Movements Using Voice Controlled Mouse Pointer,” International Journal of Computer Applications, Volume 71, Issue 7, May 2013.
  40. “Low-Cost Human–Machine Interface for Computer Control with Facial Landmark Detection and Voice Commands,” Sensors 2022, Volume 22, Article 9279 (MDPI).
  41. “Cursor Tracking by Sensory Organs for Handicapped People,” International Journal of Advance Research, Ideas, and Innovations in Technology, Volume 5, Issue 6, 2019.
  42. “Human-Computer Interaction Based Head-Controlled Mouse,” MZUJHSS, Volume X, Issue 1, June 2024.
  43. “A Comprehensive Review of Face Recognition Techniques, Trends, and Challenges,” H. L. Gururaj, 2 July 2024.
  44. "Multimodal Biometric Human Recognition for Perceptual Human-Computer Interaction," IEEE Transactions on Systems, Man, and Cybernetics—Part C: Applications and Reviews, vol. 40, no. 6, pp. 740–751, Nov. 2010.
  45. "Comparing the Use of Single Versus Multiple Combined Abilities in Conducting Complex Computer Tasks Hands-Free," IEEE Transactions on Neural Systems and Rehabilitation Engineering, vol. 26, no. 9, pp. 1736–1745, Sep. 2018.
  46. "In the Eye of the Beholder: A Survey of Models for Eyes and Gaze," IEEE Transactions on Pattern Analysis and Machine Intelligence, vol. 32, no. 3, pp. 478–500, Mar. 2010.
  47. "A Multi-Gesture Interaction System Using a 3-D Iris Disk Model for Gaze Estimation and an Active Appearance Model for 3-D Hand Pointing," IEEE Transactions on Multimedia, vol. 13, no. 3, pp. 513–526, Jun. 2011.
  48. "Smart Wheelchair Based on Eye Tracking," in Proceedings of the Biomedical Engineering International Conference (BMEiCON), Laung Prabang, Laos, 2016, pp. 1–5.
  49. "Facial Landmark-Based Cursor Control and Speech-to-Text System for Paralyzed Individuals," in Proceedings of the IEEE International Conference on Sustainable Computing and Data Communication Systems (ICSCDS), Erode, India, 2023, pp. 1–8.
  50. "Video Face Detection Based on Improved SSD Model and Target Tracking Algorithm," Journal of Web Engineering, vol. 21, no. 2, pp. 135–152, 2022.
  51. "HeadTrack: Real-Time Human Computer Interaction via Wireless Earphones," IEEE Journal on Selected Areas in Communications, vol. 42, no. 4, pp. 1014–1025, Apr. 2024.
