Abstract: This paper presents a gesture-controlled music player system that interprets real-time hand gestures to control media functions such as play, pause, next/previous track, and volume adjustment. The project aims to deliver a touchless user experience, offering intuitive control of digital media via simple hand movements detected through a webcam. The system utilises computer vision technologies, including OpenCV and MediaPipe, with the gesture recognition logic implemented in Python. Designed with accessibility and hygiene in mind, the music player is particularly relevant in environments where hands-free interaction is preferred or necessary. Testing showed gesture recognition accuracy of over 90% under good lighting conditions, confirming the system's practical usability. Future enhancements will focus on integrating machine learning for personalised gesture mapping and on expanding cross-platform support.
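As an illustrative sketch of the pipeline the abstract describes, the snippet below captures webcam frames with OpenCV and runs MediaPipe's hand-landmark model on them. It is not the authors' implementation: the mapping from detected landmarks to media commands (play, pause, volume, and so on) is omitted and is assumed to be application-specific.

```python
# Minimal sketch: webcam capture with OpenCV + hand-landmark detection with
# MediaPipe. Classifying landmarks into gestures and dispatching media
# commands is left out; it would be specific to the paper's implementation.
import cv2
import mediapipe as mp

mp_hands = mp.solutions.hands
mp_draw = mp.solutions.drawing_utils

cap = cv2.VideoCapture(0)  # default webcam
with mp_hands.Hands(max_num_hands=1, min_detection_confidence=0.7) as hands:
    while cap.isOpened():
        ok, frame = cap.read()
        if not ok:
            break
        # MediaPipe expects RGB input; OpenCV captures frames in BGR
        results = hands.process(cv2.cvtColor(frame, cv2.COLOR_BGR2RGB))
        if results.multi_hand_landmarks:
            for hand in results.multi_hand_landmarks:
                mp_draw.draw_landmarks(frame, hand, mp_hands.HAND_CONNECTIONS)
                # A full system would classify these 21 landmarks into a
                # gesture (e.g. open palm, fist) and trigger a media command.
        cv2.imshow("Gesture-controlled music player (sketch)", frame)
        if cv2.waitKey(1) & 0xFF == 27:  # Esc to quit
            break

cap.release()
cv2.destroyAllWindows()
```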