International Journal of Innovative Research in Electrical, Electronics, Instrumentation and Control Engineering

A monthly Peer-reviewed & Refereed journal

ISSN Online 2321-2004
ISSN Print 2321-5526

Since 2013

Abstract: In recent years, the convergence of robotics, sensor technologies, and wireless communication has given rise to intelligent robotic systems capable of performing complex tasks with minimal human intervention. This research presents a mobile-controlled smart robot designed to enhance interactivity, autonomous navigation, and real-time feedback in academic and assistive environments. The robot is controlled wirelessly through a Bluetooth-enabled mobile application, allowing users to direct its movement manually in indoor settings such as educational institutions, offices, and exhibition halls. A key feature of the system is its object detection capability, achieved through ultrasonic sensors. These sensors enable the robot to identify and avoid obstacles in its path, facilitating safe, autonomous operation in dynamic environments. Upon detecting an object, the robot alters its course or halts, depending on the proximity and nature of the obstacle. This adaptability ensures smooth navigation without constant user supervision. In addition to its mobility and obstacle detection, the robot is equipped with a display module (such as an OLED or dot-matrix display) that visually communicates contextual information to the user, including system status, object-detection alerts, directional prompts, and predefined messages related to the robot's environment. Complementing this visual interface, an audio output system built around a DFPlayer Mini and a speaker delivers pre-recorded voice messages or real-time alerts in response to user commands and sensor feedback. These auditory cues increase the system's accessibility and contribute to a more immersive user experience.
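The halt-or-steer behaviour described above can be sketched as a simple decision function. This is a minimal illustration, not the paper's implementation: the distance thresholds are assumed values, and the pulse-to-distance conversion is the standard HC-SR04 formula (sound travels roughly 0.0343 cm/µs, and the echo pulse covers a round trip), since the abstract does not name a specific sensor.

```cpp
#include <string>

// Illustrative proximity thresholds in centimetres; the paper
// does not specify the actual values used by the robot.
constexpr double STOP_CM  = 15.0;
constexpr double AVOID_CM = 40.0;

// Convert an ultrasonic echo pulse width (microseconds) to a
// distance in centimetres: speed of sound ~0.0343 cm/us, halved
// because the pulse measures the round trip to the obstacle.
double pulseToCm(unsigned long pulseUs) {
    return pulseUs * 0.0343 / 2.0;
}

// Map a distance reading to a navigation action, mirroring the
// behaviour described in the abstract: halt when an obstacle is
// very close, alter course when one is nearby, else continue.
std::string obstacleAction(double distanceCm) {
    if (distanceCm <= STOP_CM)  return "halt";
    if (distanceCm <= AVOID_CM) return "steer";
    return "forward";
}
```

On the actual robot this decision would run inside the main control loop, with the resulting action driving the motor controller, the display message, and the audio alert together.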
The robot's central processing is handled by an Arduino-based microcontroller that coordinates input from the sensors and modules, processes control signals from the mobile application, and manages the display and audio outputs. The integration of Bluetooth communication, object detection, audio narration, and a visual display creates a robust, multi-modal interaction platform. This paper presents the design, implementation, and performance evaluation of the proposed system, discussing the robot's architecture, the interaction between its hardware and software components, and its potential applications in education, guided tours, customer service, and assistive technology. The results demonstrate the feasibility and efficiency of mobile-controlled robots with sensory feedback for dynamic, user-friendly, and intelligent navigation in structured indoor environments.
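The control flow from the mobile application to the motors can be sketched as a command dispatcher. The single-character command set below is an assumption typical of hobby Bluetooth controller apps paired with an HC-05/HC-06 module; the paper's actual protocol is not given in the abstract.

```cpp
#include <string>

// Hypothetical single-character commands received over the
// Bluetooth serial link; the paper's real command set may differ.
std::string dispatchCommand(char cmd) {
    switch (cmd) {
        case 'F': return "move forward";
        case 'B': return "move backward";
        case 'L': return "turn left";
        case 'R': return "turn right";
        case 'S': return "stop";
        default:  return "ignore";  // unknown bytes are discarded
    }
}
```

On the microcontroller, each incoming byte would be read from the serial port and passed through a dispatcher like this, with the returned action driving the motor pins and, per the abstract, triggering the matching display message and audio cue.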

Keywords: Mobile-Controlled Robot, Object Detection, Bluetooth Communication, Ultrasonic Sensors, Audio Feedback, Display Interface, Arduino-Based System, Human-Robot Interaction, Smart Navigation, Embedded Systems.


PDF | DOI: 10.17148/IJIREEICE.2025.134102
