Driver Drowsiness Detection System


  • Bandi Manideep UG Scholar, Department of Computer Science and Engineering – Data Science, Geethanjali College of Engineering and Technology, Hyderabad, India
  • Hema Manogna Geethanjali College of Engineering and Technology, Hyderabad, India
  • Narlagiri Aditya Geethanjali College of Engineering and Technology, Hyderabad, India



Driver drowsiness, Road safety, Fatigue detection, Deep learning algorithms, Multimodal data, Real-time monitoring, Facial expressions, Eye movements, Preventive actions, Accident prevention


Driver drowsiness poses a significant risk to road safety, contributing to a substantial number of accidents worldwide. Fatigue, often difficult to detect, impairs a driver's cognitive abilities, reaction times, and decision-making skills, increasing the likelihood of accidents. Traditional methods of detecting drowsiness, such as subjective assessments or single-point measurements, lack accuracy and reliability, necessitating the development of innovative solutions to address this critical issue. The driver drowsiness detection system presents a pioneering approach, leveraging advanced technologies, including deep learning algorithms and multimodal data integration, to accurately detect signs of drowsiness and mitigate potential risks on the road. The system's architecture enables real-time monitoring of parameters such as facial expressions, eye movements, and physiological cues, providing a comprehensive picture of the driver's state. By analyzing these cues, the system can generate timely alerts that prompt corrective action, thereby preventing potential accidents and improving road safety. With its modular design and scalability, the driver drowsiness detection system holds promise for strengthening road safety measures and protecting the lives of drivers and passengers worldwide.
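Eye-movement monitoring of the kind the abstract describes is commonly implemented with the eye aspect ratio (EAR): the ratio of the eye's vertical landmark distances to its horizontal distance collapses when the eye closes, and a sustained low value over consecutive frames signals drowsiness. The sketch below illustrates that idea in pure Python; the specific threshold, frame count, and landmark layout are illustrative assumptions, not values taken from this paper.

```python
import math

def eye_aspect_ratio(pts):
    """EAR from six eye landmarks (p1..p6) ordered around the eye:
    p1/p4 are the horizontal corners, (p2, p6) and (p3, p5) are
    vertical pairs. EAR = (|p2-p6| + |p3-p5|) / (2 * |p1-p4|)."""
    p1, p2, p3, p4, p5, p6 = pts
    vertical = math.dist(p2, p6) + math.dist(p3, p5)
    horizontal = math.dist(p1, p4)
    return vertical / (2.0 * horizontal)

def is_drowsy(ear_per_frame, threshold=0.21, min_frames=15):
    """Flag drowsiness when EAR stays below the threshold for
    min_frames consecutive frames (both values are assumed defaults)."""
    run = 0
    for value in ear_per_frame:
        run = run + 1 if value < threshold else 0
        if run >= min_frames:
            return True
    return False

# Illustrative landmarks: an open eye vs. a nearly closed one.
open_eye = [(0, 0), (1, 1), (3, 1), (4, 0), (3, -1), (1, -1)]
closed_eye = [(0, 0), (1, 0.2), (3, 0.2), (4, 0), (3, -0.2), (1, -0.2)]

print(eye_aspect_ratio(open_eye))    # large ratio: eye open
print(eye_aspect_ratio(closed_eye))  # small ratio: eye closed
```

In a full pipeline, the landmarks would come from a face-landmark detector running on each video frame, and a sustained drowsy flag would trigger the alert the abstract describes.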









How to Cite

B. Manideep, H. Manogna, and N. Aditya, “Driver Drowsiness Detection System”, IJRESM, vol. 7, no. 4, pp. 105–108, Apr. 2024, doi: 10.5281/zenodo.11004670.