DRIVER-CONDITION DETECTION USING A THERMAL IMAGING
CAMERA AND NEURAL NETWORKS
Shinji Kajiwara
Kindai University
ABSTRACT
Autonomous vehicles aim to improve driving safety and comfort. In SAE Level 3 automated driving, the system must determine
whether driving authority can be transferred from the computer back to the driver. The driver must be awake and sufficiently
alert to take over manual operation. Physiological measurement methods require sensors in contact with the driver's body,
which are intrusive and inconvenient. The purpose of this study is to determine the driver's condition from eye-closing time
and yawning frequency measured with a visible-light camera and a thermal camera. Eye closures and yawns were detected with
non-contact detectors applied to the images from both cameras. With the visible-light camera, the driver-state recognition rate
was 90 % or higher when the driver's surroundings were brightly lit, but it dropped to approximately 4 % when the surroundings
were dark. With the thermal camera, the face recognition rate was 74 % under both bright and dark conditions, and DLib and
OpenCV performed well on the thermal images. Therefore, the combination of DLib and thermal images can provide reliable
driver drowsiness detection.
Key Words:
Fatigue detection, Thermal imaging camera, Autonomous car, Image processing, Facial landmark,
Eye-closing, Yawn
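
The eye-closure detection summarized in the abstract can be illustrated with a minimal sketch. The code below is not the study's implementation; it assumes dlib's 68-point facial landmark predictor and the commonly used eye aspect ratio (EAR) heuristic, and the threshold, alarm frame count, and model path are illustrative assumptions.

# Minimal sketch of eye-closure detection from facial landmarks, assuming
# dlib's 68-point predictor and the common eye aspect ratio (EAR) heuristic.
# Threshold, frame count, and model path are illustrative assumptions.
import cv2
import dlib
from scipy.spatial import distance as dist

PREDICTOR_PATH = "shape_predictor_68_face_landmarks.dat"  # assumed model file
EAR_THRESHOLD = 0.2        # eyes treated as closed below this ratio (assumption)
CLOSED_FRAMES_ALARM = 48   # consecutive closed frames before flagging drowsiness

detector = dlib.get_frontal_face_detector()
predictor = dlib.shape_predictor(PREDICTOR_PATH)

def eye_aspect_ratio(pts):
    """EAR = sum of two vertical eye distances / (2 * horizontal distance)."""
    a = dist.euclidean(pts[1], pts[5])
    b = dist.euclidean(pts[2], pts[4])
    c = dist.euclidean(pts[0], pts[3])
    return (a + b) / (2.0 * c)

cap = cv2.VideoCapture(0)  # visible-light or thermal stream, depending on setup
closed = 0
while True:
    ok, frame = cap.read()
    if not ok:
        break
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    for face in detector(gray, 0):
        shape = predictor(gray, face)
        pts = [(shape.part(i).x, shape.part(i).y) for i in range(68)]
        left, right = pts[42:48], pts[36:42]   # eye indices in the 68-point model
        ear = (eye_aspect_ratio(left) + eye_aspect_ratio(right)) / 2.0
        closed = closed + 1 if ear < EAR_THRESHOLD else 0
        if closed >= CLOSED_FRAMES_ALARM:
            print("Driver may be drowsy: prolonged eye closure detected")
cap.release()

A yawn detector would follow the same pattern, using the mouth landmarks (points 48-67) and a mouth-opening ratio held above a threshold for a sustained interval.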