Who (What)’s Driving and When? Two New Studies Look at the Human Factors of Self-Driving Cars Through the Eyes and Ears of Drivers
Wednesday, February 24, 2016
For all the media attention they’ve been getting lately, self-driving cars come with many unknowns and potential obstacles to safe driving. A critical issue is the relative lack of research on the role of the human in the system. This human factors component may pose more daunting challenges than the technological, legal, and security concerns surrounding self-driving cars.
Advancing the state of knowledge about the human factors aspects of autonomous passenger vehicles are two studies published recently in Human Factors: The Journal of the Human Factors and Ergonomics Society. One paper assesses drivers’ level of trust in an autonomous car by monitoring how often they interrupt a nondriving task to look at their surroundings, presenting the first empirical evidence of this connection.
The other study suggests that drivers will respond best to verbal prompts, as opposed to sounds or visual displays, alerting them to driving conditions and the state of the vehicle (for example, low tire pressure).
“Keep Your Scanners Peeled: Gaze Behavior as a Measure of Automation Trust During Highly Automated Driving” is the work of Sebastian Hergeth, Lutz Lorenz, and Roman Vilimek of the BMW Group in Munich, and Josef F. Krems from Technische Universität Chemnitz, Germany.
In this study, 35 BMW Group employees ages 18 to 55 participated in a self-driving car simulation while engaging in a visually demanding nondriving task. The driving scenario was a standard three-lane highway with a hard shoulder, in which uneventful driving was periodically interrupted by incidents requiring the driver to take control. Although trust is difficult to quantify, eye-tracking glasses worn by the drivers enabled the researchers to capture data about how frequently participants looked away from the secondary task to observe the driving scene. Hergeth et al. then used these data to draw preliminary conclusions about drivers’ levels of trust in the simulated car’s automation.
The more the participants trusted the automation, the less frequently they looked at their surroundings. They were also more trusting of the car once they learned the system. Overall, more than half the drivers said they trusted the car more at the end than at the beginning of the trials. The researchers postulate that appropriate trust in automation is crucial for drivers to get the maximum benefit from self-driving vehicles.
In “Speech Auditory Alerts Promote Memory for Alerted Events in a Video-Simulated Self-Driving Car Ride,” human factors researchers Michael A. Nees, Benji Helbein, and Anna Porter of Lafayette College, Easton, Pennsylvania, studied the usefulness of speech alerts to help drivers perceive and remember driving conditions while engaged in a nondriving activity.
Eighty-five undergraduate students performed a word search task while watching three driving simulation videos. Each scenario showed a routine driving condition. The participants were randomly assigned to one of three display conditions: sounds such as a jackhammer, indicating construction ahead; a visual display with text; and speech alerts such as “pedestrian” or “front hazard.”
After watching the videos, participants reported what they recalled about the driving scenario, how useful and how annoying the alerts were, and how confident they would feel if they had to resume control of the car at the moment the video stopped. Participants who heard the speech alerts had better recall than those who were given the sound icons or visual displays. However, both types of audio alert were rated as annoying, and prior research shows that drivers tend to turn off annoying alerts.
Both research teams plan further investigations to assess how these findings can affect safety and how quickly and effectively drivers would take over the controls when necessary.
To receive a copy of these articles for media-reporting purposes, contact HFES Communications Director Lois Smith (310/394-1811; email@example.com).
The Human Factors and Ergonomics Society is the world’s largest scientific association for human factors/ergonomics professionals, with more than 4,500 members globally. HFES members include psychologists and other scientists, designers, and engineers, all of whom have a common interest in designing systems and equipment to be safe and effective for the people who operate and maintain them. “Human Factors and Ergonomics: People-Friendly Design Through Science and Engineering.”