Self-driving cars come with many unknowns and potential obstacles to safe driving, but two recent studies focus on one critical issue: the role of the human in the self-driving system.
The studies were published recently in Human Factors: The Journal of the Human Factors and Ergonomics Society.
- One paper suggests that drivers will respond best to verbal prompts, as opposed to sounds or visual displays, alerting them to driving conditions and the state of the vehicle (for example, low tire pressure).
- The other study assesses the level of drivers’ trust in the autonomous car by monitoring how often they interrupt a nondriving task to look at their surroundings.
In the first study, “Speech Auditory Alerts Promote Memory for Alerted Events in a Video-Simulated Self-Driving Car Ride,” human factors researchers Michael A. Ness, Benji Helbein, and Anna Porter of Lafayette College, Easton, Pennsylvania, studied the usefulness of speech alerts to help drivers perceive and remember driving conditions while engaged in a nondriving activity.
Eighty-five undergraduate students performed a word search task while watching three driving simulation videos. Each scenario showed a routine driving condition. The participants were randomly assigned to one of three display conditions: nonspeech auditory icons (for example, a jackhammer sound indicating construction ahead); a visual display with text; and speech alerts such as "pedestrian" or "front hazard."
After watching the videos, participants reported what they recalled about the driving scenario, how useful and how annoying the alerts were, and how confident they would feel if they had to resume control of the car at the moment the video stopped.
Participants who heard the speech alerts had better recall than those who were given the auditory icons or visual displays. However, both types of audio alerts were rated as annoying, and prior research suggests that drivers tend to turn off alerts they find annoying.
The other study, "Keep Your Scanners Peeled: Gaze Behavior as a Measure of Automation Trust During Highly Automated Driving," is the work of Sebastian Hergeth, Lutz Lorenz, and Roman Vilimek of the BMW Group in Munich, and Josef F. Krems from Technische Universität Chemnitz, Germany.
In this study, 35 BMW Group employees ages 18 to 55 participated in a self-driving car simulation while engaging in a visually demanding nondriving task.
The driving scenario was a standard three-lane highway with a hard shoulder in which uneventful driving was periodically interrupted by incidents requiring the driver to take control. Although trust is difficult to quantify, eye-tracking glasses worn by the participants enabled the researchers to capture data on how frequently they looked away from the secondary task to observe the driving scene. Hergeth et al. then used these data to draw preliminary conclusions about drivers' levels of trust in the simulated car's automation.
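To make the gaze-based measure concrete, here is a minimal illustrative sketch, not the researchers' actual analysis pipeline: given a hypothetical sequence of gaze samples labeled "road" or "task" (the labels and the 10 Hz sampling rate are assumptions), it computes two simple monitoring metrics, the rate of glances toward the driving scene and the fraction of time spent monitoring.

```python
def monitoring_metrics(samples, hz=10):
    """Return (glances_per_minute, road_time_fraction) for labeled gaze samples."""
    if not samples:
        return 0.0, 0.0
    # A "glance" starts whenever gaze switches from the task to the road.
    glances = sum(
        1 for prev, cur in zip(samples, samples[1:])
        if prev == "task" and cur == "road"
    )
    if samples[0] == "road":
        glances += 1  # count a glance already in progress at the start
    duration_min = len(samples) / hz / 60
    road_fraction = samples.count("road") / len(samples)
    return glances / duration_min, road_fraction

# Example: 6 seconds of gaze data at 10 Hz containing two glances to the road.
gaze = ["task"] * 20 + ["road"] * 10 + ["task"] * 20 + ["road"] * 10
rate, frac = monitoring_metrics(gaze)  # 20 glances/min, road fraction 1/3
```

Under the study's interpretation, a lower glance rate and road-time fraction during uneventful driving would indicate higher trust in the automation.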
The more the participants trusted the automation, the less frequently they looked at their surroundings. They also grew more trusting of the car as they became familiar with the system. Overall, more than half the drivers said they trusted the car more at the end of the trials than at the beginning. The researchers postulate that appropriate trust in automation is crucial for drivers to get the maximum benefit from self-driving vehicles.
Both research teams plan further investigations into how these areas of study affect safety and how quickly and effectively drivers would take over the controls when necessary.
The Human Factors and Ergonomics Society is a scientific association for human factors/ergonomics professionals, with more than 4,800 members globally. HFES members include psychologists and other scientists, designers, and engineers, all of whom have a common interest in designing systems and equipment to be safe and effective for the people who operate and maintain them.
Source: Human Factors and Ergonomics Society