Text Sabrina Künz Photos Georg Roske
There’s a light rain coming down as I approach the site of today’s driving experience, and I’m nervous: I am about to be filmed and analyzed by professional traffic experts while driving.
Volkswagen AG and 30 other companies are involved in the German research initiative UR:BAN. It began in March 2012 and will run over the course of four years. Those involved with the project work on developing and implementing intelligent and cooperative driver-assistance systems for the urban traffic of the future. The systems help people drive better in complex urban traffic scenarios.
Dr. Julia Drüke, who heads a project called Human-Machine Interaction for Urban Environments and works in corporate research at Volkswagen AG, is introducing me to the preliminary findings today.
What does human-machine interaction (HMI) mean? In short, it is the way my car communicates with me, the interface between the technology and its user.
“Urban driving is very dynamic due to complex situations, distractions, mixed road use, short times to make decisions and a large number of road users,” says Drüke. “Urban driver assistance systems have to process a lot of information, filter it and pass it on to drivers in a form that helps relieve pressure without distracting them.”
This is where the UR:BAN initiative fits in. Its aim is to bundle driver assistance systems into a package that helps support the driver. The key question is how much and what information drivers need to adapt their driving styles in specific ways.
“A useful warning doesn’t mean that all signals are switched on simultaneously — they have to be effective,” Drüke says.
The researchers test a combination of acoustic, visual and haptic signals to convey information or warnings. They then develop a modular HMI tool-kit system based on the results. It is categorized according to driving situation, desired action (or reaction) and urgency.
The modular HMI system differentiates between driving situations and assigns appropriate reactions, ranging from recommended action through control and warning to intervention. The modular tool-kit system encompasses clear design guidelines regarding the assistance systems’ appearance and placement. It describes visual output media such as the instrument cluster and head-up display; acoustic signals, such as sounds or speech; visual cues, such as LED bars or indicators; and haptic signals — for instance a steering movement, a jerk on the brake, or an emergency steering maneuver.
The objective of the HMI modular tool-kit system is to function in a generic, modular and expandable manner. “Generic” means that in the future, assistance systems will be consolidated and synchronized with each other. “Modular” means that not every car must have the identical mix of assistants. And the components have to be expandable because technology continues to advance.
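The categorization Drüke describes — driving situation, desired reaction, urgency — can be pictured as a lookup from urgency level to output channels, where higher urgency adds channels rather than switching everything on at once. The sketch below is purely illustrative; the class and signal names are hypothetical and do not come from the UR:BAN project.

```python
from dataclasses import dataclass
from enum import Enum
from typing import Optional

# Hypothetical urgency scale, loosely mirroring the article's range
# from information through recommendation and warning to intervention.
class Urgency(Enum):
    INFORM = 1
    RECOMMEND = 2
    WARN = 3
    INTERVENE = 4

@dataclass
class HmiSignal:
    visual: Optional[str] = None    # e.g. instrument cluster, LED bar
    acoustic: Optional[str] = None  # e.g. tone or speech
    haptic: Optional[str] = None    # e.g. brake jerk, steering movement

def select_signals(urgency: Urgency) -> HmiSignal:
    """Illustrative channel selection: escalate output with urgency."""
    if urgency is Urgency.INFORM:
        return HmiSignal(visual="instrument-cluster icon")
    if urgency is Urgency.RECOMMEND:
        return HmiSignal(visual="head-up display hint")
    if urgency is Urgency.WARN:
        return HmiSignal(visual="red LED bar", acoustic="warning tone")
    return HmiSignal(visual="red LED bar",
                     acoustic="warning tone",
                     haptic="jerk on the brake")
```

In this toy version, an unthreatening situation produces only a visual cue, while a critical one layers visual, acoustic and haptic output — the “effective, not simultaneous” principle Drüke describes.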
I get into the dynamic driving simulator to experience the results of the research. The car is mounted on a hexapod, lifted about 2½ yards into the air and shaken vigorously to simulate a car’s movements. From the outside, the whole setup looks like a roller coaster ride, but inside it feels astonishingly realistic. The scenery is projected as a 270-degree panorama, and corresponding views are sent to screens in the side and rearview mirrors, adding to the realism.
The first thing I test is the lane-change assistant feature called Side Assist. A delivery truck crawls along in front of me and I try to pass carefully, but the car behind me swerves out of the lane and floors the accelerator. Available Lane Assist, Side Assist and Blind Spot Monitor work together in the simulator to send me a signal — an orange LED on the left side-view mirror — that it’s not clear to pass, and they provide light counter-steering. When the road is clear, the system helps me steer past the vehicle. Not bad at all.
My pulse is now normal. I’m cruising around the virtual, pedestrian-filled city. Then I catch something dashing into my field of vision. Before I know it, a warning signal sounds and the red LED bar lights up. I step on the brakes hard and come to a stop right in front of a man.
These situations show how modular HMI tool-kit components work. As long as there is enough time and the situation remains unthreatening, the vehicle merely provides information. If I get into critical situations in which my reaction time would be too slow, the system intervenes to help prevent a collision. I am given clear, specific, practical support.
What comes next for UR:BAN? Implementing assistant systems on the roads. Says Drüke, “We are very satisfied with the progress so far.”
The project group is drawing up a final version of the design guidelines to help improve current and future human-machine-interaction development.