
The empathetic car

We have long since become used to cars that think for themselves in traffic, stay in a lane, maintain speed or brake when necessary. However, one thing is often neglected in the research and development of autonomous vehicles: the feelings and unconscious behaviour of humans. An interdisciplinary team of researchers from the fields of neurotechnology, networked mobility and robotics at Saarland University of Applied Sciences (htw saar) is working on the question of how autonomous vehicles can better respond to the emotions of their occupants. Their driving simulator was built with BMBF funds as part of the FH-Invest project MIND2CAR, co-financed by Saarland and industry partners.

Professor Daniel Strauss is very well versed in human–machine interfaces: for years, he has been working with different variants of such interfaces in the Systems Neuroscience & Neurotechnology Unit (SNNU) of htw saar, for example to study the acoustic distractibility of car drivers or to test new robotics concepts.

Neuroergonomic investigation of a takeover scenario (automation level 3) inside the MIND2CAR research platform. © SNN-Unit, htw saar

But the MIND2CAR simulator is something very special even for the neuroscientist and his team: a one-of-a-kind device for investigating the interaction between human and vehicle, especially in automated driving. Thanks to new neurotechnological methods, MIND2CAR can “read” the thoughts or feelings of its occupants and, for instance, optimally adapt the exchange of information to their attention and to the traffic situation. But the project team is focusing on more than just driving safety: MIND2CAR will also be used to investigate how people with neurological limitations due to age and disease can be supported by assistance and automation functions. Because the simulator combines digitally networked mobility with psychophysiological data, data security also plays an important role in working with it.

MIND2CAR was set up over several years as part of an FH-Invest project of the same name, with the participation of an international consortium. “This is not an off-the-peg device,” emphasises project head Strauss: “This platform is extremely important for us because it enables us to conduct new and innovative research. We can use it to replicate all five stages of autonomous driving and explore completely new concepts.” At level 5, MIND2CAR could even become an “empathic machine” that understands the emotional state of the occupants and adapts interaction strategies accordingly.

A safe space even for far-out ideas

Those who are already somewhat uncomfortable with cars that have an automatic parking function may be quite spooked by the idea of a car that can empathise with the driver. But Daniel Strauss has good reasons for pushing the simulator to the limits of what is imaginable: “Self-driving cars are a hot topic; people are mainly interested in the higher levels of automation – even if they are not yet fully developed. In our simulator, we can test everything without any danger,” explains the scientist. “In doing so, we can also try out far-out concepts that may never be implemented. Only then can we find out whether these concepts make sense at all.”

The five stages of autonomous driving

Level 1: Assisted driving – the driver controls the vehicle and always has an eye on traffic but is supported by assistance systems such as cruise control or distance control.

Level 2: Semi-automated driving – the driver controls the vehicle and always has an eye on traffic, but the car can already perform some tasks by itself. For example, on the motorway, the car can simultaneously keep in lane and accelerate or brake.

Level 3: Highly automated driving – the driver may temporarily turn away from traffic but must take the wheel again when the need arises.

Level 4: Fully automated driving – the driver can transfer control completely to the vehicle and, for example, sleep or read, but must always be fit to drive and can also take the wheel themselves.

Level 5: Autonomous driving – there is no longer a driver; the occupants of the car are only passengers, and the car can also drive completely without occupants.
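
For readers who prefer code, this classification can be captured in a few lines. The Python sketch below is purely illustrative – the enum and helper functions are not part of the MIND2CAR software; they simply encode the levels as described above and make explicit where the driver still has to monitor traffic and where a takeover request arises.

```python
from enum import IntEnum

class AutomationLevel(IntEnum):
    """The five levels of driving automation, as summarised above."""
    ASSISTED = 1          # driver steers, supported by e.g. cruise control
    SEMI_AUTOMATED = 2    # car keeps lane and speed, driver monitors traffic
    HIGHLY_AUTOMATED = 3  # driver may turn away, must take over on request
    FULLY_AUTOMATED = 4   # car drives itself, driver must remain fit to drive
    AUTONOMOUS = 5        # no driver at all, occupants are only passengers

def driver_must_monitor(level: AutomationLevel) -> bool:
    """At levels 1 and 2 the driver always has to keep an eye on traffic."""
    return level <= AutomationLevel.SEMI_AUTOMATED

def takeover_requests_occur(level: AutomationLevel) -> bool:
    """Only at level 3 does the car hand control back in critical situations."""
    return level == AutomationLevel.HIGHLY_AUTOMATED

print(driver_must_monitor(AutomationLevel.SEMI_AUTOMATED))   # True
print(takeover_requests_occur(AutomationLevel.AUTONOMOUS))   # False
```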

The simulator is thus a safe space for all kinds of research ideas related to self-driving cars, including very pragmatic approaches to problems such as motion sickness, which many people encounter when they try to read in a moving car: the sense of balance registers the movement of the car, but the eyes report a standstill because they are fixed on a stationary object. The brain receives conflicting signals, and the body begins to feel sick – a very unfavourable effect if self-driving cars are to serve as mobile workplaces at some point in the future.

Strauss and his team are therefore investigating how the vehicle can detect the first signs of nausea in its occupants and adapt its driving behaviour accordingly. To this end, they send their test subjects into the simulator wearing an EEG cap to measure their brain waves. However, since such caps are obviously not suitable for everyday use, the scientists are simultaneously working on a system of cameras and sensors that will be compared with the EEG data until it delivers similarly reliable results.
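
How such a comparison could look in practice can be sketched in a few lines of Python. The snippet below only illustrates the validation idea described above; the discomfort scores, the one-second sampling and the correlation threshold are assumptions made for the sake of the example, not the SNNU’s actual pipeline.

```python
# Illustrative only: checking a camera-based motion-sickness estimate against
# an EEG-derived reference recorded in the same simulator run. All names and
# thresholds are assumptions, not part of the MIND2CAR software.
import numpy as np

def camera_estimate_is_reliable(eeg_score: np.ndarray,
                                camera_score: np.ndarray,
                                min_correlation: float = 0.8) -> bool:
    """Both inputs are per-second discomfort scores from the same drive."""
    if eeg_score.shape != camera_score.shape:
        raise ValueError("scores must be sampled on the same time grid")
    r = np.corrcoef(eeg_score, camera_score)[0, 1]             # Pearson correlation
    rmse = np.sqrt(np.mean((eeg_score - camera_score) ** 2))   # mean deviation
    print(f"correlation r = {r:.2f}, RMSE = {rmse:.2f}")
    return r >= min_correlation

# Example with synthetic data standing in for a ten-minute recording at 1 Hz:
t = np.linspace(0, 600, 600)
eeg = 0.5 + 0.4 * np.sin(t / 120)                  # EEG-derived reference
cam = eeg + np.random.normal(0.0, 0.05, t.shape)   # noisy camera estimate
print(camera_estimate_is_reliable(eeg, cam))
```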

Exterior view of the MIND2CAR research platform © SNN-Unit, htw saar

The car communicates with its environment

Another pragmatic research topic for MIND2CAR is the so-called takeover used at automation level 3: at this stage, the vehicle largely drives on its own but asks its driver to take the wheel in critical situations. Since this concept relieves the driver of their responsibility only to a very limited extent, Daniel Strauss finds it rather absurd anyway – at least as long as the car cannot register what the human is paying attention to.

“Imagine yourself sitting in a level-three car, reading your emails,” he says. “You leisurely scroll down the list, but then suddenly there is an email that is emotionally demanding – perhaps because it contains annoying, sad or very pleasant news. At a moment like this, the car will have a much harder time getting your attention back.” The car therefore needs to know how distracted the person is at that moment and has to match that information to the current traffic situation so that it can initiate the takeover in the best possible way. To achieve this, MIND2CAR not only has neurosensor technology but also an advanced simulation of networked traffic, which takes into account the vehicle’s communication with other vehicles and with the infrastructure.
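
The underlying logic – combining a measure of how distracted the driver is with how critical the traffic situation is, and deriving from that how early and how insistently to issue the takeover request – can be illustrated with a small sketch. The function below is a hypothetical example, not the strategy implemented in MIND2CAR; the score ranges, time margins and warning modalities are assumptions chosen purely for illustration.

```python
# Hypothetical sketch of a takeover-request planner; not MIND2CAR's actual logic.
from dataclasses import dataclass

@dataclass
class TakeoverRequest:
    lead_time_s: float   # how long before the critical point the driver is warned
    modality: str        # how the driver is addressed

def plan_takeover(distraction: float, criticality: float) -> TakeoverRequest:
    """Both scores are assumed to be normalised to the range [0, 1]."""
    if not (0.0 <= distraction <= 1.0 and 0.0 <= criticality <= 1.0):
        raise ValueError("scores must lie in [0, 1]")
    # Baseline warning time plus extra margin for distraction and urgency
    # (the constants are placeholders, not validated values).
    lead_time = 6.0 + 8.0 * distraction + 4.0 * criticality
    if distraction > 0.7:
        modality = "audio + visual + seat vibration"
    elif distraction > 0.3:
        modality = "audio + visual"
    else:
        modality = "visual"
    return TakeoverRequest(lead_time_s=round(lead_time, 1), modality=modality)

# Example: a driver absorbed in an emotionally demanding email, dense traffic
print(plan_takeover(distraction=0.9, criticality=0.6))
```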

And speaking of announcements: how does a stressed or irritated person react when their car greets them effusively in the morning? At what point are assistance systems perceived as so annoying that one would rather switch them off? These are questions that the automotive industry is already asking itself today – and to which MIND2CAR can provide answers. But the possibilities of the simulator are far from exhausted. For Daniel Strauss, things always get exciting when health comes into play: how can a vehicle help people with physical or mental disabilities to cope with traffic? And does it make sense if your own car sends you to the doctor because it thinks you look sick? Of course, health monitoring of this type is still a dream of the future – but it is also one of those ideas that can already be thought through and tested today in the safe space of MIND2CAR.

Robots and virtual control panels

In addition to the neurosensor technology, the interaction possibilities between human and machine in MIND2CAR are also quite futuristic: a collaborative robot, several large-format touchscreens and so-called virtual haptics. With this technology, virtual control panels and buttons can be projected into a space using ultrasound. If you reach for them with your hand, you can actually feel and operate the button, so you don’t even have to look to find it. Similar to the classic mechanical knobs in your own car, you can find your way around intuitively – a great advantage over modern touch displays, which offer many functions but do not give the sense of touch any clues for orientation.

The project to build the simulator was completed in December 2019, and the project team achieved one of its most important goals: the new simulator was meant to combine ideas and results from previous research, yet ultimately be capable of far more than all existing concepts. The virtual buttons, the innovative neurosensor technology and the fact that MIND2CAR can now also reproduce complete driving test fields in Saarland, where htw saar tests autonomous vehicle concepts – all of this can be taken as proof of the uniqueness of the device.

And yet the real work is just beginning: many ideas were newly developed during the project and are now gradually being put into practice in research work. There is still a long way to go on the path to the empathetic car – but Daniel Strauss and his team are ready to make the journey. Even if some ideas may turn out to be too crazy in the end.