When it comes to the type of people needed to design, build and test autonomous vehicle technology, you’d be forgiven for thinking you just need hardware and software engineers. After all, hardware engineers are the ones who figure out how the car ‘sees’ the outside world through a plethora of sensors, as well as design the hardware integration necessary to allow the car’s on-board computer system to interface with the accelerator pedal, brakes, steering and, of course, lights. Software engineers, meanwhile, design the algorithms needed to interpret the data from those sensors and teach the car to make decisions based on a set of pre-established rules so that it can safely drive down the road.
But as Nissan explains in a recent exposition of the people behind its soon-to-launch ProPILOT semi-autonomous driver assistance feature, technical sophistication and driving competency are only part of the puzzle when it comes to bringing a self-driving car to market. In short, a self-driving car has to be intuitive to use too.
The key to that intuitiveness is an understanding of how we as humans interact with the world around us, as well as an understanding of all the subtle signals we send to others when behind the wheel. Without that knowledge, an autonomous driving system might be able to safely drive down the road, avoiding accidents along the way, but it will still behave very much like a machine.
In order for other road users to feel at ease with such a vehicle, an autonomous vehicle needs to behave in ways that feel comfortable for the humans around it.
Which is one of the reasons Dr. Maarten Sierhuis, an expert in human-robot interaction and a former NASA engineer, is head of the Nissan-Renault Research Center in Silicon Valley, California. Indeed, while there are plenty of traditional automotive engineers from both Nissan and Renault now employed at the facility, a large number of the staff are experts in traditionally non-automotive spheres like psychology, mathematics and anthropology.
Sierhuis is one such person, having received his Ph.D. in artificial intelligence from the University of Amsterdam before joining NYNEX Corp. in the early 1990s. From there, he moved to NASA, working as a researcher designing human-robot interactions and collaborative intelligent systems intended to make space exploration easier. Now at Nissan, he’s helping remind everyone that autonomous vehicles need to operate almost as a seamless extension of the human behind the wheel.
Explaining that making sure autonomous vehicle systems are “good for people” is a top priority, Sierhuis said that his current hour-long commute is at least part of the reason behind his drive to see Nissan’s autonomous vehicles in the marketplace.
“The only thing I really enjoy about commuting is that that’s the time that I have to myself,” he admitted. “[If the car can drive part of the way,] that would be to me the best thing that can happen.”
Alongside Sierhuis at the Nissan-Renault Research Center, other experts offer similarly interesting skill sets to Nissan’s autonomous vehicle program. There’s a team of psychologists and anthropologists whose job it is to try to interpret the hidden signals we all give each other on the road that a computer system won’t automatically pick up, such as making eye contact with other road users. There’s a team whose job it is to make sure that Nissan’s autonomous driving system makes occupants aware of a change of speed or direction before it occurs, to minimize disorientation and disconnection from the outside world. And there’s a team focused on making the autonomous system learn how specific users prefer to take corners, accelerate, or brake, with a view to making the computer-aided driving experience more compelling and less disconcerting.
The first Nissan vehicle to benefit from this philosophy is the Japanese-market next-generation Nissan Serena minivan, which debuts next month with Nissan’s first-generation ProPILOT technology as standard.
Similar to Tesla’s Autopilot functionality, ProPILOT is designed to keep the car in the center of its lane at all times, following the curve of the road and responding dynamically to road signs and changes in traffic flow ahead. Activated using a button on the steering wheel, ProPILOT requires the driver to keep their hands lightly on the wheel at all times, ensuring that the driver remains ready to take over should the need arise.
Later this year, ProPILOT is expected to be introduced to other vehicles in the Nissan family in its home market of Japan, including the Nissan LEAF electric hatchback. It’s not clear yet when this technology will debut outside of Japan, but Nissan says its ProPILOT hardware and software will be rolled out across many of its vehicles in the next few years, culminating in the launch of a fully autonomous vehicle by 2018.
Do you think Nissan is right to focus on more than just the technicalities of self-driving? What interactivity would you like in a self-driving car — and what do you think is hard for computers to understand about everyday driving?
Leave your thoughts in the Comments below.
You can also support us directly as a monthly supporting member by visiting Patreon.com.