Humanizing Automated Vehicles – Allowing for the “Jerk” Response
CAR Management Briefing Seminar Report
From Martha Hindes
The Auto Channel
Traverse City, MI, August 1, 2018 -- If anything is becoming clear at the 15th annual CAR Management Briefing Seminar in Traverse City, Michigan, it's that nothing is clear at all. And it's likely not to settle down for at least a few decades. That, of course, relates to our driving future and whether the next few generations of Americans might still relish getting a driver's license at age 16, or whether the whole concept will feel like a bit of past history that would seem amusing if it weren't so antiquated.
What's emerged at the annual vehicle industry self-examination held each summer in Traverse City is a sense that someone expected to set off a 4th of July sparkler and discovered instead that it was a multi-layered firework that keeps spewing additional patterns of complex explosions with no sign of stopping. Every time a new problem or concern about Autonomous Vehicles (AVs) -- the so-called "self-driving" vehicles governed by Artificial Intelligence (AI) that the auto industry is rushing to develop -- appears to point to a potential solution in progress, something else comes along to disrupt any sense of security.
Take issues facing the vehicle industry now. Whether it's a supplier earning less than a billion dollars a year finding that survival means being acquired or the uncertainty of whether to make a vehicle dashboard essentially one wide computer screen, every possible aspect of designing, engineering, manufacturing and ultimately finding the perfect user for one's product is at best becoming a global crap shoot. And adding the element of autonomy magnifies that exponentially.
Take for example an intersection with people walking, cars turning and someone in another car running a stop light at exactly the wrong time. In real-life human-to-human reactions, one driver will wave the walkers through to finish crossing the street, while another will stop and let the traffic light violator continue so an accident is avoided. That's allowing for the "jerk" response, according to Dr. Maarten Sierhuis, keynote speaker at Wednesday's session.
A 12-year veteran of NASA, where he created a computer language to communicate with the International Space Station and developed an autonomous system for interacting with spacewalking astronauts, Dr. Sierhuis now leads autonomous vehicle researchers at Nissan in California. He knows well the foibles of how human beings function and their unpredictability.
While traffic laws should be able to resolve such issues, that simply doesn't happen, he says. Much of that behavior varies with location and local culture, sometimes within an area of just a few blocks. All provide variants of what Artificial Intelligence will need to learn to take the technology beyond the development stage and into a practical reality where no humans become collateral damage as it evolves.
“In Amsterdam bicyclists rule. In San Francisco pedestrians rule,” he says.
Learning to react as a human would is only one of the challenges for AI. Every conceivable aspect is being explored toward the day a human can get into the driver's seat -- or what used to be a driver's seat -- and feel comfortable knowing the vehicle will do all the work, safely and efficiently, without any sense of impending disaster. Adding yet another traffic fatality is exactly what self-driving vehicles are being designed to avoid. Instead, the goal is seamless autonomous mobility, with much of the AI function that coordinates human-robot teamwork now being developed in the cloud. That's one of the necessary steps for people to start accepting the concept of driverless cars.
Part of development is having systems learn the fine points of human reactions, such as how long a person will wait for the car in front to move before honking the horn. Training an AV when to honk is only one of the aspects of humanizing the technology.
Some unanswered questions:
Who is responsible in the event of an accident, the non-human driver or the vehicle manufacturer?
Whose insurance would cover an accident, and who would be responsible for an injury to a person who (unlike the pedestrian in Arizona killed during an Uber test) actually was in total control of the vehicle? At Nissan the test policy progresses from "eyes on, hands free (of steering wheel)" to "eyes off, hands free" as the systems learn.
Do car buyers actually care about the technology controlling what they drive, or do they want what works for them with all the bells and whistles while the vehicle does the work?
When will infrastructure catch up with the changes necessary to keep self-driving safe and predictable, so a car following a lane marker doesn't mistakenly read a road repair strip instead? Don't expect a quick answer, according to Dr. Sierhuis. "The last thing to change is the infrastructure."
Among potential things to ask is a favorite of mine I haven’t heard anyone address yet: Who gets the ticket if an AV in true human fashion runs a red light?
Copyright 2018, Martha Hindes, Automotive Bureau, All Rights Reserved