When the history of autonomous vehicles is written, Chris Urmson will undoubtedly figure among the people most instrumental in the technology's development and deployment. He has been involved with it for more than a decade, which in the time frame of autonomous driving is far longer than the chronology might suggest.
Back in 2007, Urmson, who had earned a Ph.D. in Robotics from Carnegie Mellon in 2005, was director of technology for Tartan Racing, the Carnegie Mellon team that won the DARPA Urban Challenge. As that undertaking is described in DARPA documentation: “The Urban Challenge features autonomous ground vehicles maneuvering in a mock city environment, executing simulated military supply missions while merging into moving traffic, navigating traffic circles, negotiating busy intersections and avoiding obstacles.”
The team drove a heavily instrumented, sensor-laden Chevy Tahoe to victory in Victorville, California.
“Military supply missions” aside, his know-how and experience led him to Google, where he was the chief technology officer of what was then the Google Self-Driving Car project. He was with Google from February 2009 until August 2016, a period when significant groundwork was done on developing the technology. (In May 2014 the company showed a prototype of what has become known as the “Google Car,” which, for better or worse, will probably go into that aforementioned history as something akin to the Model T of autonomy.)
A few months after leaving Google, the same month that the Google Self-Driving Car project became Waymo, Urmson became one of the co-founders (and CEO) of Aurora Innovation. Joining him in founding the company were Sterling Anderson, who had been the director of Autopilot Programs at Tesla, and Drew Bagnell, who was autonomy architect and perception lead at the Uber Advanced Technology Center.
Google, Tesla and Uber. This is the stuff of proverbial dream teams.
Aurora has announced that it is working with companies including Volkswagen, Hyundai and NVIDIA to develop autonomous driving capabilities, creating what Urmson refers to as “the driver.” This encompasses the software and the reference architecture of the hardware. While it will be something of a common platform, Urmson says that automakers will tailor it to meet the needs of their particular vehicles and customers.
Urmson expects autonomous vehicles to be deployed first in mobility-services applications, so initial deployment will differ somewhat from consumer OEM applications.
That is, there are some people who say that it is necessary to have a fundamental re-architecting of the vehicle’s electrical and electronic architecture. “I don’t think the vehicle architecture needs to start from scratch,” Urmson says of current models. “A lot of the interfaces are there already and we can overlay the autonomous capability.”
He acknowledges, “Over time there may be cost efficiencies from tighter integration,” but he points out that the economics of mobility-service applications differ from those of the consumer-purchase model.
What is likely to happen is that as the technology moves toward private ownership, which is more price sensitive, the architecture will become more autonomy-specific.
When it comes to sensors, with the primary ones being cameras, LiDAR and radar, Urmson believes that early on, “You’ll probably end up using all three because the failure modes are going to be somewhat different for each of them. So you can imagine that if one sensor doesn’t see something, the other two are likely to. That means the likelihood you missed something in the world is low, so the probability is that you are safe on the road.” He notes, however, “Over time, perhaps, our ability to interpret data from a subset of those sensors will be good enough so we can fall back to a smaller number of sensors, but given how much social benefit we can get [from autonomous driving] and the economics involved, getting this to market safely and quickly matters, so using a combination of sensors would be the right answer.”
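Urmson's point about overlapping failure modes can be illustrated with a back-of-the-envelope calculation: if each sensor misses an object independently of the others, the chance that all of them miss it is the product of the individual miss probabilities. The sketch below uses hypothetical miss rates chosen for illustration only; real per-sensor figures vary by conditions and are not given in the article.

```python
def combined_miss_probability(miss_probs):
    """Probability that ALL sensors miss an object, assuming each
    sensor's failure is independent of the others."""
    result = 1.0
    for p in miss_probs:
        result *= p
    return result

# Hypothetical per-sensor miss rates for camera, LiDAR and radar.
camera, lidar, radar = 0.01, 0.02, 0.05

# With all three sensors, a miss requires all three to fail at once.
all_three = combined_miss_probability([camera, lidar, radar])  # 1e-05
camera_only = combined_miss_probability([camera])              # 0.01

print(f"all three sensors: {all_three:.0e}")
print(f"camera alone:      {camera_only:.0e}")
```

The independence assumption is the optimistic case; correlated failures (e.g., heavy rain degrading camera and LiDAR together) would weaken the benefit, which is part of why diverse sensing modalities matter.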
Another issue related to technology and economics is that of using redundant systems.
Urmson thinks that there may be another way to go. He explains: “What you need to achieve is a level of reliability that meets your safety case. And you might generate that through diversity. For example, having different sets of sensors that do similar things. You might achieve that by limiting the operating environment.”
Urmson continues, “When it comes to computation, you have to look at the failure modes—what are the effects of those and what is the most efficient way to mitigate them?”
For example, he points out that if his address book happens to be unavailable for a few minutes, it probably isn’t going to have a devastating effect.
“It may be that redundancy may be the most obvious answer, but there may be ways to achieve high reliability, or at least sufficient reliability, to get to a safe state without being truly redundant. One approach I’ve heard of is that you take a single actuator and you might have multiple windings in that actuator. When it is normally used you get full torque and if one of them burns out, you can detect that and get two-thirds torque.”
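The multi-winding actuator Urmson describes can be sketched as a simple state model. The class and numbers below are illustrative assumptions, not any real product's design: three windings share the load, a burned-out winding is detectable, and torque degrades to two-thirds rather than dropping to zero.

```python
class MultiWindingActuator:
    """Toy model of an actuator with multiple independent windings.
    Losing one winding is detectable and degrades torque gracefully."""

    def __init__(self, windings=3, torque_per_winding=10.0):
        self.working = [True] * windings
        self.torque_per_winding = torque_per_winding

    def fail_winding(self, index):
        """Simulate a winding burning out."""
        self.working[index] = False

    @property
    def fault_detected(self):
        return not all(self.working)

    @property
    def available_torque(self):
        return sum(self.working) * self.torque_per_winding


actuator = MultiWindingActuator()
assert actuator.available_torque == 30.0  # full torque, all windings

actuator.fail_winding(0)                  # one winding burns out
assert actuator.fault_detected            # the failure is observable
assert actuator.available_torque == 20.0  # two-thirds torque remains
```

The design choice this illustrates is exactly Urmson's: sufficient reliability to reach a safe state can come from graceful degradation within one component rather than from duplicating the whole component.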
There are, he says, “Different ways you can achieve reliability without necessarily going to redundancy.”
Still: “Do I think in practice many of the solutions will end up in redundancy? Probably. But for computation, maybe you divide the workload in a way so that it is OK if it fails because you’re going to bring the vehicle to a safe state.”
Safety—and saving lives—while expanding the access to mobility is key to the mission of Urmson and his Aurora colleagues.