The Evolution to Autonomy
There are evolutions—step-by-step changes, gradual developments that result in something that is more robust than its predecessor (without getting too Darwinian about it)—and then there are revolutions—sudden transformations and upheavals of the status quo such that there is a new order or system in place.
As Karl Haupt, executive vice president, Advanced Driver Assistance Systems Business Unit, Chassis & Safety Division, Continental Corp. (continental-corporation.com), sees it, “We see automated driving as the evolution of ADAS.” Advanced driver assistance systems. “We do not have a revolution, like in electric mobility, where you go from the combustion engine to an electric motor, with a completely different infrastructure.”
Haupt says that the transition to fully autonomous, or automated, driving is something that will occur primarily through the addition of more sensors and more software: “The infrastructure of the car is already there. What we are doing is putting in sensors to understand what is going on around the car and then intelligence in the form of computing power and software to put the information that is coming from different sensors together.” That is then used to decide what the course of action is, and actuators are brought into play.
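That sense-fuse-decide-actuate loop can be sketched in a few lines of code. This is a hypothetical illustration of the concept, not Continental’s software; every name, number, and the averaging “fusion” here are invented for the sake of the example:

```python
from dataclasses import dataclass

@dataclass
class Detection:
    """A fused estimate of one object ahead of the car."""
    distance_m: float   # range to the object, meters
    speed_mps: float    # closing speed, meters/second (positive = approaching)

def fuse(radar_m: float, lidar_m: float, closing_mps: float) -> Detection:
    # Toy "fusion": average two independent range measurements.
    return Detection(distance_m=(radar_m + lidar_m) / 2.0,
                     speed_mps=closing_mps)

def decide(obj: Detection, brake_threshold_s: float = 2.0) -> str:
    # Time-to-collision: if the object is closing, how long until impact?
    if obj.speed_mps <= 0:
        return "cruise"
    ttc_s = obj.distance_m / obj.speed_mps
    return "brake" if ttc_s < brake_threshold_s else "cruise"

# One pass through the loop: sensors -> fusion -> decision -> actuator command.
command = decide(fuse(radar_m=29.0, lidar_m=31.0, closing_mps=20.0))
print(command)  # time-to-collision = 30/20 = 1.5 s, below 2 s, so "brake"
```

The point of the sketch is the shape of the pipeline, not the math: raw sensor readings become one coherent picture, the picture drives a decision, and the decision becomes an actuator command.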
“Depending on the level of automation,” Haupt says, “you can use standard actuators that have a connection to the electronics control or a special setup of actuators for when the human isn’t there”—well, the human may be in the car, just not behind the wheel—“so if the system fails, the car comes back to a safe situation.”
This is essentially an evolutionary approach: Haupt points out that current advanced electronic stability control (ESC) systems work not only with the brakes, but with the steering and throttle as well, making determinations about conditions from the information obtained via various sensors, then acting on them. So ESC is a predicate of ADAS.
Continental is a company that can be considered to be a “traditional” automotive supplier, as can be its competitors like ZF Group (zf.com), Robert Bosch Corp. (bosch.com), and Denso (denso.com). There are a variety of companies that have started up in places like Sunnyvale and Palo Alto that aren’t as recognizable as Alphabet’s Waymo, but are nonetheless using their software and sensor chops to create systems for autonomous driving, companies like AutoX (autox.ai) and Renovo.auto (renovo.auto). Their URLs alone give a sense of what they are doing. If you look at the types of jobs they have open—in robotics, computer vision, software architecture, embedded systems, motion planning and control, deep learning, simulation, cloud software engineering—there is the clear sense that there is a digital bias in their approaches, something that is certainly needed for the evolution to automated driving, but only part of the advance.
Haupt suggests that beyond working with sensors like radar and LiDAR, beyond data fusion, there is a need for an understanding of the car and the relevant controls. Which makes the traditional companies not scrap-heap destined as they might have seemed to some in Silicon Valley (some who have yet to be acquired by or partnered with the traditional companies).
He uses traffic sign recognition as an example. This is, on the one hand, somewhat straightforward—a matter of collecting data—but, on the other, critical—a stop sign isn’t the same thing as one for a railroad crossing, yet both are important pieces of information in their own ways. “You can do it without a lot of understanding of the car,” Haupt says, adding, “But when you do it for adaptive cruise control, it is a lot about understanding the traffic situation and vehicle dynamics.” When that train crossing gate comes down, there needs to be a sufficient amount of time and space to bring a vehicle to a safe stop without spilling the Starbucks all over the interior of the vehicle. “You need some experience,” he says. “You need some testing. You need to understand some physics. And you just can’t do it at your desk.”
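The physics Haupt alludes to can be made concrete with a back-of-the-envelope stopping-distance calculation. This is textbook kinematics, not any supplier’s model, and the reaction-time and deceleration figures below are illustrative assumptions:

```python
def stopping_distance_m(speed_mps: float,
                        reaction_time_s: float = 1.0,
                        decel_mps2: float = 7.0) -> float:
    """Total distance to halt: ground covered during the reaction delay,
    plus the braking distance v^2 / (2a) from constant deceleration."""
    return speed_mps * reaction_time_s + speed_mps**2 / (2 * decel_mps2)

# At 50 km/h (about 13.9 m/s), with a 1 s reaction delay and
# 7 m/s^2 of braking, the car needs roughly 27.7 m to stop.
v = 50 / 3.6
print(round(stopping_distance_m(v), 1))
```

Even this crude version shows why the gate-is-coming-down decision depends on vehicle dynamics: double the speed and the braking term quadruples, which is exactly the kind of relationship that must be validated on a test track, not just at a desk.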
“Yes,” he admits, “verification of functions is done by simulations.” Which are run at desks. And which create conditions and situations that couldn’t be proven by just driving around. That said, there still needs to be work done in the real world, where wheels must turn and binders applied.
One of the challenges that Haupt says can’t be underestimated is the amount of computing power that needs to be on-board the vehicles. He says that sensor information arrives 30 times per second, and each update must undergo the calculations necessary to keep the vehicle where it is supposed to be.
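That 30 Hz update rate implies a hard per-cycle compute budget. The arithmetic is simple to sketch; the per-stage timings below are invented purely for illustration, not measured figures from any real system:

```python
UPDATE_HZ = 30
budget_ms = 1000 / UPDATE_HZ  # about 33.3 ms to sense, fuse, decide, and act

# Hypothetical per-stage costs for one cycle, in milliseconds:
stages = {"sensor read": 5.0, "fusion": 12.0, "planning": 10.0, "actuation": 3.0}
used_ms = sum(stages.values())

print(f"budget {budget_ms:.1f} ms, used {used_ms:.1f} ms, "
      f"headroom {budget_ms - used_ms:.1f} ms")
```

Miss the budget on any cycle and the vehicle is acting on stale information, which is why the on-board computing power Haupt describes is not optional overhead but a safety requirement.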
That said, he thinks that by 2021 there will be vehicles that will be able to drive automatically. “Not from every point A to every point B,” Haupt notes.
But things will continue to evolve from there.