Although the terminology around ADAS—advanced driver assistance systems—gets thrown around as though it had been the norm for years, defining the term seems a good place to start when talking with Omer Keilaf and Oren Rosenzweig of Innoviz (innoviz.tech), a company based in Kfar Saba, Israel, about the technologies they're developing, which are called, not particularly descriptively, the Innoviz Pro and InnovizOne.
What they are working on is LiDAR for ADAS applications—yes, all the way to SAE Level 5 (“Level 5—Full Automation: The full-time performance by an Automated Driving System of all aspects of the dynamic driving task under all roadway and environmental conditions that can be managed by a human driver”). But for the steps getting there, too.
Let’s get back to the issue of definition. First of all, realize that “LiDAR,” like “ADAS,” is an acronym. It stands for “Light Detection and Ranging.”
So when asked to describe it simply, Rosenzweig says, "It is like laser-based radar."
Unpacking that still further, know that both "laser" and "radar" are acronyms, too, but they've become so common that they're generally overlooked as being series of letters that have, through use, turned into words. However, it is worth refreshing oneself by knowing that "laser," which in its early days was written in all caps as acronyms generally are, stands for "Light Amplification by Stimulated Emission of Radiation," and "radar" for "Radio Detection and Ranging." (Presumably, LiDAR will, when it becomes an even greater part of the daily parlance, simply become "lidar.")
So LiDAR uses light rather than radio waves (yes, light is an electromagnetic wave, too, but it is easier to consider it this way). And without going all Einstein, we know that the speed of light is a constant, so when you want to determine the relationship between two things, that rather useful parameter is there from the start. And the radar part of Rosenzweig's simple definition goes to the point that this is a system that sends out waves, then measures the amount of time it takes for them to be reflected back from the objects they hit.
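The arithmetic behind that "detection and ranging" is straightforward: the pulse travels out and back, so the distance to the target is the round-trip time multiplied by the speed of light, divided by two. A minimal sketch (illustrative only, not Innoviz's implementation):

```python
# Time-of-flight ranging, the principle shared by radar and LiDAR.
# This is an illustrative sketch, not any vendor's implementation.

SPEED_OF_LIGHT_M_S = 299_792_458  # metres per second, a physical constant

def range_from_round_trip(round_trip_seconds: float) -> float:
    """Distance to a target given the round-trip time of a light pulse.

    The pulse travels out and back, so the one-way distance is half
    the total path the light covers.
    """
    return SPEED_OF_LIGHT_M_S * round_trip_seconds / 2.0

# A target 150 m away returns an echo after roughly a microsecond:
echo_time = 2 * 150.0 / SPEED_OF_LIGHT_M_S
distance = range_from_round_trip(echo_time)
```

Those microsecond-scale round trips are why the timing electronics, rather than the light source itself, do much of the hard work in a ranging system.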
The Innoviz approach to engineering its system is somewhat different than some of the LiDAR systems that have been seen over the past few years on the roofs of vehicles, those things that have been sometimes described as “rotating KFC buckets.”
Rosenzweig says that those systems are essentially lasers stacked on top of one another—sometimes 32 lasers, sometimes 64 lasers—and rotated. “They usually turn out to be very expensive systems and not very reliable.” They are mechanically complex.
What they’re doing involves no large spinning or moving mechanical parts. They use MEMS—as in “Micro-Electro-Mechanical Systems”—and semiconductor components. There is a tiny mirror that is used to steer the laser. They’ve also developed a highly sensitive sensor as well as a high-compute-power chip that is used to create a 3D image.
One of the reasons their system uses fewer lasers per unit—it could be one, it could be four—is that the laser is directed in a raster pattern, scanning both horizontally and vertically. (The InnovizOne has a field of view of 125 x 25 degrees.)
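The raster idea can be sketched in a few lines: sweep the beam across each horizontal line, step down, and repeat, covering the full field of view with a single source. The step counts below are invented for illustration; only the 125 x 25 degree field of view comes from the article.

```python
# Hypothetical raster scan pattern for a mirror-steered laser.
# The 125 x 25 degree field of view is the InnovizOne's published FoV;
# the number of steps per axis here is arbitrary, for illustration.

def raster_angles(h_fov=125.0, v_fov=25.0, h_steps=5, v_steps=5):
    """Yield (azimuth, elevation) pairs covering the field of view.

    Angles are centred on the optical axis, so azimuth runs from
    -h_fov/2 to +h_fov/2 and elevation from -v_fov/2 to +v_fov/2.
    """
    for row in range(v_steps):
        elevation = -v_fov / 2 + row * v_fov / (v_steps - 1)
        for col in range(h_steps):
            azimuth = -h_fov / 2 + col * h_fov / (h_steps - 1)
            yield (azimuth, elevation)

points = list(raster_angles())  # a 5 x 5 grid spanning the full FoV
```

In a real unit the "steps" are set by the mirror's deflection and the pulse rate, and the grid is far denser, but the coverage logic is the same.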
This is helping them work toward a price point that Rosenzweig describes as being “relevant for automotive—hundreds of dollars.” (He says that some LiDAR systems on the market today—admittedly systems that are being produced in prototype, not production, volumes—can be as high as $70,000.)
An interesting aspect in all of this is that LiDAR isn’t the entire answer to autonomous driving. Cameras and radar have roles to play, too. Rosenzweig: “The idea behind fusing different modalities, using other sensors, is because each sensor has a strength and some weaknesses.” So, for example, he says that cameras provide good images, but don’t do well in bad weather or extreme lighting conditions. There isn’t 3D information. There isn’t distance information. But they can read signs and know what color the light is on a traffic signal. Radar is good for long ranges and weather doesn’t affect it as much. But the resolution isn’t so high.
LiDAR does well in bad weather, provides accurate 3D information and provides high resolution such that it can detect small objects. But it can’t read signs or traffic signals. And compared with radar, if there is fog, strong rain or snow, the LiDAR system may have a range of, say, 250 meters while radar can reach 300 to 400 meters.
“Cameras are not going away. And you also need radar. But you have to complement them with LiDAR,” he says, suggesting that the combination—the “fusion”—helps provide coverage of conditions.
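The complementarity Rosenzweig describes can be made concrete with a toy model (not any production fusion algorithm): tag each sensor with the capabilities the article attributes to it, and note that only the combined suite covers them all.

```python
# Toy illustration of sensor fusion coverage, based on the strengths
# each modality is credited with in the text. Capability names are
# invented labels, not an industry taxonomy.

SENSOR_CAPABILITIES = {
    "camera": {"reads_signs", "color_detection", "high_resolution"},
    "radar": {"long_range", "all_weather"},
    "lidar": {"accurate_3d", "high_resolution", "small_object_detection"},
}

def capabilities_of(suite):
    """Union of the capabilities provided by a set of sensors."""
    covered = set()
    for sensor in suite:
        covered |= SENSOR_CAPABILITIES[sensor]
    return covered

full_suite = capabilities_of(["camera", "radar", "lidar"])
```

No single entry in the dictionary covers every capability, which is the point of the "fusion" argument: each modality plugs a gap the others leave open.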
While Innoviz has had prototype units in the field, Rosenzweig says that 2018 and 2019 are when the products are going to be transitioning into production volumes.
“We’ve been working with some OEMs who plan to have LiDAR in their vehicles by 2020-2021,” Rosenzweig says, adding, “It’s already in purchasing, not a question of if it is going to happen.”