Quick Takes: Sensor Suites and Higher-Level Autonomy
Lidar developer LeddarTech is hosting a series of webinars on autonomous driving and new mobility services. The latest session focused on "Multiple Sensing Modalities: The Key to Level 3-5 Autonomy."
The panel featured a pair of similarly named CTOs: Pierre Oliver of LeddarTech and Pierre Lefrevre of Coast Autonomous, a mobility-as-a-service and self-driving vehicle developer. They were joined by Daniel Sisco, senior automotive director for chip giant Renesas.
Here are a few highlights and insights:
Putting the Plus in L2
Oliver: “Level 2 is the first stage where the car starts to drive itself. But it’s not very good. … Level 2+ is Level 2 that works. It’s what most people would like today.”
Lefrevre: “We don’t build autonomous vehicles. We build road missions with a focus on safety.
“This requires both long- and very close-range perception… (and) a lot of redundancy in field of view, range and technology.”
Oliver: “We’re still trying to catch up and deliver lidar that is at a similar maturity level as cameras and radar. Getting sufficient processing at an affordable cost and power budget remains a challenge.”
Catching Up to Humans
Sisco: “The human brain does incredible parallel processing that we are struggling to emulate from the compute perspective. You can have the largest sensor suite in the world. But at the end of the day you have to more or less serially compress all of that down to a scene and make decisions based on the environment.”
“Is there a sensor that replicates the instinct of a human, who may see some flash of light or hear the tires on the road to sense something has gone wrong and feel that the vehicle is drifting a little, so I know, because of my brain and experience that I probably have a flat tire? We have to artificially replicate these in autonomous systems.”
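Sisco’s point about serially compressing many sensor streams into a single scene is, in rough terms, a fusion problem. As a purely illustrative sketch (not any vendor’s actual algorithm), a toy “late fusion” step might weight each sensor’s position estimate by its confidence; the function name, readings, and weights below are all invented for the example:

```python
# Toy "late fusion" sketch: combine per-sensor object detections into one
# scene-level estimate. Illustrative only -- real automotive stacks use far
# more sophisticated probabilistic fusion (e.g., Kalman-filter trackers).

def fuse_detections(detections):
    """detections: list of (x, y, confidence) tuples, one per sensor."""
    total = sum(conf for _, _, conf in detections)
    if total == 0:
        return None  # no sensor is confident; defer the decision
    x = sum(px * conf for px, _, conf in detections) / total
    y = sum(py * conf for _, py, conf in detections) / total
    return (x, y)

# Camera, radar, and lidar each localize the same obstacle slightly
# differently; the fused position leans toward the more confident sensors.
readings = [
    (10.2, 3.1, 0.9),  # camera (hypothetical values)
    (10.0, 3.0, 0.6),  # radar
    (10.1, 3.2, 0.8),  # lidar
]
print(fuse_detections(readings))
```

Even this naive version shows why redundancy helps: a single noisy or failed sensor shifts the fused estimate only in proportion to its confidence weight.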
Oliver: “ADAS is very much about safety and convenience. But all evidence shows that ADAS today is not very good in instrumented tests or consumer feedback. Many people turn off these features because they aren’t delivering what they claim. There is room for much improved ADAS.”
Compute Power and the Cloud
Sisco: “Compute power per watt is key. Cooling systems, power dissipation, even generating the power to run these ECUs has a big effect on the vehicle system overall.
“We have to think how we interact with that cloud, during run time, development, testing. How do we get data back and forth between the cloud and the edge and what role does it play?”
Oliver: “We can all use better sensors and faster processing.
“The key is in the processing and perception. It’s about how can the automated vehicle better emulate the behavior of the human?
“How do you better process all the sensor input and leverage the cues and infrastructure to deliver solutions that match or come close to human drivers?”