Driving into the Future
When you think of the auto industry, you undoubtedly think of companies including Mercedes, Audi, Volvo, Tesla, and Volkswagen. And when it comes to Tier One suppliers, the names that come to mind include Autoliv (www.autoliv.com), Bosch (bosch-mobility-solutions.us), Hella (hella.com) and ZF (zf.com). (Yes, there seems to be a Euro-centric character to all this.)
What all of these companies have in common is that each of them is working with a company that I daresay you hadn’t heard of 10 years ago unless you were a serious gamer: NVIDIA (nvidia.com), a firm that started in 1993 as a graphics chip producer.
At the 2018 CES it seemed as though NVIDIA was central to the autonomous driving efforts and announcements. There was ZF talking about how it, along with NVIDIA and Baidu, has developed what is said to be the first AI-based autonomous vehicle platform designed for China, to be launched in 2020. And there was Volkswagen CEO Dr. Herbert Diess on stage at an NVIDIA CES event saying, “Artificial intelligence is revolutionizing the car. Autonomous driving, zero tailpipe emission mobility and digital networking are virtually impossible without advances in AI and deep learning. Combining the imagination of Volkswagen with NVIDIA, the leader in AI technology, enables us to take a big step into the future.”
Arguably, the ZF-NVIDIA connection is understandable. As a Tier One supplier, ZF has deep knowledge of what’s needed in the way of vehicle technology, but its long history is in mechanical products; NVIDIA provides the silicon smarts that can help ZF create a product like the ProAI car computer.
But it is rather surprising to hear an OEM offer a supplier the kind of robust validation and praise that Diess gave NVIDIA.
It is somewhat difficult to wrap one’s mind around the capabilities provided by the AI car computer that NVIDIA has developed for the industry: DRIVE PX can be configured with four processors so that it is capable of performing 320 trillion deep learning operations per second.
Said another way, it can take all of the data from an array of sensors on a car, correlate it with a high-definition map, and figure out how the car can travel to where it needs to go.
Even though no one in the industry is really talking about Level 5 autonomous driving, NVIDIA claims that the DRIVE PX can enable it.
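To make that sense-fuse-plan description concrete, here is a deliberately simplified sketch, in no way NVIDIA’s actual DRIVE PX software, of the loop described above: combine obstacle reports from multiple sensors, then search a map grid for a safe route. All names and data here are hypothetical illustrations.

```python
# Illustrative sketch (not NVIDIA's DRIVE PX software) of the
# perceive -> fuse -> plan loop: merge sensor readings, then
# search a small map grid for a route that avoids obstacles.
from collections import deque
from dataclasses import dataclass

@dataclass
class SensorReading:
    source: str          # e.g., "camera", "radar", "lidar"
    obstacle_cells: set  # grid cells the sensor flags as occupied

def fuse(readings):
    """Union the obstacle cells reported by every sensor."""
    occupied = set()
    for r in readings:
        occupied |= r.obstacle_cells
    return occupied

def plan_route(start, goal, occupied, grid=5):
    """Breadth-first search over a grid, avoiding occupied cells."""
    frontier = deque([(start, [start])])
    seen = {start}
    while frontier:
        (x, y), path = frontier.popleft()
        if (x, y) == goal:
            return path
        for cell in ((x + 1, y), (x - 1, y), (x, y + 1), (x, y - 1)):
            nx, ny = cell
            if (0 <= nx < grid and 0 <= ny < grid
                    and cell not in occupied and cell not in seen):
                seen.add(cell)
                frontier.append((cell, path + [cell]))
    return None  # no safe route found

readings = [SensorReading("camera", {(1, 1)}),
            SensorReading("radar", {(2, 2)})]
route = plan_route((0, 0), (4, 4), fuse(readings))
```

A production system does this with probabilistic sensor fusion and continuous trajectory optimization rather than a toy grid search, which is part of why the computational budget runs to trillions of operations per second.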
And maybe this is what’s behind all this. The auto industry recognizes that it cannot be behind the curve if it wants to maintain relevance—and margins—going forward, especially as it seems to be the case that when we get to Level 4-capable vehicles, they’ll be in fleets, such as Uber’s, and there could be a reduction in the number of overall vehicles sold.
(That is: Initially, the technology will probably be more expensive than all but a handful of consumers would be willing to pay for. This means that the fleet business is where the early tranche of vehicles will go. As it is estimated that an individual uses her or his car only about five percent of the time, a fleet-operated vehicle would be running almost all the time and thereby could take some of the privately owned vehicles out of the mix. And there is going to be an ever-growing number of people living in large cities. Stefan Hartung, member of the board of management of Bosch, said at CES: “By 2025, 34 cities worldwide will have a population of more than 10 million people. By 2050 at the latest, two-thirds of the world’s population will be living in these megacities, putting a tremendous amount of pressure on local infrastructure and the environment–and ultimately on our quality of life, whether the air we breathe, the time we waste in traffic, the energy we consume, or our safety.” Chances are, there aren’t a whole lot of people who are going to be owning private cars under those conditions.)
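The back-of-the-envelope arithmetic behind the fleet argument runs as follows. The five-percent utilization figure comes from the estimate above; the fleet utilization figure is a hypothetical assumption for illustration, not a number from the article.

```python
# Rough arithmetic behind the fleet argument. The 5% private-car
# utilization estimate is from the text; the 50% fleet utilization
# figure is a hypothetical input chosen for illustration.
HOURS_PER_DAY = 24
private_utilization = 0.05   # estimate: a private car sits idle ~95% of the day
fleet_utilization = 0.50     # assumption: a fleet car in service half the day

private_hours = HOURS_PER_DAY * private_utilization   # 1.2 hours/day
fleet_hours = HOURS_PER_DAY * fleet_utilization       # 12 hours/day

# Driving time one fleet vehicle could, in principle, absorb:
cars_replaced = fleet_hours / private_hours           # 10 private cars
```

Under those (admittedly crude) assumptions, one fleet vehicle covers the driving time of roughly ten privately owned cars, which is why a shift to fleets could mean fewer total vehicles sold.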
And it isn’t just NVIDIA. There are Intel (intel.com), Qualcomm (qualcomm.com/solutions/automotive), AMD (amd.com) and others that are in the mix.
There is a vast array of people who are writing the software that will facilitate autonomous driving.
In some ways, it is a different world. But actually, it is the same world, a world undergoing what one could argue is an unprecedented rate of change. It is driven on the one hand by tech vendors finding new outlets for their products (e.g., NVIDIA was well known by gamers because of the way its GPUs, or graphics processing units, could render film-like fidelity on computer screens), and on the other by OEMs realizing that mobility isn’t just something that is tallied by monthly sales figures.
While the drive toward full vehicle autonomy seems to be picking up speed, with companies announcing new LiDAR systems and deploying deep-learning AI systems in vehicles, and while Bosch is working hard in a number of areas to automate driving, here’s something striking that comes out of “Connected Car Effect 2025,” a study that Bosch performed with the consulting firm Prognos.
“For the first time ever, we’re closing the loop on human behavior understanding through vision AI,” says Modar Alaoui, founder and CEO of Eyeris, a company that has developed technology for face analytics and emotion recognition, which has now extended that to body tracking and action and activity recognition.
Machine learning through pattern recognition can be a faster, less-expensive approach to monitoring manufacturing.
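As a minimal, hypothetical sketch of that idea (not any particular vendor’s product): learn the normal pattern of a process signal from historical in-spec samples, then flag readings that fall outside it. Real deployments would use far richer models; the sample values below are invented for illustration.

```python
# Hypothetical sketch of pattern-based manufacturing monitoring:
# learn a "normal" operating band from historical samples, then
# flag readings that deviate from it.
from statistics import mean, stdev

def fit_baseline(samples):
    """Learn the normal operating band from in-spec history."""
    return mean(samples), stdev(samples)

def is_anomalous(value, mu, sigma, k=3.0):
    """Flag readings more than k standard deviations from normal."""
    return abs(value - mu) > k * sigma

history = [20.1, 19.8, 20.3, 20.0, 19.9, 20.2]  # e.g., spindle temps (°C)
mu, sigma = fit_baseline(history)
alert = is_anomalous(27.5, mu, sigma)  # True: well outside the learned band
```

The appeal for manufacturing is that the “rules” are learned from data rather than hand-written for each machine, which is what makes the approach faster and less expensive to deploy than traditional fixed-threshold monitoring.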