Stewart Reed has spent some 35 years in transportation design, with stints at OEMs including Toyota and Chrysler, with clients of his consultancy including Icon Aircraft, Lockheed Martin, Michelin, Hyundai, Nissan, and Ford. Reed is the chair of the Transportation Design Department at ArtCenter, where he is helping develop the next generation of transportation designers.
With autonomy, car design as we know it is dead, right? Isn’t it the case that automated vehicles (AVs) are going to lead to a situation where there will be fleets of undifferentiated pods that will take people from A to B?
Reed says no. “People will still love beauty, joyfulness, a sense of artistry.” He points out that people go to particular restaurants because of what they offer and how it is executed. People wear distinctive clothing, not highly functional jumpsuits of homogeneous design.
Yes, he acknowledges, much of the AV experience—and he thinks that there are going to be levels of “brand experience” with these vehicles—will be with the interior. Obviously, with the change in powertrain, the vehicles will be more human-centric than machine-centric, with space being opened up by what are likely to be electric AVs. But that’s not all.
There is sometimes an argument, predicated on the behavior of users of apps like Uber or Lyft, that when a given hailed vehicle rolls up, chances are the rider isn’t going to be at all concerned about the brand. Reed isn’t convinced.
“Go back to the horse-drawn era,” Reed says. “Consider a Victorian coach versus a farm conveyance with tools in the back.” Yes, they can both take you somewhere. But the differences in approach are profound.
“There will still be an experience you want to provide to a person,” Reed says of the ride-hailed AV, which is possibly going to be operated not just by a third party but an OEM (which will have an image to maintain).
The fundamentals of the fundamentals.
Then there is the design of the basic architecture of the vehicles themselves, beyond styling. (“I think we’ll get to the point where we’ll have wildly expressive vehicles that fit into different contexts, in different geographies”: the classic London black cab taken to new levels, Reed suggests.)
Reed says that through the years vehicles have become too complicated. He uses the term “co-mingled.” He suggests that what needs to happen is a fundamental redesign of the structure of vehicles—and he isn’t just talking cars and trucks, but buses and emergency vehicles (“They’re antiquated and clumsy”)—so that there is fundamental simplification.
Reed cites Gordon Murray’s work on iStream, a low-cost, efficient way of designing and producing vehicles; at what might seem to be the other end of the spectrum, he cites the composite chassis that McLaren uses for its vehicles, where the tub is surrounded by components that can be replaced but the key component stays the same. Reed: “We need more of that philosophy in mainstream vehicles.”
Elmar Frickenstein is senior vice president for Fully Automated Driving and Driver Assistance at BMW Group. Before joining BMW in 1988 he’d had positions in both the computer and aerospace industries, so he has an understanding of both technology and transportation in a way that goes beyond typical automotive approaches to advances. BMW has put its ConnectedDrive technology into all of its vehicles, which means that not only are all of the vehicles part of the “Internet of Things,” but over the 10 or so years that the technology has been available, BMW has worked with an array of companies including Microsoft, Amazon, Apple, Samsung, and many others.
Autonomous vehicles are just part of the evolution that we’ve seen in the auto industry, an industry that has seen changes throughout its history from self-starters to satellite radios, from V8s to gas-electric hybrids, right? This is development as usual, albeit one that includes some companies that one might not have ordinarily associated with auto, isn’t it?
Not at all.
Frickenstein: “Automated driving changes the automotive industry. For the last 100 years the automotive industry has well understood how to make cars, develop cars, put electronics into cars, put connectivity into cars.
“With automated driving there is a change. It is the biggest paradigm shift. We will be changing the business model. The responsibility of the driver can be moved to the machine.”
This is not just a 3-Series amped up with processors; it is an entirely new approach to what the “auto industry,” in all of its manifestations, will become.
And BMW’s drive in realizing this can be understood by a small thing: “Every BMW has its own SIM card.”
But BMW makes the “Ultimate Driving Machine.” And Frickenstein and his colleagues are creating automated driving technology. In 2021 comes the Vision iNEXT, a fully electric crossover with Level 3 autonomous driving capability, which means the driver can take her hands off the steering wheel and her feet off the pedals. So isn’t it the case that the BMW will drive like any other Level 3 car?
“From the planning, yes,” Frickenstein says. “From the drivetrain, no.” He explains that the route will be calculated for autonomously going from point A to point B. For the sake of argument, it could be the exact same planning that is done by an autonomous vehicle produced by any other company. For example, BMW and Fiat Chrysler are partnering on the development of autonomous technology.
“The characteristics of a Fiat, a Chrysler, a BMW, and a Rolls-Royce are different. The feeling in the cars is different,” Frickenstein says. And the powertrains are different.
So will an Ultimate Driving Machine exist in an autonomous future?
“The motion controller defines the sportiness or the comfort of an automated vehicle,” Frickenstein says. “So we can adjust the characteristics of the vehicle through the motion controller.”
(During the world premiere of the Vision iNEXT Concept in Los Angeles in November 2018, Klaus Fröhlich, BMW board member in charge of Development, said that when the production vehicle launches in 2021: “It will be a true BMW when it comes to driving dynamics. It will have best-in-class acceleration.” Fröhlich also cited its excellent power-to-weight ratio, coming from a mixed-material construction, and a low center of gravity, primarily a function of the position of the batteries for electric propulsion.)
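Frickenstein’s point about the motion controller can be sketched in a few lines: the same planned maneuver, executed within a brand-specific envelope. All names and numbers below are illustrative assumptions, not BMW parameters.

```python
# Hypothetical brand-tuning profiles: the planner produces the same
# trajectory; the motion controller shapes how aggressively it is tracked.
BRAND_TUNING = {
    "sport":   {"max_lat_accel_mps2": 8.0},   # permits harder cornering
    "comfort": {"max_lat_accel_mps2": 3.0},   # smooths the ride
}

def execute_plan(desired_lat_accels, tuning):
    """Clip each commanded lateral acceleration to the brand's envelope."""
    limit = tuning["max_lat_accel_mps2"]
    return [max(-limit, min(a, limit)) for a in desired_lat_accels]
```

Feed the same commands through the “comfort” profile and the “sport” profile and you get two different rides from one route plan, which is the essence of Frickenstein’s claim.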
You can still drive.
There are two modes that are push-button activated: Boost and Ease. The former is for hands-on driving. As for Ease, Frickenstein says the system in the iNEXT allows the driver to engage it on a highway and then do something else. Do email. Read. But not sleep (there is a camera system in the vehicle that watches the driver’s eyes to ascertain whether the lids are up or not). The requirement is that the driver has to be able to regain control within 10 seconds. If that doesn’t happen, then the car reduces speed and pulls off to the right side of the road.
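The 10-second takeover rule can be expressed as a minimal decision function. The state names here are hypothetical; the production system involves driver monitoring, escalating warnings, and a minimal-risk maneuver, none of which is modeled below.

```python
TAKEOVER_LIMIT_S = 10.0  # from the rule described in the text

def handover_action(seconds_since_request, driver_responded):
    """Decide what the automated system does at this moment.

    seconds_since_request: time since the car asked the driver to take
    over, or None if no takeover has been requested.
    """
    if driver_responded:
        return "manual_driving"        # driver is back in control (Boost)
    if seconds_since_request is None:
        return "automated_driving"     # Ease mode continues
    if seconds_since_request > TAKEOVER_LIMIT_S:
        return "slow_and_pull_over"    # reduce speed, stop at the roadside
    return "awaiting_driver"           # warning active, clock running
```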
And if that’s Level 3 in 2021, then there could be “maybe in ’22 or ’23 Level 4 on the highway that would allow you to sleep in the car.”
But there are some requirements that have to be fulfilled before that can happen. Like lidar that is capable of looking ahead by 200 meters. And importantly, neural networks and simulations.
Frickenstein points out that they’ve deployed 34 different technologies on vehicles: various sensors of the lidar, radar, camera, and ultrasonic varieties. On the technology side, they are working closely with Intel and Mobileye; among Tier Ones, with Continental, Magna, and Aptiv. They are working to make this production-ready and automotive-safe.
According to Dr. Klaus Buttner, BMW vice president Functions, Software & Integration Platforms, “AI and machine learning are indispensable to successfully understand and train the right driving behavior.”
He adds—and emphasis added because this is important—“The car will not learn itself.”
BMW’s approach is to have all of the sensors on the fleets of vehicles it has running in places like Munich, San Francisco, and Shanghai send the information they’ve collected to a back-end data center, where it is used, says Frickenstein, “to create a virtual world from the real world. We are creating an environmental model from the real world so that our computers can ‘see’ all of the obstacles and objects and ‘see’ what happens over a given time frame.”
What’s more, they are tagging, or labeling, all of the items that are detected in the real world such that this information can be used by the neural networks to “learn” what these objects are and what they do.
All of this work at the back end is then downloaded to the vehicles, which can use what has been learned to control the vehicle. Then, as the computers at the back end—and Frickenstein says that this is a massive network of computers, “hundreds and hundreds of petabytes”—continue to take in new information and create new simulations, the processed information can be downloaded to new or existing vehicles, making them more capable, working toward Levels 4 and 5.
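The fleet-to-back-end loop just described—collect, model, label, retrain, redeploy—can be sketched as a single cycle. Every function and data structure here is a hypothetical stand-in, not BMW’s actual pipeline; the training step in particular is reduced to a trivial counter.

```python
def build_environment_model(fleet_logs):
    """Step 1: merge raw sensor logs into frames of a 'virtual world'."""
    return [{"objects": log.get("detections", [])} for log in fleet_logs]

def label_objects(frame):
    """Step 2: tag each detected object so the networks can learn it."""
    return [obj["type"] for obj in frame["objects"]]

def retrain(model, labeled):
    """Step 3: stand-in for training; just counts labels consumed."""
    model = dict(model)
    model["labels_seen"] = model.get("labels_seen", 0) + sum(
        len(labels) for _, labels in labeled
    )
    return model

def fleet_learning_cycle(fleet_logs, model):
    environment = build_environment_model(fleet_logs)
    labeled = [(frame, label_objects(frame)) for frame in environment]
    model = retrain(model, labeled)
    return model  # step 4: the updated model is pushed back to the fleet
```

Running the cycle repeatedly on fresh logs is what, in the real system, makes new and existing vehicles progressively more capable.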
What of the steering wheel?
“Level 5 is without pedals or steering wheel,” Frickenstein says. “Level 4 can be as the brand defines it.” Which means that a BMW Level 4 vehicle will likely still offer a steering wheel, because that’s part and parcel of the experience one would expect from the vehicle.
And odds are good that even 10 years from now, when there will undoubtedly be a proliferation of vehicles that have highly advanced autonomous capabilities, there will still be a non-trivial market for steering wheels.
“Given the technical challenges in front of us,” Frickenstein says, “I don’t believe that we will be ready in 10 years to eliminate steering wheels. You’d have to change all of the architectures of all of the vehicles.” And by “all” this encompasses vehicles from all marques, not just those of the BMW Group.
Yet while this is a challenge for some parts of the world, where the traffic infrastructure looks more like a traffic free-for-all, Frickenstein suggests that in some cities, where automated driving has been taken into account, “it is do-able.”
Not likely. But do-able.
NVIDIA was established in 1993, based, in part, on the idea that the PC would become a consumer device for gaming and multimedia, applications that are graphics-intensive. (1993, in the history of computers, was when Microsoft launched Windows NT and the first Intel Pentium chip was introduced.) In 1999 NVIDIA invented the GPU—the graphics processing unit—which not only transformed what consumers could see on their screens, but provided professional designers with advanced capabilities. In 2010 Audi selected NVIDIA GPUs for its infotainment system. And today NVIDIA is working with 370 OEMs, Tier One suppliers, mobility service suppliers, sensor and hardware developers, mapping suppliers, and researchers, using GPUs and software to create autonomous driving capabilities.
Do “Fortnite,” “Battlefield V,” “Final Fantasy XV,” and the like have anything to do with self-driving cars? Well, certainly not directly. But as one thing leads to another, just maybe these virtual worlds can have consequences for transportation in the real world.
“It’s not just hardware. That is something important to remember,” says Danny Shapiro, senior director of Automotive at NVIDIA (nvidia.com).
But the hardware is something that is hard not to have top-of-mind. Consider: there is a processor—or as Shapiro puts it, “a supercomputer that we’ve shrunk down, is auto-grade and in production now”—named “Xavier.” It is, he says, “the world’s first processor designed for autonomous machines.” It is a “system on a chip,” or SoC, with six different types of processors, a CPU, a GPU, a deep-learning accelerator, and a programmable vision accelerator among them.
And the Xavier operates at 30 TOPS, which is 30 trillion operations per second. NVIDIA has taken the Xavier and developed what it calls “DRIVE AGX Xavier,” a platform that can be used to process sensor data, provide vehicle localization, and perform path planning. And if that’s not enough, there’s DRIVE AGX Pegasus, which has two Xaviers and two NVIDIA Tensor Core GPUs, and which operates at up to 320 TOPS.
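A quick back-of-the-envelope check puts those throughput figures in perspective. The split of Pegasus’s 320 TOPS between the two Xaviers and the two GPUs is an inference from the quoted totals, not an NVIDIA specification.

```python
OPS_PER_TOPS = 10**12          # one TOPS = a trillion operations per second

xavier_tops = 30               # one Xavier SoC, per the quoted spec
xavier_ops_per_sec = xavier_tops * OPS_PER_TOPS

pegasus_tops = 320             # two Xaviers plus two Tensor Core GPUs
# What the two GPUs must contribute if the Xaviers supply 30 TOPS each:
gpu_pair_tops = pegasus_tops - 2 * xavier_tops
```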
The world is a complicated place. 320 trillion operations per second can, presumably, help make sense of it for autonomous vehicles.
But because it is not just hardware, Shapiro says that they are writing lots of software, like DRIVE OS, a real-time operating system; DRIVE IX, which is for sensors and user experience; and DRIVE AV, an autonomous vehicle software stack. There are modules and algorithms and libraries.
“We are building the whole end-to-end stack,” Shapiro says. “The reason we’re doing that is we feel we have to understand the entire problem so that we can develop the hardware and software.”
NVIDIA is running its own autonomous test vehicles. In October 2018 it ran a vehicle (nicknamed BB8) on a 50-mile loop around Silicon Valley, on four different freeways, including the on and off ramps. The vehicle (a Ford Fusion) was fitted with a DRIVE AGX Pegasus that ran DRIVE AV and DRIVE IX software. The safety driver didn’t have to engage. All of the technology is commercially available.
Which brings us back to the aforementioned games. “In the video game space,” Shapiro explains, “NVIDIA doesn’t make the game. But we have all of the rendering libraries, the ray tracing, the shadowing, the lighting, the texturing—all of the tools for the game makers to build off of. The same in the driving space.”
But one issue that Shapiro emphasizes, an issue that isn’t something that’s of concern to those in the gaming world (at least not physically, cases of carpal tunnel syndrome notwithstanding), is safety.
“Safety is paramount,” he says. For one thing, the aforementioned different processors on the Xavier provide, in effect, some checks and balances of what’s going on in the real world as information comes into the vehicle. For another, they’re developing the software and running their own physical and digital testing so as to create the means by which the autonomous driving technology can be deployed in a safe, reliable manner. And they are working closely with OEMs and Tier Ones (as well as the other participants in the automated driving ecosphere) on systems.
Shapiro says that they’re working with the U.S. Department of Transportation and Germany’s TÜV SÜD on developing a “virtual driver’s license,” a testing regimen for autonomous vehicle systems.
Shapiro says, “This is first and foremost about making the streets safer.”
He adds, with a smile, “Then getting your life back in traffic jams.”