Computer vision, perception, path-planning and control, simulation-based robotic systems development, and human-machine interfaces are just some of the areas that Pete Rander has dealt with throughout his career. Rander holds a degree in electrical engineering from the University of Detroit-Mercy, as well as a master’s and a Ph.D. in electrical and computer engineering from Carnegie Mellon.
Anybody who follows the development of autonomous vehicle technology can quickly connect a couple of dots there: the Detroit connection, and Carnegie Mellon, which happens to be one of the places from which many of the leading developers of autonomous driving technology have come (the DARPA Urban Challenge was won by the Tartan Racing team from Carnegie Mellon). So it is not at all surprising that Pete Rander happens to be the president and co-founder (with Bryan Salesky) of Argo AI (argo.ai), an artificial intelligence and robotics company dedicated to helping develop autonomous driving capabilities, and one in which Ford Motor has made a $1-billion investment. Before Argo AI, Rander was at the National Robotics Engineering Center at Carnegie Mellon, and he worked at Uber's self-driving vehicle tech operation.
“I’ve worked on so many ‘self-driving’ systems, so many autonomous systems, that if it had wheels or tracks, if it floated or flew, I’ve been on a team that helped automate it,” Rander says. One rather unusual example: he worked on a project that involved an autonomous Sikorsky helicopter that carried a one-ton autonomous ground vehicle. The helicopter would fly to a determined place where it would deposit the ground vehicle, which would then travel to perform long-range reconnaissance in dangerous areas.
Speaking of his variegated experiences prior to Argo AI, Rander says, “I’d come home from work at night and talk to my wife and kids about what I was doing and they’d say, ‘Why don’t you do something practical, like build a self-driving car.’ It has almost become a family joke.”
But it has also become something that Rander and his team are highly devoted to. “Every time we have a group of new hires,” he says, “I sit down with them and talk to them about our mission and values. One of the things that I want us to do as a company is not just improve things for the occupants of the vehicles we’re involved with, but for everyone around us. Everyone benefits, even if you’re not our customer. It is a wonderful thing.”
Rander notes the number of motor vehicle deaths that occurred in the U.S. in 2017—40,100, according to the National Safety Council—as well as those that occur globally—the World Health Organization puts the number at more than 1.25-million—and he and his team are working to develop the technologies and systems that can reduce those numbers. And to do so in a way that is sustainable and profitable from a business standpoint. (Ford didn’t make its investment in Argo AI for purely altruistic reasons.)
A thing that’s striking about what Argo AI is doing as it works to develop the intelligence needed to achieve Level 4 automation is the level of complexity involved. (Level 4 means a vehicle capable of operating without a driver within a defined, “geo-fenced” area and under specific weather conditions; Ford has announced plans to launch a purpose-built Level 4 vehicle that will be initially deployed in commercial operations, such as delivery of a variety of goods, including pizzas, and in mobility services by 2021.) Consider, for example, a stop sign. Everyone knows that a stop sign is red and octagonal. But Rander points out that in the real world there are various reds, not some ideal color, and that stop signs are located on the sides of roads. “The bulk of the work is getting out in the real world and experiencing it in context. That’s proven to be the fastest, most effective way to get the most-accurate data,” he says. This real-world experience allows the autonomous vehicle to understand the actual world, rather than some idealized human description of it.
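Rander’s stop-sign point can be made concrete with a small sketch. This is purely illustrative (the color values and thresholds are invented, and this is not Argo AI’s method): an exact match against an “ideal” stop-sign red fails on real-world samples, while a tolerance over “red-ish” pixels survives fading, shade and dusk.

```python
# Illustrative sketch (invented values, not Argo AI's code): why an "ideal"
# stop-sign red fails in the real world. Colors are (R, G, B) in 0-255.

IDEAL_STOP_SIGN_RED = (178, 34, 52)  # an assumed idealized spec color

def naive_match(pixel):
    """Exact match against the ideal color -- brittle."""
    return pixel == IDEAL_STOP_SIGN_RED

def tolerant_match(pixel, tol=60):
    """Match any 'red-ish' pixel: red channel dominant, within a tolerance."""
    r, g, b = pixel
    return r > g + 40 and r > b + 40 and abs(r - IDEAL_STOP_SIGN_RED[0]) < tol

# Real-world samples: a faded sign, a sign in bright sun, a sign at dusk
samples = [(140, 30, 40), (200, 60, 70), (160, 45, 55)]

print([naive_match(p) for p in samples])     # all False: exact match is useless
print([tolerant_match(p) for p in samples])  # all True: a range survives variation
```

The real system learns such variation from driving data rather than from hand-set thresholds, which is exactly why the “get out in the real world” approach matters.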
Or consider cars. It is one thing for there to be a car in a studio, and quite another when the lighting isn’t as good. What’s more, when you’re driving, chances are you see only a portion of the vehicles around you at any point in time, so it is a matter of being able to contextually understand what is what, and where: the locations not only of the surrounding objects, but of the vehicle itself in relation to them, some of which are moving and some of which aren’t. (“Every day you drive inches away from a row of parked cars and don’t think anything about it, but you pay attention if you suddenly see brake lights, because that’s the first thing you see when someone starts a car and is about to pull out.” It isn’t simply a matter of knowing what’s going on at any one time, but of also being able to anticipate what might happen and be ready to react appropriately: the person in the parked vehicle who has hit the brake pedal may not be planning to pull out anytime soon, but then again, they might.)
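The brake-light example above is, at bottom, a rule about anticipation. A hedged sketch of the idea (the class, function and margin values are invented for illustration, not Argo AI’s API): a parked car is normally passed closely, but brake lights are treated as a cue to widen the planned clearance before anything actually moves.

```python
# Hedged sketch (names and numbers are illustrative, not Argo AI's code):
# anticipating that a parked car with brake lights on *might* pull out,
# and widening the planned lateral clearance in response.

from dataclasses import dataclass

@dataclass
class TrackedVehicle:
    parked: bool
    brake_lights_on: bool

def planned_clearance(v: TrackedVehicle, base_m: float = 0.5) -> float:
    """Return desired lateral clearance in meters. A parked car is normally
    passed closely, but brake lights signal a possible pull-out, so add margin."""
    if v.parked and v.brake_lights_on:
        return base_m + 0.8  # anticipate: give extra room before anything moves
    return base_m

quiet_car = TrackedVehicle(parked=True, brake_lights_on=False)
waking_car = TrackedVehicle(parked=True, brake_lights_on=True)

print(planned_clearance(quiet_car))   # 0.5 -> drive inches away, as humans do
print(planned_clearance(waking_car))  # 1.3 -> react before the car ever moves
```

The point of the sketch is the asymmetry Rander describes: the car that hit its brakes may never pull out, but the planner pays a small cost in clearance now rather than a large cost in reaction time later.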
“A lot of people think about AI [artificial intelligence] as the brain. But at Argo AI we focus on the head. The head has to sense the world, make sense of it, and formulate what you want to do. And there is a nervous system that is going to try to give instructions to the muscles,” Rander says. If the AI system is the head, the nervous system is the electrical architecture and the muscles are the various elements of the vehicle. So Argo AI is working closely with Ford so that the head is intimately integrated with the rest of the body: “Maybe there will be kits out there some day, but the head and the body are pretty intertwined.”
There are sensors: cameras, radars and LiDARs. There are HD maps that are created with the sensor data. (“HD maps are the memory that the car has of having driven in an area before. A human drives a heck of a lot better in an area they’re familiar with because they have a memory of it. That’s what an HD map is”: AI memory.) There are processors, and there are the algorithms that run on those processors. Rander: “We view it as a holistic systems problem where there are an amazing number of technical challenges across the board. Our job is bringing all of those things together in a way that’s harmonious and balanced.” He points out: “It’s wonderful to have the world’s best cameras, but if you can’t make use of that data, it doesn’t help you. If you have some great algorithm but it is coupled to an implementation that can’t run fast enough, that won’t help you. These things coming together is what’s enabling this.”
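The “HD map as memory” idea can be sketched as a prior that is fused with what the sensors see right now. This is a toy illustration with invented numbers, not Argo AI’s actual fusion method: a marginal live detection (fog, glare) gets a boost in a place the map remembers well, and no boost where the map is silent.

```python
# Hedged sketch (invented numbers, not Argo AI's method): an HD map acting
# as "memory" -- a prior that reinforces a live detection the car has
# effectively seen in this spot before.

def fused_confidence(live_conf: float, map_prior: float) -> float:
    """Simple Bayes-style fusion of a live detection score with the
    HD-map prior for that location (both values in [0, 1])."""
    num = live_conf * map_prior
    den = num + (1.0 - live_conf) * (1.0 - map_prior)
    return num / den if den > 0 else 0.0

# A marginal detection in an area the map remembers well:
print(round(fused_confidence(0.6, 0.9), 3))  # 0.931 -- memory tips the balance
# The same marginal detection with no map memory (neutral prior):
print(round(fused_confidence(0.6, 0.5), 3))  # 0.6 -- unchanged
```

This also illustrates Rander’s “holistic” point: the map, the sensors and the algorithms are only useful in combination; a strong prior cannot rescue a system whose live perception or compute cannot keep up.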
This overall complexity is something that Rander thinks is perhaps underappreciated by those people who want to know why there aren’t Level 4 systems in abundance out there already.
It isn’t just a pure technical problem, not just one of the sensors and the processors and the actuators. “The system is not in isolation. It has to operate with other entities on the road, and they don’t all follow the rules.”
Well, they may follow the social rules, but not necessarily the legal rules. Which brings up another challenge that the autonomous vehicle system has to manage. One of the cities in which Argo AI is testing vehicles on public roads (there are two safety drivers on board to minimize the chance of an accident) is Pittsburgh. Rander says that there is what is known as a “Pittsburgh left”: on a road with no left-turn lane, the first vehicle waiting to turn left at a light will do so immediately when the light turns green, ahead of oncoming traffic. That is something that doesn’t occur in other locales, such as Metro Detroit, where they’re also testing. (In Detroit, the general practice is for two cars to turn left on a yellow, which is socially acceptable but legally dubious. Someone trying to do a Pittsburgh left in Detroit would probably spend time in a repair shop, and vice versa.)
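One way to think about locale-specific norms like the Pittsburgh left is as per-region expectations that the planner consults. The representation below is entirely invented for illustration (Argo AI has not described how it encodes this): a lookup that tells the planner whether to expect the opposing left-turner to jump the green.

```python
# Illustrative sketch (invented representation, not Argo AI's): encoding
# locale-specific driving norms so the planner can anticipate them.

LOCAL_NORMS = {
    "pittsburgh": {"first_left_jumps_green": True},   # the "Pittsburgh left"
    "detroit": {"late_lefts_clear_on_yellow": True},  # two through on yellow
}

def expect_early_left(city: str) -> bool:
    """Should we expect the opposing left-turner to go before oncoming traffic?"""
    return LOCAL_NORMS.get(city, {}).get("first_left_jumps_green", False)

print(expect_early_left("pittsburgh"))  # True: pause a beat on the green
print(expect_early_left("detroit"))     # False: oncoming lefts will wait
```

However the norms are actually represented, the consequence is the one the article describes: the same green light calls for different expectations in different cities.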
“It is hugely important from an Argo AI standpoint that our systems fit in in a very humanistic way, a very natural way,” Rander says, which means that not only do they have to solve the technical challenges, but work needs to be done with governments at all levels so that the vehicles operate within the flow, within the norms: “We can’t have every autonomous vehicle pulled over.”