As a pioneer in autonomous driving, Waymo’s every move is closely watched. “Inside Waymo’s Secret World for Training Self-Driving Cars” offers an exclusive look at how Waymo stays ahead of the curve on the self-driving track.


Compiled by | AI technology base (rgznai100)

Contributors | Shawn, Zhou Xiang


In a corner of Alphabet’s campus, a team is working on a software project that could hold the key to the future of driverless cars. No reporter had ever seen it before us. They call it Carcraft, a name inspired by Blizzard’s game World of Warcraft.

The creator of the software, James Stout, is a shaggy, baby-faced young engineer. He sits in a quiet open-plan office, his eyes fixed on a virtual image of a roundabout on a screen. At first glance, it looks like nothing special: just a few simple lines outlining the structure of the road. At this virtual roundabout, we can see a Chrysler Pacifica driverless car rendered in medium resolution, and a simple square representing another car.



Waymo’s simulation scenario-building software, Carcraft

Waymo’s self-driving car team came across just such a roundabout in Texas a few months ago. The speed and complexity of the road overwhelmed their self-driving car, so they decided to build a nearly identical physical road at their test site. What I am looking at now is the third step of the learning process: the digitization of the actual driving scene. In this step, a simple real-world driving action, such as overtaking a car on a roundabout, can be turned into thousands of simulation scenarios that test the boundaries of an autonomous car’s driving capabilities.


Such scenarios lay the foundation for the company’s powerful simulation tools. “Most of the work is inspired by problems encountered in simulation scenarios,” Stout said. This is the tool that accelerated the development of Waymo’s self-driving cars. Waymo was spun out of Alphabet’s “X” research unit, which is responsible for advanced projects, in December 2016.

If Waymo delivers fully autonomous cars in a few years, Carcraft will be remembered as the virtual representation of the real world that did much to reshape the real one.

Carcraft was originally designed to “replay” scenes experienced by vehicles on public roads, but it (and scene simulation in general) is now playing an increasingly important role in autonomous driving research.

Waymo now has 25,000 virtual self-driving cars driving through simulated versions of Austin, Mountain View, and Phoenix, as well as its test tracks. These virtual cars can run a scenario thousands of times a day on extremely complex roads. All told, the cars drive about 8 million miles a day in the virtual world. Waymo logged a total of 2.5 billion virtual miles in 2016, while Google’s real-world self-driving cars logged just over 3 million miles on public roads, orders of magnitude less. And crucially, Waymo calls its simulated miles “fun” miles, where the cars can learn new things, as opposed to cookie-cutter highway driving.
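For a sense of scale, here is a quick back-of-envelope check of those figures; the per-car breakdown is my own arithmetic, not Waymo’s.

```python
# Figures quoted in the article; the breakdown per car is my own arithmetic.
virtual_cars = 25_000
virtual_miles_per_day = 8_000_000

print(f"{virtual_miles_per_day / virtual_cars:.0f} miles per virtual car per day")  # 320

# Running at that rate all year would yield roughly:
print(f"{virtual_miles_per_day * 365 / 1e9:.2f} billion miles per year")  # 2.92
```

That yearly figure is consistent with the 2.5 billion virtual miles reported for 2016, if the fleet ramped up over the course of the year.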


Of course, scenario simulation is only part of Waymo’s complex operation. The company ties millions of miles of testing on public roads to a “structured testing” program at a secret base in California’s Central Valley that Waymo calls Castle.

Waymo had never shown the system publicly before. Testing self-driving cars on regular roads tells them where more testing is needed. They recreate those trouble spots at Castle, where a car can be tested in thousands of different road scenarios. From both kinds of real-world tests, they gather enough data to build fully digital simulations. In that virtual space, they can create thousands of variations of any one scene, free from the constraints of the real world, and test digital vehicles in all of them. As the driving software improves, it is downloaded back to real cars and tested over longer distances and on more complex roads. Virtual and real-world training cycle back and forth, steadily improving the capabilities of the autonomous driving system.


To get to Castle, you drive east from the Bay Area, get on Route 99 toward Fresno, and head south. The horizon disappears beyond fields that stretch as far as the eye can see. The terrain is so flat that the author John McPhee described it as an “earthen sea.” Get off the highway near the town of Atwater. Castle Air Force Base used to be here, employing 6,000 people working on the B-52 program. It now belongs to Merced, on the northern edge of its urban area. Unemployment there topped 20% in 2010 and rarely falls below 10% today. Forty percent of the area’s population is Spanish-speaking. We crossed railroad tracks into the old 1,621-acre site, which also houses the City of Merced’s Animal Control Department and the Atwater jail.

Instead of a specific address, the navigation software on my phone showed a set of GPS coordinates. We drove along a high, opaque green fence until Google Maps told us to stop. There was no sign of a gate; it looked like just another stretch of fence. But our Waymo escorts were sure this was the place. Sure enough, a security officer appeared, opened the fence, and checked our credentials.




The Fence (Alexis Madrigal)

When the fence opened, we entered a small, bustling compound where young men in short sleeves and hats walked around. There are portable buildings, a garage, and lots of self-driving cars in the parking lot of the main building. The retrofitted autonomous vehicles come in several varieties, including the Lexus models you often see on the road, retired Toyota Priuses, and new Chrysler Pacifica minivans.


A self-driving car is easy to identify because of the many sensors on its body. The most conspicuous, of course, is the laser scanner (commonly known as LIDAR) on the roof. The Pacifica is also fitted with beer-can-sized LIDARs that rotate constantly near its mirrors. The rear of the car has radar housings that look like Shrek’s ears.

When the car’s sensors activate, the spinning LIDARs make an eerie noise, even when the car is parked. It is a squeaking, whirring sound my ears were not used to.

Across the street from the main building sits an even more unusual car, plastered with different-sized X’s made of red tape. This is a Level 4 self-driving car; the levels, defined by the Society of Automotive Engineers, describe the degree to which a vehicle is automated. Most of the self-driving cars we hear about on the road are Level 1 or Level 2: cars that can drive intelligently on the highway. But this one is different. It is fully autonomous, and humans cannot drive it, hence the tape that keeps it from being confused with the other cars.

As we pulled into the parking lot, it felt like a scene out of the Manhattan Project, with the atmosphere of a research center crossed with a tech startup. In a classroom-sized activity office in the main building, I finally met the person behind this secret base: Steph Villegas.

Villegas wore a long but well-fitting white collared shirt, distressed jeans, and gray knit sneakers, as stylish as when she worked at the San Francisco boutique Azalea. She grew up in the suburbs of the East Bay and graduated in fine arts from the University of California, Berkeley, before joining Google’s self-driving car project in 2011.

“Do you drive?” I asked her.

“Of course.” Villegas replied.

She often drove Highways 101 and 280, the main routes between San Francisco and Mountain View. Like other drivers, she developed a feel for how cars move on the road, which matters in a self-driving project: people like her can intuitively imagine situations that would be difficult for an autonomous car to handle. “While testing the newly developed software, I started thinking about how we could create some challenges for the system,” Villegas says.

So she and a group of engineers began to invent and construct rare scenarios that let them test new behaviors of the self-driving cars in a controlled way. They commandeered some of the parking lots at the Shoreline Amphitheater and barricaded the entrances so that no one but approved Google workers could enter.

“So we began our plan,” she says. “Every week, a couple of other engineers and I would come up with something we wanted to test, and then we would truck the equipment we needed to the parking lot and test it.”

These became the first “structured tests” of the autonomous driving project. The hardest part of the testing wasn’t handling an imaginary “zombies eating people in the road” scenario, but steering the car confidently and reliably through the ever-changing environment of normal traffic, the way a human driver does.

Villegas has since collected mannequins, cones, fake plants, children’s toys, skateboards, tricycles, dolls, balls, and assorted props. They were once kept in a small storeroom and have now been moved to a dedicated warehouse at the Castle base.



Castle’s Props Warehouse (Alexis Madrigal)

But problems arose: they wanted to run the cars faster and test them with traffic lights and stop signs, and events at the Shoreline Amphitheater often threw off their plans. “For example, Metallica is coming to play, and we have to hit the road,” she said.


They needed a base, a secret base. That’s what Castle is for. They leased the space and started building the testing grounds they had imagined. “We wisely decided to design and build residential streets, expressway-style streets, cul-de-sacs, parking lots, and so on, so we built roads with representative characteristics for vehicle testing.”

We walked from the activity office to her car, and as we drove off to tour the base she handed me a map. “It’s like Disneyland. You can follow the map,” Villegas says. The map is carefully drawn. In one corner stands a Vegas-style sign that reads “Welcome to the charming California Castle.” Every road on the site is named after a famous car (such as DeLorean or Bullitt) or one of the original Priuses used early in the project (such as Barbaro).


We drove past a cluster of pink buildings that had once been military dormitories; one of them has been renovated as a resting place for Waymo workers who can’t get back to the Bay Area. Other than those, there are no buildings on the test site. It lives up to its billing as a city for self-driving cars: asphalt roads and the obstacles on them are what matter most.



A “block” of Castle (Alexis Madrigal)

Being in a place like this is like standing inside a video-game level with no human characters. It was a strange feeling to drive from the main street to a block of concrete driveways, then to a suburban crossroads with no buildings on either side. I kept feeling I had seen these roads somewhere before.


We drove into a large two-lane roundabout surrounded by a white fence. “We started with a one-lane roundabout, but our autopilot team came across a multi-lane roundabout in Austin, Texas, so we built one here so the autopilot could learn to handle it.” Villegas says.



Two-lane roundabout (Alexis Madrigal)

On the tour, Villegas looked out over a new site: a two-lane residential street with a parallel-parking area and an adjacent lawn, much like the road scenes we take for granted in daily city life. “I really like sites with parallel parking. It’s very common in suburban business areas such as Walnut Creek, Mountain View and Palo Alto.”


Back at the main compound, Villegas took me into a self-driving Chrysler Pacifica. Brandon Cain was in the driver’s seat, and an operator in the passenger seat used software called XView to monitor the vehicle’s progress on a laptop.

The test assistants are called “foxes,” a play on the word “faux.” They drive cars, act as pedestrians, ride bicycles, and carry stop signs. They are the actors, and the audience is the self-driving car.

The first test was simple driving and emergency braking, but at 45 miles per hour. We started by driving in a straight line on a wide road.

After a test assistant suddenly blocked the road, the self-driving car braked, and the test team checked the key data point: deceleration. They were trying to create scenarios that force the car to slam to a halt. How hard a stop? The screeching brakes were enough to make my armpits sweat and send my phone sliding to the floor.
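As a rough illustration of the kind of number the team pulls from such a run, here is a minimal sketch of extracting peak deceleration from a sampled speed trace; the function, sampling rate, and trace are my own invention, not Waymo’s tooling.

```python
from typing import Sequence

MPH_TO_MPS = 0.44704  # miles per hour -> meters per second

def peak_deceleration(speeds_mph: Sequence[float], dt_s: float = 0.25) -> float:
    """Largest deceleration (m/s^2) seen in a uniformly sampled speed trace."""
    decels = [
        (v0 - v1) * MPH_TO_MPS / dt_s
        for v0, v1 in zip(speeds_mph, speeds_mph[1:])
        if v1 < v0  # only count intervals where the car is slowing
    ]
    return max(decels, default=0.0)

# A hard stop from 45 mph over about two seconds:
trace = [45, 40, 35, 29, 23, 17, 11, 5, 0]
print(f"peak decel: {peak_deceleration(trace):.1f} m/s^2")  # ~10.7, about 1.1 g
```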

Incredibly, this wasn’t even my first test ride in a self-driving car. I’d ridden twice before: once in a Lexus SUV through the streets of Mountain View, and once for a few laps around Google’s campus in the company’s adorable Firefly. The point is that neither experience felt special.

But this experience was very different. This time there were two fast-moving vehicles, one of which was scheduled to overtake us in a way that was “exciting,” as Waymo put it.

The real test began. Cain started the car, and it announced, “Autodriving.” Soon another car began to approach and overtake in a dangerous manner. The self-driving car I was in executed a quick, fluid emergency brake. I was impressed.

The testers immediately checked the deceleration figures, found the braking still wasn’t hard enough, and ran the test again. And again, over and over, with the overtaking vehicle approaching at different speeds and from different angles. They call it “covering all possible scenarios.”



Two cars running parallel at high speed, one of them self-driving (Alexis Madrigal)

We also went through three other tests:

  • High-speed parallel driving;

  • A car reversing into the lane just as the self-driving vehicle’s view was blocked;

  • A pedestrian throwing a basketball in front of the car, which glided smoothly to a stop.

Each test was impressive in its own way, but the overtaking test was the most memorable.


Cain turned to me as we waited for another test and asked, “Have you seen Pacific Rim?” In Guillermo del Toro’s film, the main characters fight in sync with giant robotic armor. “I’m trying to synchronize with the car and share some ideas with it.”

I asked Cain to explain exactly what he meant by synchronizing with the car. “I’m trying to adjust for the weight difference of the people in the car,” he said. “Sitting in the car for so long, I can feel what the car is doing with my butt. As strange as it sounds, I kind of know what the car wants to do.”

Unlike Castle’s sun-baked surroundings, Google’s Mountain View headquarters is cozy. I visited the engineers at Waymo, technically engineers in the X division. X is known as the part of Google that does long-term, high-risk research. In 2015, when Google reorganized into Alphabet, X split off from Google; its website is now x.company. After a year of restructuring, Alphabet made the self-driving car project a standalone entity called Waymo, which now sits alongside Google as a sibling company under Alphabet.

Waymo’s office is still in Google’s headquarters, but as far as I can tell, the Waymo division doesn’t have much interaction with other divisions these days.

Waymo’s office building is spacious and airy. Project Wing drone prototypes hang around the building. I also saw several of the company’s Fireflies.

Walk up from the cafeteria and you’ll find the offices of Waymo’s simulation team in a corner of the building’s wing. Here, everyone’s screen displays Carcraft and XView: black backgrounds interspersed with countless polygons. These are the people who created the virtual world in which Waymo’s cars are tested.



What a Waymo self-driving car’s laser scanner sees as four people push a car in front of it (Waymo)


My host was Carcraft’s creator, James Stout, who had never spoken publicly about the project before. As we talked, his enthusiasm for it was obvious.


“I was just browsing through job postings and saw that the self-driving car team was hiring,” he said. “I couldn’t believe they were hiring.” So he joined the team and immediately started building Carcraft, which now runs 8 million virtual miles a day.

At the time, they mainly used the system to test how self-driving cars would handle the same tricky situations a human driver might face, and they began to build scenarios around those situations. “Very quickly, we found that the system was very useful and we could build a lot of scenarios with it,” Stout said. Now Carcraft’s simulations have expanded to entire cities, and the number of virtual cars has grown enormous.

Stout later recruited Elena Kolarov, who now heads the “scenario maintenance” team, to run the controls. Two screens sat in front of her. The right one ran XView, showing what the car is “looking at.” Driverless cars use cameras, radar, and laser scanners to identify objects in their field of view; in the software, these objects are represented by small wireframes outlining the real world.

The green lines emanating from the wireframes represent the object’s likely paths of movement, as predicted by the car. At the bottom is a strip showing images from the car’s conventional (visible-light) cameras. Kolarov can also pull up the laser scanner (LIDAR) data, displayed as orange and purple points.
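To make the display concrete, here is a minimal sketch of the kind of per-object record a viewer like XView might render; the structure and field names are my own guesses for illustration, not Waymo’s actual data model.

```python
from dataclasses import dataclass, field

@dataclass
class TrackedObject:
    kind: str                                    # "vehicle", "pedestrian", "cyclist", ...
    wireframe: list[tuple[float, float]]         # polygon outlining the object
    predicted_path: list[tuple[float, float]]    # the green line: likely future motion
    lidar_points: list[tuple[float, float, float]] = field(default_factory=list)

cyclist = TrackedObject(
    kind="cyclist",
    wireframe=[(0.0, 0.0), (0.6, 0.0), (0.6, 1.8), (0.0, 1.8)],
    predicted_path=[(0.3, 2.0), (0.4, 6.0), (0.5, 10.0)],  # heading away from the car
)
print(cyclist.kind, len(cyclist.predicted_path), "path points")
```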

First, we watched a replay of the parallel-driving test from Castle’s roundabout. Then Kolarov switched the software to simulation mode. It looks the same, but it is no longer a data log; it is a new scenario the car must solve on its own. The only visible difference is that the XView screen shows the word “simulation” in large red letters at the top. Stout says they had to add the label because people kept confusing simulation with reality.



Castle’s roundabout in XView simulation (Waymo)

Then they loaded another scene, this one in Phoenix. Kolarov zoomed out to show us the entire virtual model of the city. For the city, they’ve collected “everything they need to know: where all the roads are, which lanes lead to which other lanes, where stop signs are, where traffic lights are, where sidewalks are, where the centers of the lanes are,” Stout said.




Waymo’s virtual model of Chandler, Arizona, for its self-driving cars (Waymo)

We zoomed in on a four-way stop near Phoenix, and Kolarov started adding synthetic cars, pedestrians, and cyclists to the model.




Creating a synthetic scene in Carcraft (Waymo)

With a tap of a shortcut key, the objects on the screen start to move. Cars and bicycles travel and turn in their own lanes, their logic modeled on the team’s millions of miles of road testing. Underneath it all are hyper-specific maps of the real world and physical models of the different objects on screen. They have even modeled the tire rubber and the road surface.




XView simulated scene (Waymo)

The hardest part is simulating the behavior of others. It’s like the old saying, “It’s not your driving I’m worried about, it’s everyone else on the road.”


“Our cars don’t just see the world, they understand it. They understand the intentions of every dynamic element in the environment: a car, a pedestrian, a cyclist, a motorcycle. Merely tracking objects through space is nowhere near enough; you have to understand what they’re doing,” Dmitri Dolgov, Waymo’s head of software, told me. “This is the key to building a safe and reliable self-driving car, and this modeling and understanding of the behavior of other actors in the environment is very similar to the task of modeling them in a simulated environment.”

The key difference is this: in the real world, the car must take in real-time sensor data and turn it into an understanding of the scene before it can navigate. After years of work on the project, Waymo is confident in that ability, because “a series of test results show that our system can identify a wide variety of pedestrians,” Stout said.

So in most simulations, they skip the object recognition step. Instead of giving the car raw data to identify pedestrians, they tell the car: there’s a pedestrian.
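A minimal sketch of that distinction, with a toy planner standing in for the real thing; all names and logic here are illustrative assumptions, not Waymo’s API.

```python
def plan(labeled_objects):
    """Toy planner: brake if any pedestrian is within 20 m ahead."""
    for obj in labeled_objects:
        if obj["kind"] == "pedestrian" and 0 < obj["ahead_m"] < 20:
            return "brake"
    return "proceed"

# In simulation, the scenario simply *declares* the pedestrian as ground truth;
# no raw LIDAR or camera frames need to be classified first.
sim_scene = [{"kind": "pedestrian", "ahead_m": 12.0}]
print(plan(sim_scene))  # -> "brake"

# On a real car, a perception stack would first have to produce that same
# labeled list from raw sensor data before the planner could run at all.
```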

At that same four-way stop, Kolarov gave the driverless car a harder task. She pressed V, a shortcut for adding a synthetic car, and a new object appeared in Carcraft. Then she opened a drop-down menu on the right offering a bunch of different models to choose from, including my favorite, the Bird_Squirrel.

They can let objects follow the logic Waymo has modeled for them, or the Carcraft scenario creators can script objects to move in an exact way to test a specific behavior. “There’s a big difference between controlling a scene and adding objects and letting them do what they want,” Stout said. Once the basic architecture of a scenario is built, they can test all its important variations. Imagine an intersection: you might want to vary the arrival times of different cars, pedestrians, and cyclists, how long they stop, and how fast they move, among other factors. Set a reasonable range for each value, and the software can create and run every combination of those scenarios.

They call this “fuzzing.” From that single four-way stop, fuzzing can generate 800 scenarios. The software produces a beautiful chart that engineers can use to see how different combinations of scenario variables change the path the car decides to take.



Carcraft “fuzzing” charts (Waymo)
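The combinatorial sweep behind fuzzing is easy to picture in code. Here is a minimal sketch; the parameter names and ranges are invented for illustration, chosen so the product happens to match the 800 figure above.

```python
import itertools

# All parameters and ranges below are illustrative assumptions, not Waymo's.
arrival_offsets_s = [-2.0, -1.0, 0.0, 1.0, 2.0]  # when the other car reaches the stop
other_car_speeds_mph = [15, 20, 25, 30]          # how fast it approaches
pedestrian_counts = [0, 1, 2, 3, 4]              # people waiting at the crosswalk
cyclist_present = [False, True]
other_car_full_stop = [False, True]              # does it actually stop?
approach_lane = ["near", "far"]

scenarios = list(itertools.product(
    arrival_offsets_s, other_car_speeds_mph, pedestrian_counts,
    cyclist_present, other_car_full_stop, approach_lane,
))
print(len(scenarios))  # 5 * 4 * 5 * 2 * 2 * 2 = 800 variants of one intersection
```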

At that point, the problem becomes analyzing those scenario runs and their charts to find data that can guide engineers in improving the cars’ driving. The first check might be: did the car get stuck? If so, that’s an interesting scenario.


The following GIF illustrates exactly such a scenario. It simulates a complex, real-life four-way stop in Mountain View. As the car turned left, a bicycle appeared in front of it and the car had to stop. Engineers analyzed the problem and retuned the software to get the right behavior. The GIF shows the real scene, then the simulated one. When the two diverge, you see the simulated car continue driving while a dotted box labeled “shadow_vehicle_pose” shows where the real car actually was. To the people at Waymo, this is the clearest visual illustration of the process.



Waymo simulation showing improved vehicle navigation (Waymo)

But they don’t only look for cases where the car gets stuck. They might also look for decision times or braking profiles that fall outside an acceptable range. Whatever engineers are trying to learn or debug, they use the simulations to surface the problems.
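A minimal sketch of that kind of post-run filter, with field names and thresholds invented for illustration; Waymo’s real acceptance envelopes are not public.

```python
# Illustrative thresholds, not Waymo's.
MAX_COMFORT_DECEL_MPS2 = 6.0   # braking harder than this merits an engineer's look
STUCK_TIMEOUT_S = 30.0         # no forward progress for this long counts as "stuck"

def interesting(run: dict) -> bool:
    """Flag a simulated run that got stuck or braked outside the envelope."""
    return (
        run["time_without_progress_s"] > STUCK_TIMEOUT_S
        or run["peak_decel_mps2"] > MAX_COMFORT_DECEL_MPS2
    )

runs = [
    {"id": 17, "time_without_progress_s": 4.0, "peak_decel_mps2": 2.1},
    {"id": 42, "time_without_progress_s": 55.0, "peak_decel_mps2": 1.4},  # stuck
    {"id": 99, "time_without_progress_s": 2.5, "peak_decel_mps2": 8.7},   # hard brake
]
print([r["id"] for r in runs if interesting(r)])  # -> [42, 99]
```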


Stout and Dolgov, Waymo’s head of software, stress that the simulation has three core strengths. First, the cars accumulate far more mileage in the simulated environment than real cars ever could, and with it driving experience. Second, the simulated miles concentrate on interesting and difficult driving scenarios rather than cookie-cutter ones. Third, the software development cycle becomes much shorter.

“The iteration cycle is tremendously important to us, and the work we’ve done on simulation has shortened it significantly,” Dolgov said. “What might have taken weeks in the early days of the project now takes minutes.”

Dolgov was measured when asked whether they had simulated an oil slick, a flat tire, a bird strike, or a pothole the size of a sinkhole. Those are certainly things they can take into account, he says, but “how accurate is the simulator going to be along that axis? For some problems, you might get better value by running a series of tests in the real world and using them to confirm the simulator.”

The power of Carcraft’s virtual worlds is not that they are perfect, lifelike renderings of the real world, but that they capture what matters for developing self-driving cars, while allowing billions more test miles than physical testing ever could. And although the driving software in the simulation isn’t making decisions in the real world, it makes its virtual decisions in exactly the same way.

And it is already working. The California Department of Motor Vehicles (DMV) requires companies to report their autonomous miles and disengagements (instances when a human driver takes over) annually. Not only was Waymo driving three orders of magnitude more miles than anyone else, the data showed, but its disengagements were also falling fast.

From December 2015 to November 2016, Waymo logged 635,868 miles of autonomous driving with only 124 disengagements, an average of one per roughly 5,000 miles, or 0.2 disengagements per 1,000 miles. The previous year, they drove 424,331 miles with 341 disengagements, an average of one per roughly 1,250 miles, or 0.8 disengagements per 1,000 miles.
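A quick check of those rates from the miles and disengagement counts above (the period labels follow the DMV reporting years):

```python
# DMV-reported miles and disengagement counts quoted above.
for period, miles, disengagements in [
    ("Dec 2015 - Nov 2016", 635_868, 124),
    ("Dec 2014 - Nov 2015", 424_331, 341),
]:
    print(f"{period}: one per {miles // disengagements:,} miles "
          f"({disengagements / miles * 1000:.2f} per 1,000 miles)")
# Dec 2015 - Nov 2016: one per 5,128 miles (0.20 per 1,000 miles)
# Dec 2014 - Nov 2015: one per 1,244 miles (0.80 per 1,000 miles)
```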

While many point out that these are imperfect numbers, they are the best comparisons available in California, at least among companies that logged on the order of 20,000 miles or more.

Outside experts are not surprised by Waymo’s approach. “Right now, you can almost measure how advanced an automation team is (in drones or self-driving cars) by how seriously they take simulation,” says Chris Dixon, a venture capitalist at Andreessen Horowitz. “Waymo’s is undoubtedly the best and most advanced.”

I also asked Sunil Chintakindi, head of innovation at Allstate Insurance, what he thought of Waymo’s program. “You simply can’t achieve higher levels of vehicle automation without a strong simulation infrastructure,” he says. “I won’t even have a conversation with anyone who thinks otherwise.”

Meanwhile, other self-driving car researchers are pursuing similar approaches. Huei Peng is the director of the autonomous- and connected-vehicle lab at the University of Michigan. Peng said that any workable development process for autonomous vehicles would be “more than 99% simulation + well-designed structured testing + road testing.”

Peng and one of his graduate students have proposed a system that pairs simulation tightly with highway miles to dramatically accelerate testing. It is not the same as Waymo’s system. “What we’re talking about is getting rid of the boring parts of driving and focusing on the interesting parts,” he said. “That’s hundreds of times faster: a thousand miles can become a million.”

What is surprising is the size, organization, and intensity of Waymo’s program. When I described the structured testing to Peng, including the 20,000 scenarios from Castle’s structured-testing team that have been turned into simulation tests, he at first misheard and thought I had said 2,000 scenarios. When I noticed and corrected him, “20,000,” Peng paused for a long moment, then said, “That is amazing.”

In fact, those 20,000 scenarios are just a fraction of Waymo’s total; they are only the ones derived from structured testing. The scenarios created from public driving and from engineers’ imaginations far outnumber them.

“They’re doing really well,” Peng said. “They’re way ahead of everyone else in Level 4 autonomous driving.”

But Peng also notes the position of traditional automakers, which are trying to do something entirely different. They are promoting driver-assistance technology, making some money, and working toward full autonomy gradually, rather than committing to fully autonomous driving from the start. It’s unfair to compare them to Waymo, because Waymo, with Alphabet’s resources and cash behind it, can “willfully” put a $70,000 laser scanner on a car, while a mass-market automaker like Chevrolet might need to keep the whole car under $40,000.

“Automakers like GM, Ford and Toyota may say, ‘we’re going to reduce accidents and fatalities and make mass-market vehicles safer,’ but their goals are completely different,” Peng said. “They need to think about millions of vehicles, not just thousands of cars.”

Even in the fully autonomous race, Waymo has far more challengers than it used to, such as Tesla. Chris Gerdes is the director of the Center for Automotive Research at Stanford University. Eighteen months ago, he told my colleague Adrienne LaFrance that Waymo “has a deep insight into the depth of the problem and how close we are to solving it.” When I checked back with him last week, he said, “A lot of things have changed.”

Their task now is to make driving a human social activity.

“Automakers like Ford and General Motors have deployed their vehicles and built on-road test data sets,” Gerdes said. “Tesla has now collected a lot of data from its Autopilot deployment to understand precisely how the system works in the conditions customers actually experience. They can test their algorithms in shadow mode while rapidly expanding their vehicle data set. With those two capabilities, Tesla has built an amazing testing platform.”

As for simulation, Gerdes said he has seen several competitors working on substantial projects. “I’m sure there are a lot of simulations out there, and I’ve seen some that look quite good. Waymo is no longer alone in this. They certainly have the lead, but many teams are pursuing similar approaches. So it’s a question of who can do it best.”

An autonomous driving system is not a low-stakes demonstration of the “brain-like” capabilities of neural networks. It is a leap forward in artificial intelligence, even for Waymo, which has adopted AI aggressively. Unlike Google Photos, where a mistake hardly matters, this is a system that must exist and act autonomously in the human world: understanding our rules, communicating its intentions, and being legible to our eyes and minds.

Waymo long treated driving as a technical task: controlling speed, direction, and so on. Now its task is to treat driving as a human social activity. What does it mean for a car to drive “normally,” not just “legally”? And how can humans guide an artificial intelligence toward understanding that question?

Building this kind of AI, it turns out, requires not only reams of data and engineering expertise, but also synchronization between humans and cars, enabling them to understand the world as well as humans. The drivers of Castle know how to observe the environment and make decisions like a car, and others can do the same. Perhaps the understanding works both ways: the more people understand cars, the more cars understand people.

A memory of an Austin roundabout was turned into a Castle test site, then into self-driving-car data logs, then into a network of simulations, and finally into new software that was downloaded to physical self-driving cars so they could handle that roundabout in Austin, Texas.

Even in the abstract polygons of the simulation system, the tools an AI uses to understand the world, we can find traces of human dreams, fragments of memory, and the feel of driving a car. These components cannot be erased; they are essential to an autonomous driving system that could revolutionize transportation, cities, and everything around them.


Original author | Alexis Madrigal

Original address:

https://www.theatlantic.com/technology/archive/2017/08/inside-waymos-secret-testing-and-simulation-facilities/537648/