
Did Uber Steal Google’s Intellectual Property? (That doesn't matter; the rest of the article is a journey) newyorker.com

Submitted by devtesla in technology

It's buried, but this is extremely important:

Some of the biggest fights involved risks that Levandowski was taking in self-driving experiments. The software that guided Google’s autonomous vehicles improved by ingesting immense amounts of test-drive data. One effective way to teach autonomous vehicles how to, say, merge onto a busy freeway is to have them do so repeatedly, allowing their algorithms to explore various approaches and learn from mistakes. A human “safety driver” always sat in the front seat of an autonomous vehicle, ready to take over if an experiment went awry. But pushing the technology’s boundaries required exposing the cars’ software to tricky situations. “If it is your job to advance technology, safety cannot be your No. 1 concern,” Levandowski told me. “If it is, you’ll never do anything. It’s always safer to leave the car in the driveway. You’ll never learn from a real mistake.”

One day in 2011, a Google executive named Isaac Taylor learned that, while he was on paternity leave, Levandowski had modified the cars’ software so that he could take them on otherwise forbidden routes. A Google executive recalls witnessing Taylor and Levandowski shouting at each other. Levandowski told Taylor that the only way to show him why his approach was necessary was to take a ride together. The men, both still furious, jumped into a self-driving Prius and headed off.

The car went onto a freeway, where it travelled past an on-ramp. According to people with knowledge of events that day, the Prius accidentally boxed in another vehicle, a Camry. A human driver could easily have handled the situation by slowing down and letting the Camry merge into traffic, but Google’s software wasn’t prepared for this scenario. The cars continued speeding down the freeway side by side. The Camry’s driver jerked his car onto the right shoulder. Then, apparently trying to avoid a guardrail, he veered to the left; the Camry pinwheeled across the freeway and into the median. Levandowski, who was acting as the safety driver, swerved hard to avoid colliding with the Camry, causing Taylor to injure his spine so severely that he eventually required multiple surgeries.

The Prius regained control and turned a corner on the freeway, leaving the Camry behind. Levandowski and Taylor didn’t know how badly damaged the Camry was. They didn’t go back to check on the other driver or to see if anyone else had been hurt. Neither they nor other Google executives made inquiries with the authorities. The police were not informed that a self-driving algorithm had contributed to the accident.

Levandowski, rather than being cowed by the incident, later defended it as an invaluable source of data, an opportunity to learn how to avoid similar mistakes. He sent colleagues an e-mail with video of the near-collision. Its subject line was “Prius vs. Camry.” (Google refused to show me a copy of the video or to divulge the exact date and location of the incident.) He remained in his leadership role and continued taking cars on non-official routes.

According to former Google executives, in Project Chauffeur’s early years there were more than a dozen accidents, at least three of which were serious. One of Google’s first test cars, nicknamed KITT, was rear-ended by a pickup truck after it braked suddenly, because it couldn’t distinguish between a yellow and a red traffic light. Two of the Google employees who were in the car later sought medical treatment. A former Google executive told me that the driver of the pickup, whose family was in the truck, was unlicensed, and asked the company not to contact insurers. KITT’s rear was crushed badly enough that it was permanently taken off the road.

In response to questions about these incidents, Google’s self-driving unit disputed that its cars are unsafe. “Safety is our highest priority as we test and develop our technology,” a spokesperson wrote to me. The company said that, in the case of the KITT collision, a report was submitted to the authorities, and that although multiple participants later sought medical care, “every person involved left the scene on their own accord.” As for the Camry incident, the spokesperson described it as “an unfortunate single-car accident in which another car failed to yield to traffic”; because Google’s self-driving car did not directly hit the Camry, Google did not cause the accident.

Less important, but still amazing:

Lawyers later learned that, around the same time, an engineer who had left with Levandowski, Lior Ron, had conducted Internet searches for “how to secretly delete files mac” and “how to permanently delete google drive files from my computer.”

Comments



voxpoplar wrote

every paragraph I read makes me hate silicon valley more